Marked-up Mac minis flood eBay amid shortages driven by AI

What it is
The Mac mini shortage is a supply crunch caused by AI hobbyists and developers buying the desktop en masse for local model inference. Think of it as a GPU rush, but for Apple Silicon — people discovered that the M2/M3 chips can run surprisingly large language models locally, and the mini offers the cheapest entry point. The unified memory architecture means the GPU and CPU share RAM, so a 64GB mini can handle models that would need a dedicated GPU elsewhere.
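To make the memory claim concrete: a quantized model's footprint is roughly parameter count times bits per weight, plus headroom for the KV cache and activations. A back-of-envelope sketch (the 20% overhead factor is an assumption for illustration, not a measured figure):

```python
def model_memory_gb(params_billion, bits_per_weight, overhead=1.2):
    """Rough RAM footprint: raw weights plus ~20% for KV cache and
    activations (assumed overhead, varies by context length)."""
    weight_bytes = params_billion * 1e9 * (bits_per_weight / 8)
    return weight_bytes * overhead / 1e9

# A 70B-parameter model quantized to 4 bits per weight:
print(round(model_memory_gb(70, 4)))  # → 42
```

At 4-bit quantization, a 70B-parameter model lands around 42 GB, which is why the 64GB configuration is the one AI buyers are chasing: the same model would not fit on any consumer GPU's VRAM.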
Why it matters
If you're running local AI models, this signals two things: Mac hardware is now genuinely competitive for inference (though not yet for training), and consumer-grade devices are becoming viable alternatives to cloud APIs. You might want to secure a mini now before prices climb further, or explore alternatives like used Mac Studios. The resale frenzy also shows how fast AI infrastructure demand is bleeding into consumer markets — expect more shortages as open-source models improve.
Key details
- Mac mini sells out on Apple's site and major retailers; eBay listings appear with 20-30% markups over the $599-$1,299 MSRP
- llama.cpp project releases near-daily builds optimized for Apple Silicon, with specific KleidiAI-enabled variants for M-series chips
- Unified memory architecture lets a 64GB Mac mini run models that typically require discrete GPUs with comparable VRAM
- The M2 and M3 chips deliver strong inference performance-per-watt, making the mini appealing for always-on local AI setups
- Shortage coincides with rising interest in running Llama 3.3, DeepSeek, and other open models on-device for privacy and cost savings
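As a sanity check on the performance bullet above: single-stream decoding is largely memory-bandwidth-bound, since each generated token reads roughly the whole quantized model from RAM. A sketch of that ceiling, assuming Apple's published ~100 GB/s bandwidth figure for the base M2 (the model size is illustrative):

```python
def decode_tokens_per_sec(model_gb, bandwidth_gb_s):
    """Upper-bound estimate for memory-bound decoding: each token
    reads approximately every weight once from unified memory."""
    return bandwidth_gb_s / model_gb

# An 8B model at 4-bit (~4.5 GB) on a base M2 (~100 GB/s, published spec):
print(round(decode_tokens_per_sec(4.5, 100)))  # → 22
```

Roughly 20 tokens per second from a fanless-quiet desktop is the kind of figure driving the always-on local setups the bullet describes; Pro-tier chips with higher bandwidth scale this estimate proportionally.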