Apple M5, the pin to burst the bubble? Last week, Bloomberg published a simple but informative diagram (https://lnkd.in/ehXGP4WA) showing the interdependence between the AI companies: NVIDIA, Microsoft, OpenAI, AMD, Oracle, Intel, xAI, and others. Notably absent was Apple. This is not because it doesn’t have an AI story, but because it sits firmly outside the AI infrastructure bubble. You don’t have to look far to read about this bubble; the Financial Times published an article earlier today titled “The AI bubble is a bigger global economic threat than Trump’s tariffs” (https://on.ft.com/477IKAu).

Why is Apple different, though? Apple is making and selling chips with incredible performance (better per watt than NVIDIA) that run AI locally: in your pocket, on your lap, or at your desk. The new M5 claims a 3.5x AI performance boost, and the M4 was already fast. Apple is not creating its own proprietary AI models (LLMs); you can run almost any open-source model on every device, whether American, French, Chinese, or any other, from wherever you like. They all run beautifully fast on your personal Apple Silicon device or company servers.

The future of AI is small LLMs linked together in agentic networks, not the large proprietary oligarchs’ GPTs running on billions of dollars of cloud infrastructure, which requires billions of dollars in power and billions of litres/gallons of water to cool inefficient GPUs. The incumbents have no other path, I suggest; it’s an AI Ponzi scheme. Too much has been invested, and the only way forward is to keep pretending it’s the only way forward. Almost everything we use AI for will run on your laptop, privately and for free. If and when the AI bubble bursts, as with the glitch earlier this year, it will be a realisation that AI doesn’t need (loaned) money spent in this way.
Spend 10% of the AI infrastructure budget on R&D or education, and you’ll get a much better return in the long run.

Back to Apple and the new M5. When a laptop can do what supposedly requires millions in infrastructure, the emperor has no clothes. Apple is by no means unique in this space, but it is clearly leading it. NVIDIA has evidently seen this too and released the $4,000 DGX Spark desktop yesterday (15th Oct). I was going to order one until I saw the benchmarks. It seems no more than a souped-up Raspberry Pi, not even close to my one-year-old MacBook Pro. A shame, as I like new gadgets and I already have loads of Raspberry Pis. I am patiently waiting for the M5 Max or Ultra.

Wean yourselves off ChatGPT. With open-source models running locally, your data never leaves your device, your costs are fixed, your latency is near zero, and your privacy is absolute. This scales to the enterprise with a private cloud, too.
Is this comparing oranges with oranges, in that the LLM bonanza is based on the cost of training rather than the cost per query? There was a comment, in the Economist I think, casting doubt on NVIDIA's growth plan for the US, as it assumes faster growth in electricity generation than the leaders of the electricity-generation business have ever experienced.
The future is local models, for sure. The cloud is good for charging rent, though, and the digital world is awash with landlords.
This, “Almost everything we use AI for will run on your laptop, privately and for free,” is on the nose. We’re already seeing the DeepSeek reasoning models run on domestic devices without external processing requirements.
Ok, we're in a bubble; now what? Lather, rinse, repeat: global economics since the 12th century. Investors are looking for growth stocks. Apple remains the juggernaut in mobile, but its crown jewel is not growing, as the iPhone user base has plateaued. I'm only echoing what the same analysts that follow Apple are saying. BTW, I'm sure Apple is studying the various use cases for inference applications. To return to 20% YoY growth, it will be interesting to see what they come up with.
I don't know! Good comments.