For a while now GPUs have been the focus because of model training.
But Arm just put out a data center CPU with Meta and said it gets about 2x performance per rack vs traditional setups. Nvidia, AMD, and Intel are all pushing CPUs again too.
Anybody heard about this? What kind of opportunity does this create for us?
What if AI demand shifts away from just GPUs
by u/TrueValueInsights in r/investing
5 Comments
arm’s been cooking something for a while now and that meta partnership makes sense given their scale. the cpu angle is interesting because training is just one piece – inference and edge deployment need different hardware altogether
been watching this space since my gaming laptop collection got me thinking about chip efficiency vs raw power. might be worth looking at companies that make the other components these data centers need, not just the flashy gpu names everyone talks about
The main point here is that ARM is providing multicore, energy-efficient CPUs to handle some of the less compute-intensive tasks. They're not really designed to replace GPUs for large-model inference, but they're useful for executing some of the smaller models and some of the agentic tasks. It's still early to see how well they'll integrate with current AI workflows. Nvidia has done some ARM CPU work in its Grace Hopper systems and onward, but mainly uses it for task scheduling during training and inference, so it's not a 1-to-1 use case against this iteration of ARM "AGI" CPUs.
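The split that comment describes (small models and agentic tasks on CPU, large-model inference on GPU) is basically a routing decision. Here's a minimal sketch of that heuristic in Python — the function name and the thresholds are made up for illustration, not anything ARM or Nvidia actually ships:

```python
def pick_device(params_billion: float, batch_size: int) -> str:
    """Hypothetical routing heuristic: send small/agentic workloads to an
    efficiency CPU, heavy batched inference to a GPU.

    The cutoffs (3B params, batch of 4) are illustrative assumptions only --
    real schedulers would profile latency, memory, and cost per token.
    """
    if params_billion <= 3 and batch_size <= 4:
        return "cpu"   # small models, tool-calling, orchestration glue
    return "gpu"       # large-model, high-throughput inference

# A 1B-param agent step stays on the CPU; a 70B batched job goes to the GPU.
print(pick_device(1.0, 1))   # -> cpu
print(pick_device(70.0, 32)) # -> gpu
```

The point isn't the exact numbers, just that the two chip classes end up serving different tiers of the same workload rather than competing head-on.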
Well yeah, it is indeed shifting, so the space evolves and all the companies with it, along with development. Go watch Jensen Huang's keynote speech at GTC 2026 just last week. Nvidia is preparing and positioning itself exactly for that. TPUs, CPUs, NPUs, ASICs, and a focus on inference are all being integrated into AI, in combinations of all of the above. Each piece of hardware and design has its strengths and weaknesses, with none being one-size-fits-all; it depends on what exactly it's being used for. The groundwork is being laid for the next iteration of AI.
It already has. ASICs are heavily in use, as are TPUs. Some forms of AI do better with CPUs or need them for assistance – for example, when an LLM needs to ingest a website for information, it's better off with a CPU doing that work and feeding the data into it. But GPUs aren't going anywhere anytime soon – for general-purpose AI they are the weapon of choice for many applications. The big companies buying fewer just means that smaller companies will have the opportunity to buy them.
storage and ram are also very expensive, and they're just as important as GPUs. CPU and motherboard prices are still not inflated, but I don't think that's going to last long, especially for CPUs.