At Google Cloud Next ’26, the company unveiled two AI chips, one tailored for training and the other for inference.
Interesting Engineering on MSN
Google launches TPU 8 chips with 3x power to speed AI training, cut cloud costs
Google has unveiled its eighth-generation Tensor Processing Units, introducing two custom AI chips designed ...
Most of the companies that have fully committed to building AI models are gobbling up every Nvidia AI accelerator they can ...
Google is packing ample amounts of static random access memory into a dedicated chip for running artificial intelligence ...
Google has unveiled its eighth-generation Tensor Processing Units, splitting AI training and inference into separate chips — TPU 8t and TPU 8i — to boost efficiency and cut costs. The move positions ...
At Google Cloud Next 26, Google Cloud announced a strategic shift in its AI hardware approach by introducing two distinct ...
Google's unveiling of its eighth-generation tensor processing unit (TPU) at Cloud Next 2026 is expected to drive the next ...
Google's newest TPUs are faster and cheaper than the previous versions. But the company is still embracing Nvidia in its ...
Google’s new processors target massive model training and the emerging AI agent economy, offering distinct builds for both ...
Alphabet Inc.’s Google Cloud division unveiled the latest generation of its tensor processing unit, or TPU, a homegrown chip that’s designed to make AI computing services faster and more efficient.
The search giant is vying for a bigger slice of the AI pie.
Alphabet Inc.'s Google is reportedly advancing its AI hardware push by working with Marvell Technology on two new chips ...