
Apr 5, 2023

AI chip race: Google says its Tensor chips compute faster than Nvidia’s A100

Posted in categories: biotech/medical, business, robotics/AI, space, supercomputing

Google also says it has a healthy pipeline of chips in development.

Search engine giant Google has claimed that the supercomputers it uses to develop its artificial intelligence (AI) models are faster and more energy efficient than Nvidia Corporation's. While most companies working in the AI space get their processing power from Nvidia's chips, Google uses a custom chip called the Tensor Processing Unit (TPU).

Google announced the fourth generation of its Tensor chips in 2021, at the height of the COVID-19 pandemic, when industries from electronics to automotive felt the pinch of the global chip shortage.


AI-designed chips to further AI development

Interesting Engineering reported in 2021 that Google used AI to design its TPUs. Google claimed that AI completed the chip-layout process in just six hours, compared with the months humans typically spend on the task.

As with most things associated with AI these days, product iterations occur rapidly, and the TPU is now in its fourth generation. Just as Microsoft stitched together thousands of chips to power OpenAI's research, Google connected more than 4,000 TPUs to build its supercomputer.
