Google on Tuesday announced that it has developed a third generation of its special chips for artificial intelligence.
The new Tensor Processing Units (TPUs) will help Google improve applications that use artificial intelligence to do things like recognize words people are saying in audio recordings, spot objects in photos and videos, and pick up underlying emotions in written text. As such, the chips represent an alternative to Nvidia's graphics processing units.
If the new version is anything like its predecessor, it will also become accessible to third-party developers through Google's public cloud service, which could help Google compete with Amazon and Microsoft. Earlier this week, Microsoft announced the early availability of special chips in its Azure cloud.
Google CEO Sundar Pichai boasted about the vast computing power that's possible when people use large fleets of these third-generation TPUs.
“Each of these pods is now eight times more powerful than last year’s version — well over 100 petaflops,” he said. For context, a box containing 16 of Nvidia’s latest GPUs offers two petaflops of computing power.
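By that rough math, a single pod would match the raw throughput of roughly 50 such 16-GPU boxes (100 petaflops divided by 2 petaflops per box), though headline flops figures don't capture differences in numerical precision or real-world workloads.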
The chips are liquid-cooled, a cooling approach sometimes used for high-performance computing hardware and for performance-oriented consumer PCs.
Last year's version is already showing good results. Test results posted in recent months suggest that the second-generation TPUs can deliver better performance than existing GPU-based options in certain scenarios, although the TPUs do have limitations, such as lacking support for the Facebook-backed PyTorch AI software framework. The PyTorch open-source community has been working to change that.
Google first announced the TPU initiative in 2016.