Google TPU chips to be made available to external developers
In a blog post, Google has revealed that its AI accelerator chips are now available in limited quantities to external customers developing machine learning technologies on the Google Cloud Platform.
The Cloud Tensor Processing Units (TPUs) are Google’s in-house hardware accelerators, designed to speed up machine learning (ML) programmes built with Google’s TensorFlow software.
See also:

- Google welcomes 2,000 HTC smartphone specialists in completion of $1.1bn deal
- Google, MobileIron partner to launch new cloud services marketplace
Each Cloud TPU has been built with four custom application-specific integrated circuits (ASICs), providing up to 180 teraflops of performance and 64GB of high-bandwidth memory on a single board, and is designed to be networked together into ML supercomputers.
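A quick back-of-the-envelope check of the figures above: with 180 teraflops spread across four ASICs per board, each chip contributes roughly 45 teraflops. A minimal sketch (the variable names are illustrative, not from Google's documentation):

```python
# Back-of-the-envelope arithmetic on the Cloud TPU figures quoted above.
board_teraflops = 180   # peak performance per Cloud TPU board
chips_per_board = 4     # four ASICs per board
memory_gb = 64          # high-bandwidth memory per board

per_chip_teraflops = board_teraflops / chips_per_board
print(per_chip_teraflops)  # 45.0 teraflops per ASIC
```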
“Since working with Google Cloud TPUs, we’ve been extremely impressed with their speed—what could normally take days can now take hours,” said Anantha Kancherla, Head of Software, Self-Driving Level 5, Lyft.
“Deep learning is fast becoming the backbone of the software running self-driving cars. The results get better with more data, and there are major breakthroughs coming in algorithms every week.”
The move comes less than two years after the company first unveiled its TPUs in 2016.