Google TPU chips to be made available to external developers
In a blog post, Google has revealed that its AI accelerator chips are now available in limited quantities to external customers looking to develop machine learning technologies on the Google Cloud Platform.
Cloud Tensor Processing Units (TPUs) are Google’s in-house hardware accelerators, designed to speed up machine learning (ML) programmes built with Google’s TensorFlow software.
Each Cloud TPU is built from four custom application-specific integrated circuits (ASICs), providing up to 180 teraflops of performance and 64GB of memory on a single board, and is designed to be networked together into ML supercomputers.
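As a back-of-the-envelope illustration of those figures, the sketch below splits the quoted per-device numbers across the four ASICs, assuming (this is our assumption, not a figure from the announcement) that the 180 teraflops and 64GB are aggregates for the whole board:

```python
# Per-device figures quoted in the article.
ASICS_PER_TPU = 4
DEVICE_TFLOPS = 180
DEVICE_MEMORY_GB = 64

# Assumption: the quoted totals are aggregates across the four ASICs,
# so dividing evenly gives a rough per-ASIC figure.
per_asic_tflops = DEVICE_TFLOPS / ASICS_PER_TPU      # 45.0
per_asic_memory_gb = DEVICE_MEMORY_GB / ASICS_PER_TPU  # 16.0

print(per_asic_tflops, per_asic_memory_gb)  # prints: 45.0 16.0
```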
“Since working with Google Cloud TPUs, we’ve been extremely impressed with their speed—what could normally take days can now take hours,” said Anantha Kancherla, Head of Software, Self-Driving Level 5, Lyft.
“Deep learning is fast becoming the backbone of the software running self-driving cars. The results get better with more data, and there are major breakthroughs coming in algorithms every week.”
The move comes less than two years after the company first unveiled the TPU project in 2016.