Jul 29, 2020

How is AI helping the response to COVID-19?

AI
Technology
Kayleigh Shooter
2 min
We take a look at how artificial intelligence (AI) is aiding the ongoing fight against the coronavirus...

As the world struggles with the coronavirus pandemic, every ounce of technology is being utilised to fight off the threat and restore normality.

AI is playing an integral role in understanding the pandemic: machine learning enables computers to mimic human intelligence, taking in large quantities of data to quickly identify patterns.

Machine learning has been implemented for many purposes: identifying how the disease spreads, scaling customer communications and speeding up research.

Every business, whether it is small or global, is finding new ways to operate efficiently and meet the needs of both employees and customers as social distancing measures remain in place. 

One example of how machine learning is being used is chatbots, which provide contactless COVID-19 screenings and answer questions from concerned members of the public.

It also helps researchers discover how the coronavirus spreads by analysing large volumes of data, and it can predict the spread and act as an early warning system.

It may be too early to tell the extent to which artificial intelligence has aided the fight against the coronavirus; however, we know that it has played a major role in identifying trends in the virus to decipher how it spreads.

What is machine learning?

Simply put, machine learning is the study of computer algorithms that aim to improve themselves through experience. 
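That idea of "improving through experience" can be illustrated with a minimal sketch (all names and numbers here are hypothetical, not any production system): a one-parameter model that nudges itself toward lower error each time it sees an example.

```python
# Minimal illustration of "improving through experience":
# a one-parameter model adjusted to reduce its error on each example.
def train(examples, lr=0.1, epochs=100):
    w = 0.0  # the model's single parameter, initially uninformed
    for _ in range(epochs):
        for x, y in examples:
            error = w * x - y   # how wrong the current model is
            w -= lr * error * x # nudge w to shrink that error
    return w

# From data alone, the model recovers the underlying rule y = 2x.
examples = [(1, 2), (2, 4), (3, 6)]
w = train(examples)
print(round(w, 2))  # converges near 2.0
```

The more examples the model sees, the closer its parameter settles on the pattern hidden in the data, which is the same principle, at vastly larger scale, behind the systems described above.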

Machine learning powers some of the tools that we use day to day without even knowing it: Netflix uses machine learning, as do Twitter, Spotify and Google. Less surprisingly, Siri and Alexa both utilise machine learning too.

Each platform collects as much data about you as it can: for example, which genres you like to listen to or watch, which links you click and even which statuses you interact with and react to most.
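As a toy illustration of that kind of preference tracking (purely hypothetical, and far simpler than any platform's real recommender), a service could tally your interactions per genre and surface whichever one dominates:

```python
from collections import Counter

# Toy preference model: count a user's interactions per genre
# and recommend from whichever genre they engage with most.
def top_genre(interactions):
    counts = Counter(genre for genre, _item in interactions)
    return counts.most_common(1)[0][0]

clicks = [("thriller", "link1"), ("comedy", "link2"),
          ("thriller", "status3"), ("thriller", "link4")]
print(top_genre(clicks))  # "thriller" dominates this history
```

Real systems combine thousands of such signals, but the core loop is the same: observe behaviour, update a profile, predict what you will want next.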

What is artificial intelligence?

Artificial intelligence is the simulation of human intelligence in machines that are programmed to think like humans and mimic human actions. The technology has countless applications, from self-driving cars to computers that play games such as chess.

Artificial intelligence is also used in banking to help detect fraud and flag unusual activity. 


Jun 11, 2021

Google AI Designs Next-Gen Chips In Under 6 Hours

Google
AI
Manufacturing
semiconductor
3 min
Google AI’s deep reinforcement learning algorithms can optimise chip floor plans exponentially faster than their human counterparts

In a Google paper published in Nature on Wednesday, the company announced that its AI can design chips in less than six hours. Humans currently take months to design and lay out the intricate chip wiring. Although the tech giant has been working in silence on the technology for years, this is the first time that AI-optimised chips have hit the mainstream, and the first time the company will sell the result as a commercial product.


“Our method has been used in production to design the next generation of Google TPU (tensor processing unit) chips”, wrote the paper’s authors, Azalia Mirhoseini and Anna Goldie. The TPU v4 chips are the fastest Google system ever launched. “If you’re trying to train a large AI/ML system, and you’re using Google’s TensorFlow, this will be a big deal”, said Jack Gold, President and Principal Analyst at J.Gold Associates.


Training the Algorithm 

In a process called reinforcement learning, Google engineers used a set of 10,000 chip floor plans to train the AI. Each example chip was assigned a score based on its efficiency and power usage, which the algorithm then used to distinguish between “good” and “bad” layouts. The more layouts it examines, the better it becomes at generating versions of its own.
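Google's actual system uses deep reinforcement learning, but the core idea of score-guided improvement can be sketched in a heavily simplified form (everything below is hypothetical: a "layout" is just a list of block positions on a line, and the score is a stand-in wire-length metric, not the paper's reward function):

```python
import random

# Heavily simplified sketch of score-guided layout search (not
# Google's actual method): propose candidate layouts, score each,
# and keep the best. A "layout" is a list of block positions.
def score(layout):
    # Hypothetical efficiency score: shorter total wiring between
    # adjacent blocks is better, so we negate the total distance.
    return -sum(abs(a - b) for a, b in zip(layout, layout[1:]))

def best_layout(n_blocks=5, n_candidates=1000, seed=0):
    rng = random.Random(seed)
    candidates = [rng.sample(range(100), n_blocks)
                  for _ in range(n_candidates)]
    return max(candidates, key=score)

layout = best_layout()
print(layout, score(layout))
```

Where this sketch blindly samples candidates, a reinforcement-learning agent instead learns from the scores, steering each new layout toward what previously scored well.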


Designing floor plans, or the optimal layouts for a chip’s sub-systems, takes intense human effort. Yet floorplanning is similar to an elaborate game: it has rules, patterns and logic. In fact, just like chess or Go, it’s an ideal task for machine learning. Machines, after all, don’t follow the same constraints or in-built conditions that humans do; they follow logic, not preconceptions of what a chip should look like. And this has allowed AI to optimise the latest chips in a way we never could.


As a result, AI-generated layouts look quite different to what a human would design. Instead of being neat and ordered, they look slightly more haphazard. Blurred photos of the carefully guarded chip designs show a slightly more chaotic wiring layout—but no one is questioning its efficiency. In fact, Google is starting to evaluate how it could use AI in architecture exploration and other cognitively intense tasks. 


Major Implications for the Semiconductor Sector 

Part of what’s impressive about Google’s breakthrough is that it could throw Moore’s Law, the axiom that the number of transistors on a chip doubles roughly every two years, out the window. The physical difficulty of squeezing more CPUs, GPUs, and memory onto a tiny silicon die will still exist, but AI optimisation may help speed up chip performance.
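That doubling compounds quickly, as a back-of-the-envelope calculation shows:

```python
# Moore's Law as back-of-the-envelope arithmetic: transistor counts
# doubling every two years compound to ~32x growth over a decade.
def transistor_growth(years, doubling_period=2):
    return 2 ** (years / doubling_period)

print(transistor_growth(10))  # 32.0: a 32-fold increase in ten years
```

It is exactly that relentless compounding that physics makes harder to sustain, and that AI-assisted design may help extend.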


Any chance that AI can help speed up current chip production is welcome news. Though the U.S. Senate recently passed a US$52bn bill to supercharge domestic semiconductor supply chains, the country’s largest tech firms remain far behind. According to Holger Mueller, principal analyst at Constellation Research, “the faster and cheaper AI will win in business and government, including with the military”.


All in all, AI chip optimisation could allow Google to pull ahead of its competitors such as AWS and Microsoft. And if we can speed up workflows, design better chips, and use humans to solve more complex, fluid, wicked problems, that’s a win—for the tech world and for society. 
