How Google is Tackling Quantum Computing Challenges

Google DeepMind and Google Quantum AI introduce AlphaQubit
Google DeepMind & Quantum AI teams develop AlphaQubit, an AI system that accurately identifies errors in quantum computers for more reliable quantum tech

Quantum computing has long been heralded as a revolutionary technology with the potential to solve complex problems in fields such as drug discovery, material science and AI.

However, the inherent fragility of quantum bits, or qubits (the units of quantum information whose properties make them more powerful than classical bits), has hindered the development of reliable quantum computers.

Error correction has therefore emerged as a critical challenge in the field, as quantum systems are highly susceptible to noise and environmental disturbances.

Addressing these challenges, Google has made a significant breakthrough.

In a paper published in Nature, researchers from Google DeepMind and Google Quantum AI have introduced AlphaQubit, an AI-based decoder that identifies quantum computing errors with state-of-the-art accuracy.

The problem of qubit fragility in quantum computing

Quantum computers harness the principles of quantum mechanics, such as superposition and entanglement, to perform calculations.

These machines use qubits, which can exist in multiple states simultaneously, allowing them to process vast amounts of information in parallel.

This capability enables quantum computers to solve certain problems exponentially faster than classical computers.

However, Google says in a blog post: “The natural quantum state of a qubit is fragile and can be disrupted by various factors: microscopic defects in hardware, heat, vibration, electromagnetic interference and even cosmic rays (which are everywhere).”

This sensitivity leads to errors in quantum computations, making it challenging to perform long and complex calculations reliably.

How AlphaQubit is correcting quantum computing errors

To address the issue of qubit fragility, researchers have developed quantum error correction techniques.

These methods involve grouping multiple physical qubits into a single logical qubit and performing regular consistency checks.

The AlphaQubit system uses these consistency checks to identify errors in the logical qubit, allowing for their correction.
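
To make the idea of consistency checks concrete, the minimal Python sketch below simulates the simplest possible error-correcting code, a three-bit repetition code. It is purely illustrative: the function names are invented for this example, and it is far simpler than the surface code used on Google's hardware.

```python
import random

def noisy_copy(bit: int, flip_probability: float) -> int:
    """Simulate a physical qubit whose stored value may flip due to noise."""
    return bit ^ 1 if random.random() < flip_probability else bit

def encode(logical_bit: int) -> list[int]:
    """Group several physical (qu)bits into one logical qubit (repetition code)."""
    return [logical_bit] * 3

def consistency_checks(physical: list[int]) -> tuple[int, int]:
    """Compare neighbouring qubits; a 1 signals a disagreement (an error syndrome)."""
    return (physical[0] ^ physical[1], physical[1] ^ physical[2])

def decode(physical: list[int]) -> int:
    """A simple decoder: majority vote locates and corrects a single flipped bit."""
    return int(sum(physical) >= 2)

noisy = [noisy_copy(b, flip_probability=0.1) for b in encode(1)]
print("consistency checks:", consistency_checks(noisy))
print("recovered logical value:", decode(noisy))
```

AlphaQubit's job corresponds to the decode step: given the stream of consistency-check results, work out which errors most likely occurred so they can be corrected.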

According to the research paper, AlphaQubit is a neural network-based decoder that utilises the Transformer architecture, which is also used in many of today's large language models.

The system was trained on data from a set of 49 qubits inside a Sycamore quantum processor, Google's quantum computing hardware.

The training process involved using a quantum simulator to generate hundreds of millions of examples across various settings and error levels.

The researchers then fine-tuned AlphaQubit for specific decoding tasks using thousands of experimental samples from a particular Sycamore processor.
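
A rough sketch of that two-stage recipe is shown below, using PyTorch and invented tensor shapes purely for illustration. This is not AlphaQubit's actual architecture or training pipeline, which the Nature paper describes in far more detail; it only shows the general shape of pre-training a Transformer-style decoder on simulated syndrome data and then fine-tuning it on a smaller set of experimental samples.

```python
import torch
import torch.nn as nn

class SyndromeDecoder(nn.Module):
    """Toy Transformer decoder: maps a sequence of consistency-check rounds
    to the probability that a logical error has occurred."""
    def __init__(self, n_checks: int = 24, d_model: int = 64):   # n_checks is illustrative
        super().__init__()
        self.embed = nn.Linear(n_checks, d_model)                 # one token per round of checks
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, 1)                         # logit for "logical error"

    def forward(self, syndromes: torch.Tensor) -> torch.Tensor:
        h = self.encoder(self.embed(syndromes))                   # (batch, rounds, d_model)
        return self.head(h[:, -1])                                # read out after the final round

def train(model, dataset, epochs=1, lr=1e-3):
    """The same loop serves both stages: pre-training and fine-tuning."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        for syndromes, labels in dataset:
            opt.zero_grad()
            loss = loss_fn(model(syndromes), labels)
            loss.backward()
            opt.step()

model = SyndromeDecoder()
# Stand-in random data; in practice stage 1 uses hundreds of millions of simulated examples
simulated = [(torch.rand(32, 25, 24), torch.randint(0, 2, (32, 1)).float())]
# ...and stage 2 fine-tunes on thousands of samples from a specific processor
experimental = [(torch.rand(8, 25, 24), torch.randint(0, 2, (8, 1)).float())]
train(model, simulated)
train(model, experimental)
```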

When tested on new Sycamore data, AlphaQubit demonstrated superior accuracy compared to previous leading decoders.

Key attributes of AlphaQubit:
  • AlphaQubit is an AI-based decoder designed to enhance quantum computing reliability by accurately identifying errors
  • It uses Transformer-based neural networks, a deep learning architecture, to decode errors with high accuracy
  • The system was trained on Google's Sycamore processor using hundreds of millions of error examples, achieving significant accuracy improvements over existing methods
  • AlphaQubit reduces errors by 6% compared to tensor networks and by 30% compared to correlated matching in tests

In the largest Sycamore experiments, AlphaQubit made 6% fewer errors than tensor network methods, which are highly accurate but impractically slow.

It also made 30% fewer errors than correlated matching, an accurate decoder that is fast enough to scale.

AlphaQubit’s potential for future systems

To assess AlphaQubit's potential for larger quantum devices, the researchers trained it using data from simulated quantum systems of up to 241 qubits.

The results showed that AlphaQubit outperformed leading algorithmic decoders, suggesting its viability for future mid-sized quantum devices.

The system also demonstrated advanced features, such as the ability to accept uncertainty information on its inputs and to report confidence levels with its outputs.

These information-rich interfaces can potentially further improve the performance of quantum processors.
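
What such an information-rich interface might look like is sketched below. This is a hypothetical illustration, not AlphaQubit's real API: the idea is simply that a decoder can take "soft" analogue readout values rather than hard 0/1 bits, and return a probability rather than a bare yes/no decision.

```python
import torch

def decode_with_confidence(model, soft_syndromes: torch.Tensor):
    """Hypothetical interface: soft (analogue) readouts in, calibrated confidence out.

    soft_syndromes holds raw measurement signals scaled to [0, 1] instead of
    hard-thresholded bits, letting the decoder weigh uncertain readouts.
    """
    with torch.no_grad():
        logit = model(soft_syndromes)            # reuse the toy decoder sketched earlier
        p_error = torch.sigmoid(logit).item()    # probability that a logical error occurred
    return p_error, p_error > 0.5                # confidence plus a hard decision

# Example call against the toy model above (shapes are illustrative):
# confidence, error_detected = decode_with_confidence(model, torch.rand(1, 25, 24))
```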

Notably, when trained on samples that included up to 25 rounds of error correction, AlphaQubit maintained good performance on simulated experiments of up to 100,000 rounds.

This demonstrates its ability to generalise to scenarios beyond its training data, a crucial factor for practical quantum computing applications.

The future of practical quantum computing

While AlphaQubit represents a significant milestone in using machine learning for quantum error correction, challenges remain.


“AlphaQubit represents a major milestone in using machine learning for quantum error correction. But we still face significant challenges involving speed and scalability,” Google says.

“While AlphaQubit is great at accurately identifying errors, it’s still too slow to correct errors in a superconducting processor in real time.”

Fast superconducting quantum processors perform millions of consistency checks per second, a rate that AlphaQubit cannot yet keep up with.
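
As a rough sense of scale (a back-of-the-envelope calculation with assumed figures, not numbers from the paper): if a round of consistency checks arrives every microsecond, a real-time decoder has about a microsecond to process each round.

```python
# Illustrative latency budget; both figures below are assumptions for the example.
rounds_per_second = 1_000_000           # assumed: one consistency-check round per microsecond
budget_s = 1 / rounds_per_second        # time available to decode each round
assumed_model_latency_s = 1e-3          # assumed: a large neural decoder taking ~1 ms per round

print(f"Budget per round: {budget_s * 1e6:.1f} microseconds")
print(f"Shortfall: {assumed_model_latency_s / budget_s:,.0f}x too slow for real time")
```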

As quantum computing advances towards systems with potentially millions of qubits needed for commercially relevant applications, researchers will need to develop more data-efficient ways of training AI-based decoders.

"Our teams are combining pioneering advances in machine learning and quantum error correction to overcome these challenges — and pave the way for reliable quantum computers that can tackle some of the world's most complex problems," Google concludes.


