When IBM previewed its first quantum development roadmap in 2020, it laid out a pioneering timeline for progressing quantum computing.
In 2022, IBM updated its development roadmap to present an equally ambitious plan for scaling quantum systems beyond old limitations and toward advantage. At this time, it also unveiled the 433-qubit Osprey processor, just one year after breaking the 100-qubit barrier with its 127-qubit Eagle chip.
“We are continuously scaling up and advancing our quantum technology across hardware, software and classical integration, to meet the biggest challenges of our time, in conjunction with our partners and clients worldwide,” said Dr Darío Gil, Senior Vice President and Director of Research at IBM. “This work will prove foundational for the coming era of quantum-centric supercomputing.”
This year, IBM is on track to deliver its 1,121-qubit Condor processor, which will push the limits of single-chip processors and of controlling large systems.
Richard Hopkins, Distinguished Engineer at IBM and Fellow of the Royal Academy of Engineering, has spent the last 30 years with IBM, working with clients and solving their problems, particularly in the UK government space.
As he explains, the state of quantum for IBM in 2023 is unique in a number of ways. “First of all,” he describes, “it's probably the first time in IBM that I've seen a roadmap published quite so far out as we've published it.
“We're publishing at least three years out every year. In that regard, it is entirely different from anything else I've seen us do in 30 years.”
This roadmap is founded on principles of transparency, Hopkins explains. “We recognised very early on that there was a potential for hype and misinformation,” he adds. “As a result, we have adopted a transparency policy to the likes of which I've never seen, and that has worked enormously well.”
The roadmap to advantage
With the potential to solve certain problems up to 100 million times faster than traditional computers, quantum computing could speed up processes on a monumental scale. Quantum computers use qubits, which can exist in a superposition of 1 and 0 simultaneously, allowing these machines to tackle much more complex problems. Quantum computers might one day run revolutionary algorithms that could, for example, search unwieldy databases or factor large numbers, including, importantly, those used in encryption.
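The idea of a qubit in superposition can be illustrated with a minimal pure-Python sketch (this is a toy statevector simulation, not IBM's hardware or the Qiskit stack): applying a Hadamard gate to a qubit in the |0⟩ state puts it into an equal superposition, so a measurement is equally likely to return 0 or 1.

```python
import math

# A qubit state is a pair of complex amplitudes (a, b) for |0> and |1>,
# with |a|^2 + |b|^2 = 1. Measuring yields 0 with probability |a|^2.
def hadamard(state):
    """Apply a Hadamard gate, which puts a basis state into superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Return the measurement probabilities for outcomes 0 and 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

qubit = (1.0, 0.0)        # start in the definite state |0>
qubit = hadamard(qubit)   # now an equal superposition of |0> and |1>
p0, p1 = probabilities(qubit)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

Real quantum algorithms exploit interference between many such amplitudes at once, which is what a classical simulation like this cannot do at scale.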
“This year we will hopefully announce our Condor processor with over 1,000 qubits,” comments Hopkins. “That, in itself, is a good step forward, but it's not enough in its own right.”
IBM’s quantum roadmap also explains its plans to bridge multiple quantum processing units (QPUs) together. These techniques, it says, will be used to help it reach a 4,000-qubit machine.
“We've also announced that we intend to improve the gate fidelities and the coherence of our chips, so that we'll be able to execute 100 qubits with 100 gates: a 100 by 100 challenge with error mitigation,” Hopkins adds. “That will be another leap forward in terms of accuracy and coherence.
“It's a multilayer roadmap and the idea is that all those things are going to come together,” Hopkins explains. “These aren’t scientific projects, they are engineering projects, all aimed at actually delivering real-world benefits for our clients.”
Quantum machine learning
Research has shown that quantum computers have the potential to boost the performance of machine learning (ML) systems, and may eventually power efforts in fields spanning everything from drug discovery to fraud detection.
Most people will have been inconvenienced at some point by a payment being refused, or may perhaps have fallen victim to a fraudulent transaction. Algorithms used in the payment card industry mean this is, fortunately, a rare occurrence. But, as Hopkins explains, even small improvements to those algorithms will have a sizeable impact, with evidence already demonstrating that quantum computers can help to resolve these common problems.
“We've recently published a paper where we took information about real debit and credit card details and transactions, and passed them through a quantum algorithm and two conventional algorithms, XGBoost and Random Forest,” Hopkins says.
“Even with today's quantum hardware, providing we let the quantum computer select the parameters to predict the fraud, we would get the same level of accuracy out of an intermediate-scale quantum computer,” he explains. “That, in itself, is not bad going, but it doesn't get you to that quantum advantage.”
However, as Hopkins describes, the quantum algorithm was able to make qualitatively different judgements and, as a result, come to different conclusions.
“When we look at these results more closely, we found that, first of all, the quantum algorithm chose different parameters,” he says. “And then, when we looked at the results again, we saw that it was making qualitatively different judgments. The accuracy was the same, but it was making a judgement on different elements and coming to different conclusions in many cases.”
As Hopkins describes, these hybrid applications will have a number of use cases, particularly in the world of ML.
“I think what you're going to see, especially in the ML space, is that these hybrid algorithms will emerge fairly early on, where you are combining the power of an existing algorithm that can run at high speed with a quantum algorithm, which will run slower, but will actually take a qualitatively different decision, using a completely different algorithm,” he says.
“In the commercial space, you'll see these hybrid algorithms begin to dominate, where you're using both together to come up with something that's better than we could do today on a classical computer or even on a supercomputer.”
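The hybrid pattern Hopkins describes can be sketched in a few lines of Python. Everything here is a hypothetical illustration: the scoring rules are stand-ins (the classical model for something like XGBoost, the quantum one stubbed out entirely), but the structure shows the point of the fraud-detection result, where two models of equal accuracy weigh different parameters, so combining their judgements catches more than either alone.

```python
# Hypothetical sketch of a hybrid fraud check: a fast classical model and
# a (stubbed) quantum model score the same transaction on different
# features, and their judgements are combined.
def classical_score(txn):
    # Stand-in for a fast classical model (e.g. XGBoost): keys on amount.
    return 0.9 if txn["amount"] > 5000 else 0.1

def quantum_score(txn):
    # Stub for a quantum kernel model that weighs different parameters,
    # here whether the merchant is outside the cardholder's home country.
    return 0.8 if txn["merchant_country"] != txn["home_country"] else 0.2

def hybrid_flag(txn, threshold=0.5):
    # Flag fraud if either model is confident. The models key on
    # different signals, which is the point of combining them.
    return max(classical_score(txn), quantum_score(txn)) > threshold

txn = {"amount": 120, "merchant_country": "BR", "home_country": "GB"}
print(hybrid_flag(txn))  # True: the quantum-style model catches it
```

In a real deployment the quantum model would be slower, so one plausible design runs it only on transactions the classical model is unsure about.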
The future of quantum
The potential for quantum computing is immense. It can open up new opportunities in AI and ML, with a growing research field in quantum ML identifying ways that quantum algorithms can enable faster AI.
“Ideally we want to completely isolate people from the idea that they're using a quantum machine,” Hopkins says. “The vision is, you write a programme that performs some predictions or optimisation, or generally does what supercomputers are good at. And then, you'll send that off by calling a simple operation. Some of that query will run on a conventional computer, some of which might run on a GPU, some of which will run on one or more QPUs, but the idea is, eventually, it'll be completely invisible to you.”
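The "invisible" dispatch Hopkins describes might look something like the following sketch. All names here (the backend handlers, the `run` entry point, the task kinds) are illustrative assumptions, not IBM's actual API: the caller submits one job through a single operation, and a router decides which backend executes each part.

```python
# Hypothetical sketch of transparent backend routing: the caller makes one
# call, and the router sends each part of the job to CPU, GPU or QPU.
def run_on_cpu(task):
    return f"cpu:{task}"

def run_on_gpu(task):
    return f"gpu:{task}"

def run_on_qpu(task):
    return f"qpu:{task}"

BACKENDS = {
    "preprocess": run_on_cpu,  # classical data preparation
    "train": run_on_gpu,       # heavy linear algebra
    "sample": run_on_qpu,      # the genuinely quantum sub-problem
}

def run(job):
    """Single entry point; which hardware runs what is invisible to the caller."""
    return [BACKENDS[kind](task) for kind, task in job]

job = [("preprocess", "clean"), ("train", "fit"), ("sample", "circuit")]
print(run(job))  # ['cpu:clean', 'gpu:fit', 'qpu:circuit']
```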
As Hopkins explains, quantum is likely to be more useful in the future than many had perhaps anticipated.
“Quantum is going to be a much more powerful capability than, I think, many people were envisioning,” Hopkins concludes. “I think people had in their minds that quantum computers don't become useful until you've got logical qubits. What we're working out is how to get value out of these things for our clients in the near term.
“Eventually, I'm sure we'll get to very large numbers of logical qubits, but we don't want to wait until that point to get value, and neither do our clients. So, we're doing something different.”