Closing in on quantum computing with error mitigation


The latest machine on IBM’s quantum computing roadmap, Heron, has been given a hardware and software boost as the company pushes towards its goal of error correction.

Error correction is seen as the holy grail of quantum computing, a breakthrough that would open the gates to commercial adoption. This may be many years away, but IBM Heron offers error mitigation, which the company describes as techniques that allow users to mitigate circuit errors by modelling the device noise at the time of execution.

In other words, it is something software developers need to do when programming IBM quantum computers to work around the error-inducing noise inherent in today’s quantum computing technology.
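One widely used error-mitigation technique of the kind the article describes is zero-noise extrapolation: run the same circuit at deliberately amplified noise levels, then extrapolate the measured expectation value back to the zero-noise limit. The sketch below illustrates the idea in plain Python; the noise model and numbers are invented for illustration, and `noisy_expectation` is a stand-in for a real hardware measurement.

```python
# Sketch of zero-noise extrapolation (ZNE): measure an observable at
# several amplified noise scales, fit a line, and read off the value
# at scale = 0 as the mitigated (noise-free) estimate.

def noisy_expectation(scale: float) -> float:
    """Placeholder for an expectation value measured at a given
    noise-scale factor (1.0 = the device's native noise level)."""
    ideal = 1.0   # assumed true, noise-free value
    decay = 0.1   # assumed noise-induced bias per unit of scale
    return ideal - decay * scale

def zero_noise_extrapolate(scales):
    """Linear least-squares fit of expectation vs. noise scale,
    evaluated at scale = 0 to estimate the noise-free value."""
    ys = [noisy_expectation(s) for s in scales]
    n = len(scales)
    mean_x = sum(scales) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(scales, ys))
             / sum((x - mean_x) ** 2 for x in scales))
    return mean_y - slope * mean_x   # intercept at scale = 0

estimate = zero_noise_extrapolate([1.0, 1.5, 2.0])
print(round(estimate, 6))   # recovers the ideal value of 1.0
```

Because the toy noise model is exactly linear, the fit recovers the ideal value perfectly; on real hardware the extrapolation only approximates it, which is why this is mitigation rather than correction.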

“Advances across IBM Quantum hardware and Qiskit are enabling our users to build new algorithms in which advanced quantum and classical supercomputing resources can be knit together to combine their respective strengths,” said Jay Gambetta, vice-president of IBM Quantum.

“As we advance on our roadmap towards error-corrected quantum systems as a pillar of the future of computing, the algorithms discovered today across industries will be key to realising the full potential of unexplored computational spaces created by the convergence of QPUs [quantum processing units], CPUs [central processing units], and GPUs [graphics processing units].”

To tie in with the Heron announcement, IBM has introduced several new tools in its Qiskit software development kit. These include the Qiskit Transpiler Service, which uses artificial intelligence (AI) to power the optimisation of quantum circuits for quantum hardware, and Qiskit Code Assistant, which helps developers easily generate quantum code with IBM Granite-based generative AI models.
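To give a flavour of what circuit optimisation means, the toy below shows one classic transpiler simplification: adjacent self-inverse gates on the same qubits cancel to the identity. Real transpilers, including Qiskit’s AI-powered service, do far more than this; the circuit representation here is a made-up list of (gate, qubit) tuples purely for illustration.

```python
# Minimal sketch of one transpiler pass: cancelling adjacent
# self-inverse gates (e.g. two H gates or two X gates in a row on the
# same qubit collapse to nothing), shrinking the circuit.

SELF_INVERSE = {"h", "x", "z", "cx"}

def cancel_adjacent_inverses(circuit):
    """Scan the gate list, removing back-to-back identical
    self-inverse operations."""
    out = []
    for op in circuit:
        if out and out[-1] == op and op[0] in SELF_INVERSE:
            out.pop()          # the pair cancels to the identity
        else:
            out.append(op)
    return out

circuit = [("h", 0), ("h", 0), ("x", 1), ("cx", (0, 1)), ("cx", (0, 1))]
print(cancel_adjacent_inverses(circuit))   # only ("x", 1) survives
```

Shorter circuits matter on noisy hardware because every extra gate adds error, which is why optimisation and error mitigation go hand in hand.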

It is also adding Qiskit Serverless, which enables software developers to run initial quantum-centric supercomputing approaches across quantum and classical systems, and the IBM Qiskit Functions Catalog, which makes services available from IBM, Algorithmiq, Qedma, QunaSys, Q-CTRL and Multiverse Computing.

Error correction is the breakthrough

Tobias Lindstrom, head of science for NPL’s department of quantum technology, believes a step change in quantum computing will happen once error correction is solved.

“Today, we’re limited by scaling because we don’t have error correction,” he said. “Once you can build a logical error-corrected qubit, as far as I understand, there’s nothing stopping you from building more of them. It is an engineering challenge.”

Once there is error correction, “you may spend more money but there is no limit to the scaling”, he added, in response to the question of whether a working quantum computer would follow the same rules as Moore’s Law, which observes that the number of transistors on a processor doubles every two years for the same price.

While there has been a lot of progress in schemes focused on error correction, Lindstrom expects quantum computing adoption will accelerate when the techniques are eventually mastered.

Even if such a computer with perhaps 10,000 qubits has a ticket price of $1bn, Lindstrom believes the price is not likely to be a barrier for some organisations and governments: “I don’t think that’s going to stop people when you are talking about something as useful as a quantum computer.”

What this means is that quantum computers will initially be purchased only by governments or very large companies.

There is a certain class of problem that Lindstrom and many in the industry feel quantum computing will be able to optimise. Not surprisingly, he said, “quantum-type problems” such as quantum chemistry are among the big opportunities, where quantum computing can be applied in material science, leading to opportunities such as the development of greener technologies.

While not fully fledged computers, Lindstrom described UK Research and Innovation’s quantum test bed programme as “an important step”. These “demonstrators” of quantum technology provide a way for quantum computing firms to develop machines that organisations can have direct access to in the UK.

Solving problems and improving skills

Like IBM’s Gambetta, Lindstrom sees quantum devices as part of the mix that will be used to accelerate certain workloads: “A good analogy is probably something like using GPUs or FPGAs [field programmable gate arrays] in the context of high-performance computing. You’re still logging onto a regular computer, but for certain problems, you’re using a GPU or an FPGA.”

The era of quantum computing will, as with GPUs, involve the quantum processor effectively acting as an accelerator or co-processor for the CPU. Lindstrom believes that, in an ideal world, a programmer would use their preferred programming language, and their compiler tool would then look through the source code, decide which steps in the program require optimisation, and assess whether each is best served by offloading the task to a quantum processor.

“That’s the ideal scenario, in terms of user friendliness, but it may not be the best way to use existing resources,” he said. 
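The offloading workflow Lindstrom describes can be sketched as a simple dispatcher that routes each step of a program to the backend assumed to suit it best. Everything here is hypothetical for illustration: the task names, the `QUANTUM_FRIENDLY` classification rule and the backend functions are invented, not a real API.

```python
# Hypothetical sketch of a scheduler that routes program steps to a
# quantum or classical backend, in the spirit of offloading work to a
# GPU or FPGA. The classification heuristic is invented.

QUANTUM_FRIENDLY = {"molecular_simulation", "combinatorial_optimisation"}

def run_on_cpu(task: str) -> str:
    return f"{task}: executed on CPU"

def run_on_qpu(task: str) -> str:
    return f"{task}: offloaded to QPU"

def dispatch(tasks):
    """Route each task to the backend assumed to suit it best."""
    results = []
    for task in tasks:
        backend = run_on_qpu if task in QUANTUM_FRIENDLY else run_on_cpu
        results.append(backend(task))
    return results

for line in dispatch(["data_loading", "molecular_simulation", "reporting"]):
    print(line)
```

In practice, as Lindstrom notes, such fully automatic dispatch may not make the best use of scarce quantum resources, which is why specialist programmers remain part of the picture.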

For Lindstrom, there needs to be a group of specialist programmers who understand the computer architecture in depth: “I think a good analogue would be classical computers in the 1980s, where people were programming in assembly language to squeeze the most performance out of the hardware.”

Looking at current industry efforts, Lindstrom said there is work under way to make quantum computing more accessible to people who do not necessarily have an in-depth background in the technology, but that goal has not yet been reached.

“For the foreseeable future, you will need a second category of people as well who really understand quantum computing and who can formulate the problem before they even start writing the code,” he said.

From a skills perspective, as CIOs plan for a future where quantum computing is part of the technology mix, Lindstrom sees a similar story to the upskilling that was needed for GPUs.

“People are GPU-aware because, again, it has been part of the computing ecosystem for so long, but they don’t necessarily need to know how to build a GPU – they just need to understand the APIs [application programming interfaces] and what problems GPUs can be used for.”


