Quantum datacentre deployments: How they are supporting evolving compute projects


Put lots of qubits together and you have quantum computing, requiring datacentres that can support it.

A quantum datacentre, though, is not just a building with a quantum computer in it. And there remain questions about what quantum computers should look like and how they should connect in a datacentre context, says Andrew Lord, senior manager of optical and quantum networks research at BT Group.

A datacentre in essence delivers connectivity, often connecting multiple people or customers with racks, compute equipment and the rest. Adding quantum compute as part of an overall compute resource that answers questions should help with certain challenges, although plenty of uncertainty remains.

Also focused on the long-term challenges of quantum datacentres and their connections at BT is Emilio Hugues-Salas, a specialist in optical and quantum research. Bridging the "hard physics" gap between qubit-based quantum computing and classical computing's zeros and ones could take five to 10 years, he says, even though the work is moving "quite fast".

A qubit is the quantum analogue of a binary bit of zeros and ones in classical computing: a two-state, or two-level, quantum mechanical system. When measured, an electron's spin, for example, is found to be up or down with certain probabilities attached.
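A minimal sketch of this idea: a qubit's state can be written as two probability amplitudes (restricted to real numbers here for simplicity), and repeated measurement yields each outcome with a probability given by the squared amplitude. The function name and values below are illustrative, not from any quantum SDK.

```python
import random

# A qubit can be written as a|0> + b|1>, where |a|^2 + |b|^2 = 1.
# a and b are (real, for simplicity) probability amplitudes.

def measure(a: float, b: float, shots: int = 10000) -> dict:
    """Simulate repeated measurement of the state a|0> + b|1>."""
    assert abs(a * a + b * b - 1.0) < 1e-9, "amplitudes must be normalised"
    counts = {"0": 0, "1": 0}
    for _ in range(shots):
        # The outcome |0> occurs with probability a^2, |1> with b^2.
        outcome = "0" if random.random() < a * a else "1"
        counts[outcome] += 1
    return counts

# An equal superposition: both outcomes appear roughly half the time.
equal = 1 / 2 ** 0.5
print(measure(equal, equal))
```

Unlike a classical bit, the state before measurement holds both amplitudes at once; measurement collapses it to a single zero or one.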

“Looking at definitions, at the requirements and architecture of a quantum datacentre, suggests an architecture that enables secure access to quantum compute, where you not only have the traditional GPUs but quantum processing units you can use depending on requirement and application,” Hugues-Salas says.

“But it’s like back when we didn’t know what the internet was for,” adds Lord. “Honestly, a lot is about making quantum available just to see what people will do with it – anything from finding new drugs to modelling basic molecules or photosynthesis.”

In such energy-hungry areas, "just one" major breakthrough could prove quantum's value and justify the effort, because such problems are too costly to tackle with current technology.

“For example, I’d like to minimise energy consumption in the BT network, but it’s just too hard and will take too long. Then, by the time I’ve done it, things have changed,” says Lord. “And you might only need to access a quantum computer for a few crucial seconds to do it.”

BT also works on quantum networking and computing, including “anything quantum that might overlap with BT interests”, such as trialling quantum interconnect into “regular” Equinix datacentres in London.

The latter is a near-commercial grade project that's mainly about secure linking. "Customers can then put their data onto the cloud securely, do operations and connect multiple of their own customers via the cloud," Lord says.

Solving problems in parallel

When it comes to quantum computing in general, Owen Rogers, senior research director of cloud computing at Uptime Intelligence, has an analogy: imagine you have a plastic combination padlock, but to unlock the padlock without the code, you do not want to have to try every possible combination.

“But let’s say that on the combinational rings, there’s a tiny bit of metal that happens to be on the correct number. Conceivably you might simply wave a magnet across the padlock to correctly interlock and unlock the correct numbers,” Rogers says. “Quantum computing is really a way of solving things in parallel, using the uncertainty of what particles do.”
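Rogers's padlock analogy maps onto what is formally called unstructured search. The standard quantum approach here is Grover's algorithm (not named in his remarks, but the canonical example), which finds the right combination in roughly the square root of the number of tries a classical search needs. The numbers below are illustrative:

```python
import math

# A 4-digit combination padlock has N = 10^4 possible settings.
N = 10 ** 4

# Classical brute force: worst case, every combination must be tried.
classical_worst = N

# Grover's algorithm needs about (pi/4) * sqrt(N) iterations.
grover_iters = math.floor(math.pi / 4 * math.sqrt(N))

print(classical_worst)  # 10000
print(grover_iters)     # 78
```

The quadratic saving looks modest at this scale, but it compounds: for a search space of 10^12 settings, roughly a million times fewer iterations are needed.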

Remaining challenges, though, are multi-faceted. Obviously, quantum algorithms require special skills, and then there are technical and engineering problems.

For instance, the more qubits you have, the more susceptible you are to noise. A sensitive individual particle must be kept in a state where you can control it. That means a cooling requirement, even cryogenic cooling.

“You have to remove as much interference as possible,” says Rogers. “And the datacentre has to have those abilities.”

Quantum computing research today is costly, with only a small chance of success, but the payoff if quantum computing can be made to work might be astronomical: for instance, if a team can quickly solve something that would previously have been impossible, or possible only with simplifying assumptions.

“However, we might reach a certain level of qubits and then the interference is exponentially worse, and we just can’t increase them, for example,” says Rogers.

In the UK, multiple projects in development include five new quantum research hubs announced in July 2024. Among these is Heriot-Watt University’s Integrated Quantum Networks (IQN) hub. The idea is that “quantum internet” linking quantum computers could deliver massive compute, leveraging quantum entanglement and memory.

Another is the industry-partnered QCI3 hub at Oxford University. QCI3 researches interconnected and integrated quantum computing implementations, eyeing an estimated potential $1.3tn market for quantum machine learning (ML) and neural networks by 2030.

Investment needed to realise gains

Dominik Andrzejczuk, CEO at QDC.ai, which has investments in two Oxford quantum hardware companies – trapped-ion technology firm Oxford Ionics and full-stack photonics company Orca Computing – confirms that the engineering challenges are taking time to solve.

That said, ion-trap architectures are good at controlling “very high-quality” qubits with the same CMOS fabrication techniques as superconducting qubits, with Andrzejczuk adding: “That means that potentially they could scale easily.”

His background in physics drew him to quantum, he tells Computer Weekly, but he sees a schism in the quantum sector between scientific computing, which leans on operations research from first principles, and Silicon Valley's AI work.

With artificial intelligence (AI), you take a mass of raw data and a machine figures out the function. That strength is also a weakness, because you need so much hardware to make it fast.

However, as has become apparent with certain large language models (LLMs), this doesn't scale well, with Andrzejczuk pointing out: "OpenAI is burning billions of dollars every single year."

A scientific computing approach starts from the other end, with a physicist or mathematician examining the dataset, then developing the function to fit onto that data and then indicating the function, parameters and constraints to the machine. Related operations research can be highly specific to use cases in areas such as logistics with myriad variables and constraints – that’s harder for machine learning.
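The contrast can be made concrete with a toy sketch. In the scientific-computing style, the modeller supplies the function's form and constraints, and the machine only searches for the best parameter values, rather than learning the whole function from data. The one-parameter model and data points below are hypothetical:

```python
# Scientific-computing style: the modeller specifies the model y = a * x
# (a hypothetical one-parameter function) and the machine only has to
# find the parameter a that best fits the observations.

data = [(1, 2.1), (2, 3.9), (3, 6.2)]  # (x, y) observations

def sse(a: float) -> float:
    """Sum of squared errors for the candidate parameter a."""
    return sum((y - a * x) ** 2 for x, y in data)

# A tiny grid search over 0.00..3.00 stands in for a real optimiser.
best_a = min((a / 100 for a in range(0, 301)), key=sse)
print(round(best_a, 2))  # 2.04
```

A machine learning system would instead need far more data, and far more compute, to recover the same relationship without being told its form.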

“One perfect use case is airlines and transport. If you’re delayed or cancelled, you have to call somebody to rebook your ticket. The magnitude of data for an ML algorithm to solve that is astronomical,” Andrzejczuk says.

Representing an optimisation problem in a classical computer can be simple, using integer values. But in a quantum context, quadratic unconstrained binary optimisation (QUBO) means your variables have to be binary rather than integers.

“Think of 600 trucks as some sort of sequence of ones and zeros,” Andrzejczuk says. “Then we need an extraction layer. We need to convert a problem that is, let’s say, semantically written in plain English, into some sort of binary code. Those tools just don’t exist right now.”
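A minimal sketch of what that conversion layer would do, on a deliberately tiny problem: an integer decision variable is expanded into binary bits, and a constraint becomes a quadratic penalty that a QUBO solver (or quantum annealer) would minimise. The brute-force loop here merely stands in for the quantum hardware:

```python
from itertools import product

# Toy QUBO sketch: choose an integer x in [0, 7] that should equal 5.
# Encode x with three binary variables: x = 4*b2 + 2*b1 + 1*b0.
# The constraint x == 5 becomes the quadratic penalty (x - 5)^2,
# which is minimised over the bits.

def energy(bits):
    b0, b1, b2 = bits
    x = 4 * b2 + 2 * b1 + b0
    return (x - 5) ** 2

# Brute force over all 2^3 bit strings stands in for the quantum solver.
best = min(product([0, 1], repeat=3), key=energy)
print(best, 4 * best[2] + 2 * best[1] + best[0])  # (1, 0, 1) 5
```

Scaling this hand encoding to 600 trucks with real-world constraints is exactly the tooling gap Andrzejczuk describes.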

Further investment in the billions of dollars is still needed "to push it forward", but if it works, "everybody wins", Andrzejczuk adds.

Developing quantum potential and proofs

Jerry Chow, IBM fellow and director of quantum systems and runtime technology, agrees that it's early days, but says today's systems are more than physics experiments, with progress being made on deployable computational tools and quantum-centric supercomputing: "Right now [at IBM], we are exclusively putting out systems of 100 qubits or more. And we're in a world that's starved of compute."

IBM operates 14 utility-scale quantum systems, including quantum datacentres in Poughkeepsie, New York, and Ehningen, Germany, as well as dedicated systems colocated with its clients.

IBM’s Quantum Network comprises about 250 enterprises, research institutions, startups, universities and industry leaders, including 80 in Europe. Its quantum roadmap factors in the time predicted to solve remaining challenges out to 2033.

“The point here is that quantum does certain workloads very differently,” says Chow. “We see that Quantum Network as how we’ll find and use these tools for quantum advantage.”

Multiple solutions will be combined, including QPUs, GPUs and CPUs. Hosting at Poughkeepsie has comprised “several” to “double-digit” numbers of quantum computers, varying by the processor used.

“At what we call utility scale, certainly there are quantum circuits beyond exact simulation with CPU or GPU resources. The next best ways of handling some of these circuits, in fact, are with maybe some tensor network methods or some kinds of approximate computing methods that leverage high performance nodes,” says Chow.
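Why such circuits outgrow exact simulation comes down to simple arithmetic: an exact (statevector) simulation must store one complex amplitude per basis state, so memory doubles with every extra qubit. A back-of-the-envelope sketch:

```python
# Exact (statevector) simulation stores 2^n complex amplitudes.
# At 16 bytes per double-precision complex number, memory doubles
# with every added qubit.

def statevector_bytes(n_qubits: int) -> int:
    """Memory needed to hold a full n-qubit statevector."""
    return (2 ** n_qubits) * 16

for n in (20, 30, 40, 50):
    print(n, statevector_bytes(n) / 2 ** 30, "GiB")
```

At 30 qubits the statevector already needs 16 GiB; at 50 it needs 16 PiB, far beyond any classical machine, which is why approximate methods such as tensor networks become the fallback at utility scale.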

Performance depends on qubit numbers or scale, on speed, and on quality or error rate – the accuracy with which quantum circuits are executed. Users can try 127-qubit systems for free; IBM offers 10 minutes a month of Quantum Platform execution time, along with systems, documentation and learning resources.

IBM hopes thereby to foster scientific and even business-related demonstrations delivering speed, accuracy or cost-effectiveness, as well as the ecosystem development already in train, from the domain-specific Qiskit function service to third-party middleware-type integrations.


