Quantum Computing: The UK and Europe play catch-up with the USA and China
24-03-2021 | By Paul Whytock
The “my Quantum computer is bigger than yours” game has played out for many years, and the leading contenders in the Qubit superiority race are the USA and China.
Now Europe wants a seat at the big Quantum table, and there are EU consortia and British-led partnerships aiming not only to develop a hyper-fast computer but, crucially, one with many practical commercial applications.
So what are they up against? Well, the machine to beat at present is the Chinese computer called Jiuzhang, which the Chinese claim is a mere 10 billion times faster than Google’s current offering. China says this gives it Quantum supremacy, but then it would, because that is exactly the term Google used to describe its own Quantum offering.
Is there a difference between the Chinese machine and Google’s? Yes, there is. Jiuzhang makes its calculations using optical circuits, whereas Google’s Sycamore uses superconducting circuits on a chip, a design that more closely resembles classical computers.
But, in the technological chest-thumping world of Quantum computing, there is just one boast that everyone wants to make, and that is, “mine’s the fastest.”
In the need for speed, China’s Jiuzhang computer is claimed to be 100 trillion times faster than supercomputers, meaning it can do in seconds what normal computers would take millions of years to achieve. These figures are impressive, but a word of caution: such claims depend on what test the Quantum computer was given to perform, as different tests can produce very different computational speed results.
Nevertheless, the speed of true Quantum computing is mind-boggling, to say the least, and the real question is how these speeds are achieved. Qubits are how.
Normal computers can only calculate using bits that have just two working states, 0 or 1. Quantum machines use Qubits, which can exist in a superposition of states and so effectively represent numerous values simultaneously. This is what gives them a tremendous speed boost. Link a collection of these Qubits together in a synchronised, entangled arrangement, and they can calculate in seconds what would take a conventional computer millions of years.
Qubits can be built from atoms, ions, photons or electrons, and they give Quantum computers their inherent parallelism. This means that whereas a conventional computer works on a single calculation at a time, a Quantum computer can, in effect, work on millions simultaneously.
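For readers who like to see the arithmetic behind that claim, a register of n classical bits holds one of 2^n values at any moment, whereas an n-Qubit register is described by 2^n complex amplitudes all at once. The short Python sketch below is purely illustrative and is not drawn from any of the machines discussed in this article; it simply builds that state vector for three Qubits to show how the description doubles with every extra Qubit.

```python
# Purely illustrative sketch: an n-Qubit register is described by 2**n complex
# amplitudes, so each extra Qubit doubles the size of the description.
import numpy as np

def uniform_superposition(n_qubits: int) -> np.ndarray:
    """Return the equal superposition over all 2**n basis states of an n-Qubit register."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

state = uniform_superposition(3)                      # 3 Qubits -> 8 amplitudes
print(len(state))                                     # 8
print(np.isclose(np.sum(np.abs(state) ** 2), 1.0))    # measurement probabilities sum to 1
```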
The Quantum Failings
But it’s not all about speed. Quantum computing falls short in three big areas: firstly, exactly what tests were used to achieve the claimed speed results; secondly, whether Quantum computers are reliable; and thirdly, what practical applications they can handle that would make them a commercially viable proposition.
The point about speed tests is that not all speed tests are created equal. Quantum computers have to be set up to perform a specific function. To test Jiuzhang, the computer had to calculate the output of a complex circuit that used light. It detected an average of 40 outputs, and its time to do that was a mere three minutes, whereas one of the world’s fastest supercomputers would have taken two billion years to reach the same conclusion. But this was a specially tailored test and didn’t necessarily have relevance to broader applications in the commercial world.
Google’s Sycamore testing also came under scrutiny from rival IBM, and again the discussion came down to how relevant the testing was in terms of real-world practicality.
Deep Thought Cannot Compete
These out-of-this-world performance figures make The Hitchhiker’s Guide to the Galaxy’s supercomputer Deep Thought look decidedly pedestrian: it took Deep Thought 7.5 million years to decide that the answer to the question of life, the universe and everything was 42.
Another operational shortfall with Quantum computing is reliability. By their very nature, Qubits are fragile and easily disturbed, and they need a near-perfect, temperature-controlled environment that is free of vibrations and interference from their atomic surroundings. Such an environment can, of course, be created to keep the Qubits happy. Still, the length of time they will operate efficiently and accurately is minimal before they slow down and lose their Quantum coherence.
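To put a rough number on that fragility, Quantum information is commonly modelled as decaying exponentially with a characteristic coherence time, often labelled T2. The Python sketch below uses an assumed, purely illustrative coherence time of 100 microseconds, not a figure quoted by any of the groups mentioned here, simply to show how quickly the usable computation window closes.

```python
# Simple exponential-decay model of Qubit coherence. T2 below is an assumed,
# illustrative value, not a specification from any machine in this article.
import math

T2_SECONDS = 100e-6  # assumed coherence time of 100 microseconds

def remaining_coherence(t_seconds: float, t2: float = T2_SECONDS) -> float:
    """Fraction of the original coherence left after t_seconds under exp(-t/T2) decay."""
    return math.exp(-t_seconds / t2)

for t in (10e-6, 100e-6, 500e-6):
    print(f"after {t * 1e6:.0f} microseconds: {remaining_coherence(t):.1%} coherence remaining")
```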
So while we are all astonished at examples of their computational speeds, Quantum computers are not anywhere near becoming a commercially viable proposition.
Enter one of the first European consortia with ambitions to change all that, snappily titled the German Quantum Computer based on Superconducting Qubits (GeQCoS) group. Within it, Munich chip-maker Infineon and scientists from five research institutes in Germany aim to drive forward the development and industrialisation of Quantum computing.
According to Infineon, Quantum computers have the potential to replace existing conventional computers in specific applications. They could, for example, calculate simulations of complex molecules for the chemical and pharmaceutical industry, complicated optimisations for the automotive and aviation industry, or new findings from the analysis of complex financial data.
The project is funded by the German Ministry of Education and Research and hopes to create a Quantum processor based on superconducting Qubits and demonstrate its special capabilities on a prototype within four years. Working together to achieve this are scientists at the Walther Meissner Institute of the Bavarian Academy of Sciences and Humanities and the Technical University of Munich, the Karlsruhe Institute of Technology, the Friedrich Alexander University of Erlangen-Nuremberg, the Forschungszentrum Jülich and the Fraunhofer Institute for Applied Solid State Physics, together with Infineon.
“If we in Germany and Europe don’t want to be dependent for this future technology solely on American or Asian know-how, we must move forward with the industrialisation now,” explained Sebastian Luber, senior director of technology & innovation at Infineon.
Naturally, Germany is not alone in its bid to gain Quantum supremacy. The VTT Technical Research Centre of Finland is also part of a consortium seeking a Quantum technology lead.
The Key Processing Ingredient
It believes, with good reason, that superconducting processors could become a key ingredient in creating the next generation of supercomputers. Firstly, they could help tackle the major challenge of scaling up Quantum computers; secondly, they could speed up traditional supercomputers and drastically cut their power consumption.
A multidisciplinary research project led by VTT will tackle one of the main technical challenges to achieving this: the transfer of data to and from the low temperatures required for superconductivity.
The VTT consortium consists of Tampere University in Finland, KTH Royal Institute of Technology in Sweden, ETH Zürich in Switzerland and PTB, the national metrology institute of Germany, and corporate partners Single Quantum in the Netherlands and Polariton Technologies in Switzerland. It is a three-year project.
A Quantum computer’s processing power is based on superconducting Qubits operating at extremely low temperatures, and those Qubits are typically controlled by conventional electronics at room temperature, connected through electrical cables. However, when the number of Qubits eventually rises to the required level of hundreds of thousands, the matching number of control cables will generate an extreme heat load that considerably inhibits the Quantum processor’s performance.
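A back-of-envelope sketch makes the scaling problem clear. Both figures in the Python snippet below are assumptions made purely for illustration: the cooling budget borrows the 25 µW at 20 mK quoted later in this article for the ProteoxLX, and the heat leak per control line is a hypothetical round number rather than a measured value.

```python
# Back-of-envelope sketch of why one cable per Qubit does not scale.
# Both numbers are illustrative assumptions, not measured or vendor-quoted values
# (the cooling budget simply reuses the 25 uW at 20 mK figure cited later in this article).
COOLING_BUDGET_UW = 25.0       # assumed cooling power available at the coldest stage, in microwatts
HEAT_LEAK_PER_LINE_UW = 0.01   # assumed heat leak per control cable, in microwatts

def max_control_lines(budget_uw: float = COOLING_BUDGET_UW,
                      leak_uw: float = HEAT_LEAK_PER_LINE_UW) -> int:
    """Largest number of control lines the cold stage tolerates under this simple model."""
    return int(budget_uw // leak_uw)

print(max_control_lines())  # 2500 lines, far short of the hundreds of thousands of Qubits cited above
```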
One approach is to control the Quantum processor with a nearby classical processor. A promising candidate is single flux Quantum (SFQ) technology, which emulates the logic of traditional computers but uses superconducting circuits instead of conventional semiconductors. Because it requires very low operating temperatures, SFQ has rarely been used in traditional computers; that disadvantage, however, turns into an advantage when it is combined with superconducting Quantum computers.
But a major challenge remains. Calculation instructions come to the SFQ processor from a conventional supercomputer, and calculation results must be sent back from the SFQ processor to the same machine. This requires data transfer between extremely low temperatures and room temperature, which doesn’t suit conventional semiconductors.
The VTT project’s vision is to replace electrical cables with optical fibres and suitable converters which convert optical signals to electrical signals and vice versa. Unlike existing solutions, these components must be able to operate at low temperatures. This will require the development of innovative converters that can drive and read out a simple SFQ processor.
Besides Quantum computers, conventional supercomputers could also benefit from the development of optical connections for SFQ technology. A major limitation of supercomputers is the extremely high power consumption of their CPUs and GPUs, due to the energy dissipated by their silicon chips. Replacing silicon chips with superconducting SFQ chips in GPUs could have a notable impact on supercomputers’ performance and power consumption.
The Deep Freeze
Here in the United Kingdom, Oxford Instruments NanoScience has announced a significant innovation in its Cryofree dilution refrigerator technology. It believes its ProteoxLX dilution refrigerator will take Quantum computing research to the next level and enable its commercialisation globally.
Since the launch of Proteox at APS Physics last year, Oxford Instruments has announced partnerships with the University of Glasgow, Rigetti and Oxford Quantum Circuits. Oxford Instruments NanoScience has also secured significant wins outside Europe, most recently with Proteox being selected by SpinQ Technology in China.
“NanoScience is committed to driving leadership and innovation to support the development and commercialisation of Quantum computing around the world,” explained Stuart Woods, managing director of Oxford Instruments NanoScience.
The ProteoxLX can maximise Qubit counts thanks to a large sample space and ample coaxial wiring capacity, low-vibration features that reduce noise and support long Qubit coherence times, and full integration of signal-conditioning components.
The LX also provides two fully customisable secondary inserts for an optimised layout of cold electronics and high-capacity input and output lines, all fully compatible and interchangeable across the Proteox family. Finally, the ProteoxLX offers 25 µW of cooling power at 20 mK, a base temperature below 7 mK, and twin pulse tubes providing up to 4.0 W of cooling power at 4 K.
All these UK and EU corporate and academic consortium-driven projects to advance Quantum computing should give US and Chinese technologists a genuine challenge in the race to develop a commercially viable machine. Still, I don’t expect either the US or China to rest on its Qubit laurels.