The famous physicist Richard Feynman once said: "If you think you understand quantum mechanics... then you don't understand quantum mechanics." In fact, he claimed that not even he fully understood the laws of the microscopic world, where everything is and is not at the same time. Yet despite being a field far removed from everyday intuition, it has been on everyone's lips in recent years: concepts such as computation, communication, or even "supremacy" followed by the word "quantum" have sparked unusual interest, made headlines around the world, and come to stand for the promise of a disruptive technology destined to change our world. Quantum computers, we are told, will complete in days, minutes, or seconds tasks that would take classical computers millions of years, an extraordinary leap for a world of Big Data, artificial intelligence, and machines capable of learning on their own (so-called machine learning). Designing new personalized drugs, or predicting movements in financial markets and optimizing investments, would become possible with these devices. But there are drawbacks too: some warn, for example, that all current cybersecurity is already threatened by their unstoppable development. So what is really behind it? Where do we stand? Are we facing an imminent revolution, or just overwrought excitement?
The main difference between a quantum computer and a classical computer lies in how they encode and process information. Our computers work with 'bits,' a binary language that reduces all information to ones and zeros. In quantum computing, however, systems 'speak' in 'qubits,' which can be 1 and 0 at the same time (following the same principle that governs the famous Schrödinger's cat, alive and dead at once), and that is what multiplies the potential performance of this technology so dramatically. And not only that: qubits can also exhibit a phenomenon called quantum entanglement, whereby their states remain correlated even across enormous distances, with no transmission channel of any kind between them, further expanding their possibilities. However, we still do not have the hardware, the machines, capable of exploiting these qualities efficiently.
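To make that "both at once" idea a little more concrete, here is a minimal sketch in plain Python with NumPy (our own illustration, not code from any of the groups mentioned in this article): a qubit is just two complex amplitudes, and their squared magnitudes give the odds of reading a 0 or a 1 when it is measured.

```python
# Minimal sketch: one qubit in superposition, and two entangled qubits,
# simulated with plain NumPy. Purely illustrative, not a real quantum SDK.
import numpy as np

# A qubit is a vector of two complex amplitudes; |amplitude|^2 gives the
# probability of reading 0 or 1 when we measure.
zero = np.array([1, 0], dtype=complex)        # the state |0>
one = np.array([0, 1], dtype=complex)         # the state |1>
superposition = (zero + one) / np.sqrt(2)     # "0 and 1 at the same time"

print(np.abs(superposition) ** 2)  # [0.5 0.5] -> half the readings give 0, half give 1

# Two entangled qubits (a Bell state): the joint state cannot be split into
# "qubit A is this, qubit B is that"; measuring one fixes the other.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
print(np.abs(bell) ** 2)  # [0.5 0. 0. 0.5] -> outcomes 00 or 11, never 01 or 10
```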
A very fragile and error-prone technology
The ENIAC, one of the first classical computers that could perform multiple functions and be reprogrammed (US Army Photo)
"Many people like to post a photo of the ENIAC, that supercomputer from the 1940s that took up a single room," says Juan José García Ripoll, a researcher in the Quantum Information and Foundations of Quantum Theory group at IFF-CSIC. "I think that photo is a step further than where we are in quantum computing. We're still learning how to add." García Ripoll explains that quantum computing has two open fronts: on the one hand, we must continue researching basic science or algorithms, which are similar to the programs a quantum computer can execute; and, on the other, we must address the technology, since right now, maintaining a qubit for even a few seconds requires an almost prohibitively expensive infrastructure: highly sensitive superconducting circuits are used that must maintain incredibly low temperatures, around -272°C, so that energy dissipation doesn't degrade quantum information. Or they must be subjected to very low pressures and, at the same time, isolated from the Earth's magnetic field. If these requirements are not met, a lack of coherence or quantum decoherence occurs, and all operations are corrupted. In other words, the quantum computer fails to function.
"In ten years, there will be industrial applications. And ten years isn't that long."
"It's true that we're in our infancy, just beginning to enter the field of engineering," explains Sergio Boixo, Google's chief scientist for quantum computing theory. "But being in its infancy doesn't mean we won't advance rapidly. In ten years, there will be industrial applications. And ten years isn't that long." The ideas of this engineer and mathematician from León gave rise to the theoretical part of one of the most important (and media-covered) milestones in the field in recent years: quantum supremacy. Despite its name, this is the practical demonstration that a quantum computer can perform a task that would take the best classical supercomputer so long that it wouldn't be worth solving with this system. To this end, the Mountain View, California-based technology giant created a specific problem for the demonstration: generating patterns in a series of random numbers following a predetermined formula. But the most surprising thing, according to the study published by the journal 'Nature' in October 2019, is that the quantum computer took just 200 seconds compared to the 10,000 years taken by the most powerful supercomputer at the time - although this claim was not without controversy.
Now Google, again with Boixo at the helm, is striving for a new milestone: creating a logical qubit and reducing errors, which all experts point to as the next step in the conquest of quantum computing. "Until now, we've worked with physical qubits in superconducting circuits. The problem is that they have errors. To solve this, and although it may sound contradictory, you have to combine many physical qubits so that the redundancy makes error correction more efficient than the errors you add," explains Boixo. In other words, adding physical qubits reduces the errors of the logical qubit they encode. "And this hasn't been demonstrated yet," he points out, explaining that of the two types of errors that afflict a qubit, which make its "life" very short, barely a few microseconds, his team has already managed to correct each one separately and is "close" to finding the formula to correct both at the same time.
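To get an intuition for why piling on redundancy can reduce errors rather than add them, here is a toy classical analogy, a simple majority vote over noisy copies of a bit. It is our own illustration, not Google's scheme: real quantum error correction is far subtler, precisely because, as Boixo notes, there are two kinds of error to tame at once.

```python
# Toy classical analogy of error correction by redundancy: encode one
# "logical" bit as n noisy copies and take a majority vote. Illustrative only.
from math import comb

def logical_error_rate(p, n):
    """Probability that a majority vote over n copies comes out wrong,
    when each copy independently flips with probability p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.01  # 1% chance of error per physical copy
for n in (1, 3, 5, 7):
    print(f"{n} copies -> logical error rate ~ {logical_error_rate(p, n):.1e}")

# 1 copy -> 1.0e-02, 3 -> 3.0e-04, 5 -> 9.9e-06, 7 -> 3.4e-07: errors drop fast.
# But if each copy is too noisy, adding copies makes things worse, which is why
# the physical hardware must first be good enough for the redundancy to pay off.
```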
Once that step is achieved, the next will be to keep adding physical qubits until an error-free logical qubit is reached. "That will mean the prototype will be able to perform operations first for days, then weeks, then months... And so on until we build a prototype that stays on for as long as we want, until we simply flip the switch to turn off the light." The idea is to reach a quantum computer with around 1,000 or 2,000 logical qubits; that is, one or two million physical qubits. "This final phase will take about ten years, although it's hard to say." For the record, Google's experiment used 53 physical qubits.
Quantum ecosystem
"If you develop the concept without involving society, when you show it to them, people won't know how to use it."
IBM, Google's main competitor in this area, has no intention of being left out of the quantum race either. After being highly critical of Google's claim of quantum supremacy (IBM argued that its own Summit supercomputer could do the same job in two and a half days), the company says its goal is not supremacy as such, but to build a 1,000-qubit quantum computer within two years. "Once we achieve that, a lot of possibilities open up," says Antonio Córcoles, a researcher on the IBM Research quantum team in Yorktown Heights.
At the same time, their intention is to create a "quantum ecosystem" in which all the advances in the field can be shared with anyone who might be interested: from scientific organizations to companies. "The idea is to create a community of researchers, educators, engineers, and companies that can work in an environment where access to these machines is possible and where they can learn what kinds of problems our machines can address and what kind of results they can obtain as we all advance." In other words, there must be a practical application. "If you develop the concept without involving society, when you show it to them, people won't know how to use it," Córcoles points out.
Quantum computing applications: What specific uses will it serve?
Artificial intelligence
One of the requirements of artificial intelligence is the ability to analyze large data sets. The volume of information now being generated is often more than traditional computers can handle. Quantum computers would make it possible to analyze and manage far more data in much less time.
Machine Learning
Quantum computers could also enhance machine learning, allowing artificial intelligence programs to search these monstrous data sets for elements related to medical research, consumer behavior, and financial markets, and make sense of them. They might even find patterns that the human mind has been unable to detect.
Biomedical simulations
Currently, drug development involves years of laboratory experiments across the discovery, preclinical, and clinical phases. With the exponentially greater computational power of quantum computing, experts believe it will become possible to simulate the effects of different chemical compounds on organisms at the molecular level. This would allow new drugs to be designed on computers much more quickly and cheaply.
Tailor-made medicine
The field of quantum optics, which studies how matter and radiation interact at the quantum level, could make it possible to control individual molecules through the radiation they emit and absorb, altering, modifying, or even destroying them. The same could be done with cancer cells, destroying them without harming healthy cells.
Optimization
Every process can involve countless variables; a quantum computer could work through an almost limitless number of their permutations and combinations, which could massively advance systems design and analysis. It could be very useful in the logistics sector, for example, or in industry in general.
Finance
Quantum computing could help, for example, fine-tune stock market investments by taking into account many more variables than classical computers currently allow.
Chemical industry
The chemical industry, for example, could use it to identify new catalysts for fertilizers that would help reduce greenhouse gas emissions and improve global food production. Doing so requires modeling molecular interactions that are far too complex for classical computers but well suited to quantum ones.
New materials
One potential application is the development of more efficient superconducting materials, which in turn could lead to faster computers with more memory, high-speed magnetic levitation trains, and more efficient ways of generating electricity.
Revolution or fiasco?
Because the crux of the matter is this: will quantum computing really be a revolution? For José Ignacio Latorre, professor of theoretical physics, current director of the Centre for Quantum Technologies in Singapore, and a world-renowned figure in the field, the answer is a resounding yes. In fact, it's already here. "We are controlling matter at the level of individual electrons, individual photons, individual ions... and we can encode information in these very basic elements of nature and operate on them with quantum logic, which has opened up a universe of opportunities and something of a disruptive technological leap. If we manage to build a bigger quantum computer and apply one of the algorithms we already have at our disposal, specifically Shor's algorithm, that would put all current cybersecurity in jeopardy."
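To give a flavour of why Shor's algorithm worries cryptographers, here is a toy sketch (our own illustration) of its classical skeleton: factoring a number N reduces to finding the period of a^x mod N, and that period-finding step, brute-forced below for tiny numbers, is exactly what a large quantum computer could do exponentially faster for the enormous numbers used in real encryption keys.

```python
# Toy sketch of the number-theoretic core of Shor's algorithm. Illustrative
# only: real RSA moduli have hundreds of digits, and the period-finding step
# is the part a quantum computer would accelerate.
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a**r % n == 1 (the step a quantum computer speeds up)."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    """Derive nontrivial factors of n from the period of a**x mod n, if possible."""
    r = find_period(a, n)
    if r % 2 == 1:
        return None  # need an even period; would retry with another 'a'
    candidate = pow(a, r // 2, n)
    if candidate == n - 1:
        return None  # trivial case; retry with another 'a'
    return gcd(candidate - 1, n), gcd(candidate + 1, n)

print(shor_classical_part(15, 7))  # (3, 5)
print(shor_classical_part(21, 2))  # (7, 3)
```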
Is the hype phenomenon in science a good thing?
A topic becomes trendy and, for a while, everyone talks about it. Thousands of messages compete for a place, trying to capture the public's attention through emotion, almost creating a bond with the viewer or reader. Eventually, though, attention wanes and shifts to another topic. This happens in many fields, including science. And it's a controversial issue: on the one hand, the need to generate content can produce 'hype,' media excitement that exaggerates the subject; on the other, it can bring to society's attention issues that might otherwise go unnoticed. So is this good or bad? To find out, José Ignacio Latorre and his colleague Maite Soto-Sanfiel, who holds a PhD in Audiovisual Communication from the Autonomous University of Barcelona and is a researcher at the Centre for Trusted Internet and Community at the National University of Singapore, are conducting a study to determine what level of hype is positive or negative.
"'Media hype' has already been extensively studied in the context of information in general, but this is the first time it will be associated with scientific content," explains Soto-Sanfiel, who details that the focus will specifically be on quantum technologies; health-related aspects, especially vaccines; and artificial intelligence. Through several experiments, showing different people different types of messages with varying degrees of manipulation, the researchers will try to observe how people's perceptions change in various aspects: from the credibility of sources to emotional well-being and how some messages, when repeated, can influence each person's particular mood. "There are studies that show that a little 'hype' can even be beneficial. The problem is when you go too far. Surely the balance lies in moderation," notes Latorre.
However, Latorre criticizes the fact that all of this has created a breeding ground in which governments, universities, and businesses alike try to justify major investments in the field and trumpet headlines that don't match reality. "This has led to something that never used to happen in science: marketing services are being contracted. And it pains many scientists deeply, because we see small, marginal projects appear in the news," he says. The problem with all this hype is that it can backfire on the scientists themselves: "The injection of money can come back to slap us in the face, because if the promises aren't kept after a while, the world becomes disenchanted, governments become disenchanted, and so do companies."
For García Ripoll, the quantum boom isn't the first he has witnessed. "I've lived through three or four of these revolutions: superconductivity, optical computers, artificial intelligence, and now quantum computing. They follow the Gartner hype cycle and create a more competitive environment in the fight for investment, in which, although I believe researchers are very rigorous, there can be some occasional 'noise.' But the good side of this hype is that it's attracting a lot of talent and, at the same time, fostering a very healthy race between quantum and classical computing. And that's fantastic." Whether a passing fad or not, for the moment quantum physics seems to be, like its very nature, in two places at once: in the subatomic world and in the headlines. Even if we still can't fully control it with our machines.