
Why should you be interested in quantum-inspired technologies?

Quantum-inspired technologies have started to show impressive capabilities in recent applications. It is gradually becoming clear that these novel tools have real potential to forge a new, different, and more powerful kind of AI.

This article is part of a series of blog posts introducing the nascent discipline of Quantum-inspired Machine Learning (QiML). In this post, the tenets, advantages, and early results of this new approach are explained at a level that should be accessible to readers without a deep technical background.

The dawn of Quantum-inspired Machine Learning

The microprocessor is one of the most fascinating inventions in digital computing. This device is at the heart of the digital revolution, which has radically transformed the way we think and function as humans and as a society. Nowadays, every smart device relies on one or more microprocessors; for example, a microprocessor serves as the central processing unit (CPU) of a computer. The more transistors one can pack into a chip, the more powerful the chip becomes, and the simplest way to pack in more is to shrink the transistors themselves. Quantum-inspired approaches eventually emerged because this decades-long strategy for improving CPUs began to run into fundamental difficulties. Our journey starts from this point.

The first microprocessor, manufactured by Intel, contained 2,300 transistors at a density of 188 transistors per square millimeter (see Fig. 1). In 1971, this was a staggering achievement. Decades later, in 2019, the same company (along with others) was producing far more advanced microprocessors packing over 100 million transistors into a square millimeter.

The primary strategy employed over the decades to achieve such advancements has been the reduction of transistor sizes. For instance, the typical channel length went from 10 micrometers (10 μm) in 1971 to a few nanometers today, roughly three orders of magnitude smaller. It has now become clear that (1) quantum effects emerging at these extremely small scales, which cause energy loss and unpredictable transistor behavior, and (2) the immense cost of shrinking transistors any further will eventually hinder the development of better chips [1]. This is also one of the reasons why microprocessor manufacturers no longer increase the clock speed of their devices (a higher clock speed means more operations can be executed in a given period, translating to faster processing): at such small scales, quantum effects and power leakage worsen the heat generation and energy efficiency of the chips. Instead, manufacturers add more cores.

Scientists and engineers started looking for alternatives to transistor scaling long ago. For instance, physicists have searched for alternative semiconductor materials, although no viable material suitable for wide industrial adoption has yet been identified. Others have proposed the introduction of quantum bits (or qubits) and quantum gates to harness quantum effects and potentially achieve unprecedented computational capabilities [2]. While a traditional bit in a computer represents data as either zero or one, a qubit can store information in what is called “superposition”: thanks to a unique property of quantum mechanics, it can represent zero and one simultaneously until it is measured. In essence, a qubit can carry a much richer type of information than a standard bit.
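
To make the idea of superposition a bit more concrete, here is a minimal sketch (in Python with NumPy) of how a single qubit state is represented on an ordinary computer as a pair of complex amplitudes. This is a standard textbook construction, not code from any particular quantum framework:

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit is described by two complex
# amplitudes (a, b) with |a|^2 + |b|^2 = 1, i.e., the state a|0> + b|1>.
ket0 = np.array([1.0, 0.0], dtype=complex)  # the state |0>

# An equal superposition of |0> and |1> (a Hadamard gate applied to |0>).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measurement probabilities follow the Born rule: |amplitude|^2.
p0, p1 = np.abs(psi) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # P(0) = 0.50, P(1) = 0.50

# Sampling a measurement collapses the superposition to a definite bit.
print("measured:", np.random.choice([0, 1], p=[p0, p1]))
```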

Theoretically, this might sound like a promising avenue but, in practice, it has proven (alas, many times) to be affected by serious experimental issues. A critical challenge is quantum decoherence. When decoherence occurs, qubits lose the ability to maintain superposition and entanglement, the key features that enable quantum computers to process complex computations more efficiently than classical computers. This disruption forces qubits to behave like ordinary bits, erasing their quantum advantage. This challenge, among others, is a significant obstacle to the practical development and operation of quantum computers [3].
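
For readers who want a concrete picture of decoherence, the toy model below uses a standard textbook idealization (pure dephasing), in which the off-diagonal entries of a qubit’s density matrix, the part that encodes superposition, decay exponentially. The decay rate is an arbitrary illustrative value, not a measurement from any real device:

```python
import numpy as np

# Density matrix of an equal superposition: rho = |+><+|.
# Its off-diagonal entries ("coherences") are what make the state quantum.
plus = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

gamma = 0.5  # illustrative dephasing rate (arbitrary units)
for t in [0.0, 1.0, 5.0]:
    decayed = rho.copy()
    # Pure dephasing: populations stay put, coherences decay as exp(-gamma*t).
    decayed[0, 1] *= np.exp(-gamma * t)
    decayed[1, 0] *= np.exp(-gamma * t)
    print(f"t = {t}: off-diagonal term = {decayed[0, 1].real:.3f}")

# As t grows, the state approaches a classical 50/50 mixture: the qubit
# behaves like an ordinary random bit, exactly the loss described above.
```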

Despite these and other challenges, companies such as IBM continue to explore the quantum computing arena in the hope of obtaining massive computational advantages. IBM, in particular, has managed to put more than a thousand qubits on a single chip, an experimental step in the right direction. Unfortunately, these qubits remain affected by external noise (e.g., thermal fluctuations, electromagnetic radiation), decoherence, and other problems [3,4,5]. In other words, practical quantum computing could still be decades away. It is therefore not surprising that many computational fields are seeking more practical, near-term alternatives. In this post, we focus on machine learning (ML) and the various quantum-inspired tools that can be used right away to improve the field.


Fig. 1: The Intel 4004 microprocessor (early versions were ceramic chips). This 4-bit microprocessor was released in 1971 and was based on a 10-micrometer node technology. It contained 2,300 transistors.

The philosophy behind Quantum-inspired Machine Learning

At first sight, the main strategy behind QiML seems similar to the one behind quantum computing: leverage the unique behaviors of quantum systems to achieve computational advantages, such as faster and more accurate computations and improved memory utilization. There is, however, a substantial difference between the two approaches: quantum computing uses actual physical quantum systems, whereas QiML focuses on digitally simulated quantum systems, hence the term quantum-inspired. Not every quantum effect or system can be simulated efficiently on a digital machine, so a major challenge in QiML lies in identifying which effects can be simulated efficiently enough to yield a computational advantage.
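
A quick back-of-the-envelope calculation shows why brute-force simulation is not an option: a full state vector of n qubits requires 2^n complex amplitudes. Assuming 16 bytes per amplitude (double-precision complex numbers), the memory cost explodes:

```python
# Memory needed to store a full n-qubit state vector on a classical machine,
# assuming one complex128 amplitude (16 bytes) per basis state.
for n in [10, 20, 30, 40, 50]:
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n} qubits: {amplitudes:>16,d} amplitudes ~ {gib:,.1f} GiB")

# 30 qubits already take ~16 GiB of RAM; 50 qubits would need ~16 million GiB.
# Quantum-inspired methods therefore target effects that admit compact
# classical representations, rather than brute-force state vectors.
```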

This alternative strategy readily brings advantages that are hard to imagine given the current state of quantum computing. Decoherence, for instance, is no longer an issue, since the quantum systems are simulated rather than physically implemented. Moreover, the approach requires no specialized quantum hardware, such as the cryogenic cooling systems needed to keep quantum processors operational, because everything runs on standard digital computers. Even though QiML is a relatively new field, preliminary studies have already shown that it holds significant value: quantum-inspired algorithms can perform well in practice, provided certain conditions hold [6].

An example of existing quantum-inspired technologies

A quantum-inspired ML method recently developed at Ericsson has already shown improvements in the training phase of an ML model [7], addressing a numerical problem with direct practical implications. The method starts from the observation that gradient descent (the mainstream method used to train neural networks) can be interpreted physically. From this interpretation, one obtains a new set of quantum-inspired update rules for optimizing the parameters of an ML model. These equations describe coupled one-dimensional electrons, so training a model becomes equivalent to the computationally affordable simulation of a simple quantum system. Because the system evolves according to the laws of (quantum) physics, reducing its energy directly minimizes the specified loss function. (A toy sketch of this flavor of optimization follows the list below.)

With this approach, several advantages are readily obtained:

  1. The space of solutions is explored by exploiting a simulated tunneling effect, which allows fast convergence during the training phase of an ML model,
  2. No gradient needs to be computed, which avoids the problem of vanishing or exploding gradients,
  3. It is possible to reach a level of parallelization that is hardly imaginable with current training methods, and
  4. No specialized quantum hardware is required; off-the-shelf computers can be used right away.
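
To give a rough feel for gradient-free, physics-inspired optimization, the sketch below treats the model parameters as the position of a particle and the loss as a potential energy, with occasional long random jumps standing in for the simulated tunneling of point 1. To be clear, this is not the method of [7], whose actual update rules describe coupled one-dimensional electrons; the toy loss function and every constant here are illustrative assumptions:

```python
import numpy as np

def loss(w):
    # Toy non-convex "potential energy" with many local minima.
    return np.sum(w ** 2) + 2.0 * np.sum(np.cos(3.0 * w))

rng = np.random.default_rng(0)
w = rng.normal(size=4)  # "particle position" = the model parameters
best_w, best_e = w.copy(), loss(w)

for step in range(2000):
    # Usually take a small local step; occasionally take a long jump that
    # can cross an energy barrier (a crude stand-in for tunneling).
    scale = 2.0 if rng.random() < 0.05 else 0.1
    candidate = w + scale * rng.normal(size=w.shape)
    if loss(candidate) < loss(w):  # accept moves that lower the "energy";
        w = candidate              # note: no gradient is ever computed
    if loss(w) < best_e:
        best_w, best_e = w.copy(), loss(w)

print(f"best loss found: {best_e:.3f}")
```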

Taken together, these advantages show that this approach can pave the way towards a new paradigm for managing and solving scientific and industrial problems.

What’s next?

These are exciting times in the field of QiML. We have only started to realize how quantum-inspired technologies can alleviate the typical issues currently affecting ML. The method described in [7], for instance, has advantages that were hardly reachable before. First, being a gradient-free approach, it allows the training of sophisticated artificial neural networks that are challenging to train with today's mainstream methods. Second, it runs without specialized hardware and avoids the need for multiple training sessions, consequently reducing energy consumption. Third, because it does not represent a drastic departure from mainstream ML development practices, it can be used immediately in the development pipelines practitioners already rely on.

It is becoming clear that quantum-inspired technologies are going to play a prominent role not only in telecommunications but also in other technical fields at large. While practical quantum computing may still be decades away, QiML offers a robust and viable technology today.

I will continue to write about QiML with more exciting results! Stay tuned!

  1. S. Datta, "Quantum Transport: Atom to Transistor," Cambridge University Press, 2nd edition, 2005.
  2. M.A. Nielsen, I.L. Chuang, "Quantum Computation and Quantum Information," Cambridge University Press, 2010.
  3. M. Dyakonov, "The case against quantum computing," IEEE Spectrum, 15 Nov. 2018.
  4. Scientific American, last visited 09 Feb. 2024.
  5. A. Katwala, "Quantum computing has a noise problem," Wired UK, last visited 09 Feb. 2024.
  6. J.M. Arrazola et al., "Quantum-inspired algorithms in practice," Quantum, Vol. 4, p. 307, 2020.
  7. J.M. Sellier, "On a quantum-inspired approach to train machine learning models," Applied AI Letters, Vol. 4, Issue 4, 2023.