July 24, 2023

Quantum Computing and AI: A Leap Forward or a Distant Dream?

Yuval Boger

Artificial Intelligence (AI) has made significant strides in recent years, with tools and algorithms that can analyze data, recognize patterns, and make predictions with an accuracy that was unimaginable just a few decades ago. However, the question arises: Are these tools good enough, or do we need to look towards more advanced technologies like quantum computing?

The Case for Existing AI Tools

AI tools have proven their worth across various sectors, from healthcare and finance to transportation and entertainment. Machine learning algorithms can process vast amounts of data, learning and improving over time. Deep learning, a subset of machine learning, has enabled the development of neural networks that can recognize patterns and make decisions with a high degree of accuracy. These tools have been successful in solving complex problems and are continually improving.

Moreover, these AI tools are accessible and practical. They operate on classical computers, which are widely available and relatively affordable. They can be deployed in real-world applications today, providing immediate benefits to businesses and society. Given how capable these models already are, one could argue that help from quantum computers is simply not needed.

The Quantum Leap: Potential and Challenges

Quantum computing, on the other hand, is often touted as the next big thing in AI. Quantum computers can, in principle, process a vast number of possibilities simultaneously. This could speed up AI algorithms and allow them to handle larger datasets more efficiently, leading to more powerful AI models.

A recent Boston Consulting Group study identified a market potential of $50B to $100B in quantum opportunities across generative, foundation, and horizontal AI, impacting practically all industries. According to BCG, additional multi-billion-dollar opportunities exist in fraud and money-laundering prevention, as well as in automotive AI algorithms.

However, quantum computing is still in its infancy. Today’s quantum computers have a limited number of qubits, and maintaining their quantum state, known as coherence, is a significant challenge, limiting the complexity of the computations that can be performed.

Moreover, quantum computers are not just an upgrade to classical computers; they require entirely new algorithms. For instance, classical machine learning models, such as neural networks, are trained by adjusting parameters (weights and biases) based on the input data, aiming to minimize the difference between the model’s predictions and the actual output. Sophisticated models have millions or billions of parameters, tuned by a process called gradient descent, which determines the direction in which each parameter should change to shrink that difference (a minimal classical sketch appears below). However, measuring or estimating gradients on a quantum computer is exceptionally difficult. Thus, running a classical algorithm unchanged on a quantum computer is a recipe for failure, and new algorithms are required. Developing these algorithms is a complex task that, while promising, is still in its early stages. For instance, a type of machine learning algorithm called “reservoir computing” appears to leverage unique quantum properties to achieve good results in both classification and prediction applications.
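
To make the classical baseline concrete, here is a minimal gradient-descent sketch in Python with NumPy. The one-parameter linear model, synthetic data, and learning rate are illustrative choices, not anything from the article; the point is only to show parameters being nudged in the direction that reduces prediction error.

```python
import numpy as np

# Minimal sketch: fit a one-parameter linear model y = w * x by gradient
# descent on mean squared error. Data and hyperparameters are illustrative.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 3.0 * x + rng.normal(scale=0.1, size=100)  # "true" slope is 3.0

w = 0.0    # the single trainable parameter (a weight)
lr = 0.1   # learning rate

for step in range(200):
    pred = w * x                           # model prediction
    grad = 2.0 * np.mean((pred - y) * x)   # d(MSE)/dw
    w -= lr * grad                         # step against the gradient

print(f"learned weight: {w:.3f}")          # converges near 3.0
```

On quantum hardware, it is the analogue of that gradient computation that becomes expensive: estimating how the output shifts with each parameter requires repeated circuit executions and measurements, which is part of why classical training recipes do not carry over directly.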

Quantum Computing and Generative Models

One area where quantum computers excel today is generating randomness. In classical computers, random numbers are generated by algorithms or drawn from some external source of randomness (like atmospheric noise), but algorithmically generated numbers are not truly random: if you know the algorithm and its initial conditions (the seed), you can predict every number it will produce. In contrast, thanks to a core principle of quantum mechanics, superposition, quantum computers can generate truly random numbers. Superposition means that a quantum bit can exist in multiple states at once, and when measured, the outcome is inherently random.
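
The determinism of classical pseudo-randomness is easy to demonstrate. In this short Python sketch (a generic illustration, not tied to any particular application), two generators seeded with the same value emit identical “random” sequences:

```python
import random

# Two pseudo-random generators with the same seed are fully predictable:
# they produce exactly the same sequence of "random" numbers.
a = random.Random(42)
b = random.Random(42)

seq_a = [a.randint(0, 9) for _ in range(8)]
seq_b = [b.randint(0, 9) for _ in range(8)]

print(seq_a)
print(seq_b)
print(seq_a == seq_b)  # True -- knowing the seed means knowing the output
```

A qubit prepared in an equal superposition and then measured has no seed to discover; each outcome is irreducibly probabilistic.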

Generative modeling, an unsupervised machine learning scheme, can benefit from this randomness. Quantum computers can create statistical correlations that are otherwise very difficult to replicate, making them well suited to this application. Such generative models can be applied to numerous problems, such as portfolio optimization, where the model learns to replicate high-performing portfolios and then proposes new candidates, leading to portfolios with much lower risk than those discovered by classical algorithms. Similar uses have been suggested for molecular generation in drug discovery and even for factory floor scheduling.
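
As a rough classical stand-in for that replicate-and-sample workflow, the sketch below fits a simple Gaussian model to a synthetic set of “high-performing” portfolio weightings and samples new candidates from it. Everything here is hypothetical and illustrative; a quantum generative model would replace the plain Gaussian with a distribution whose correlations are hard to capture classically.

```python
import numpy as np

# Toy illustration of generative portfolio modeling: learn the statistical
# profile of portfolios an optimizer liked, then sample new candidates.
rng = np.random.default_rng(1)

# Pretend these are asset-weight vectors of 50 high-performing portfolios.
good = rng.dirichlet(alpha=[5.0, 3.0, 2.0, 1.0], size=50)

mean = good.mean(axis=0)
cov = np.cov(good, rowvar=False)

# Sample fresh portfolios sharing the learned correlations, then clip and
# renormalize so weights are non-negative and sum to 1.
cand = rng.multivariate_normal(mean, cov, size=10)
cand = np.clip(cand, 0.0, None)
cand /= cand.sum(axis=1, keepdims=True)
print(cand.round(3))
```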

The Future of Quantum Computing and AI

Despite these early challenges, the potential of quantum computing for AI is immense. Quantum machine learning could classify larger datasets in less time, and quantum neural networks could process information in ways that classical neural networks cannot.

While existing AI tools are powerful and practical for many applications today, quantum computing represents a new frontier with the potential to significantly advance the field. However, the road to practical quantum computing is long and filled with challenges. It will likely be some time before quantum computers are powerful and mature enough for widespread use in AI. Until then, the focus could be on maximizing the capabilities of existing AI tools while continuing to explore the exciting possibilities that quantum computing offers.

About the author: Yuval Boger is the Chief Marketing Officer at QuEra, a company working to commercialize quantum computing. In his career, Boger has served as CEO and CMO of frontier-tech companies in markets including quantum computing software, wireless power, and virtual reality. His “Superposition Guy’s Podcast” hosts CEOs and other thought leaders in quantum computing, quantum sensing, and quantum communications to discuss business and technical aspects that impact the quantum ecosystem.
