# Effect of Google Quantum Supremacy on Data Science

**Prerequisite:** Know About Google’s Quantum Supremacy

In this article, we will discuss the benefits of Google's quantum supremacy claim, the kinds of problems it could help solve, and where data scientists fit into this picture. So let's get started. The first question we need to answer is:

### Why is it so Significant?

**1. Simulating Chemical Processes:** This is the path to creating wonder materials. Take a simple molecule like caffeine: it has roughly 10^48 quantum states, and we cannot fully simulate even such basic molecules with classical computing today. A quantum computer, being a quantum system itself, can simulate another quantum system naturally. This would let us not only understand chemical processes but also manipulate them: designing a material that is, say, light and nearly indestructible at the same time, selecting molecules for organic batteries, or creating drugs that might cure cancer. The limit is largely our imagination.
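To see why classical simulation of molecules breaks down, consider a rough back-of-the-envelope sketch (the qubit counts below are illustrative, not tied to any specific molecule): a quantum system of n qubits needs 2^n complex amplitudes to describe classically, so the memory required doubles with every qubit added.

```python
# Back-of-the-envelope: memory needed to store a full quantum state
# vector on a classical machine. An n-qubit state has 2**n complex
# amplitudes; at 16 bytes per complex number (two 64-bit floats),
# memory doubles with every qubit added.

def state_vector_bytes(n_qubits: int) -> int:
    """Bytes required to hold 2**n complex128 amplitudes."""
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:.2e} GiB")
```

Thirty qubits already demand 16 GiB; fifty qubits would need millions of GiB, which is why a quantum device that *is* the system it simulates sidesteps the problem entirely.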

**2. Making Room-Temperature Superconductors:** Quantum computers rely on superconductors to function, which must be kept at extremely low temperatures (around 15 millikelvin). But if a quantum computer can help us develop a room-temperature superconductor, we all know what that means: you might just be able to give your kid a quantum PC for their birthday.

**3. Solving Some Intractable Problems:** Think of neural networks and the growing space of problems quantum computing could help characterize: financial modeling, or the optimization of routes and logistics such as the travelling salesman problem (TSP), which is considered intractable at scale.
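A minimal sketch makes the intractability concrete: exact TSP by brute force checks every ordering of cities, and the number of orderings grows factorially (the four-city example below is illustrative).

```python
# Brute-force TSP: try every ordering of cities and keep the shortest
# round trip. Fine for a handful of cities, hopeless beyond ~15.
from itertools import permutations
from math import dist, factorial

def tour_length(tour):
    """Total length of a closed tour given as a sequence of 2D points."""
    return sum(dist(a, b) for a, b in zip(tour, tour[1:]))

def brute_force_tsp(cities):
    """Return the shortest round trip through all cities (exhaustive)."""
    start, *rest = cities
    best = min(permutations(rest),
               key=lambda order: tour_length((start, *order, start)))
    return (start, *best, start)

cities = [(0, 0), (0, 1), (2, 1), (2, 0)]
print(brute_force_tsp(cities))   # the perimeter tour of the rectangle
print(factorial(20 - 1))         # orderings to check for just 20 cities
```

With 20 cities there are already about 1.2 * 10^17 orderings, which is why heuristics or entirely new hardware paradigms are needed for realistic logistics problems.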

### Major Problems

Qubits are prone to errors. They decay and lose the information stored on them, and building what is known as a logical qubit, one that stays coherent, requires hundreds or thousands of physical qubits whose errors cancel each other out. A quantum computer capable of cracking encryption would require thousands of such logical qubits. At the time of writing, we had somewhere near 72 physical qubits available to us, and that machine was quite unstable.
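The idea of many noisy physical qubits combining into one reliable logical qubit can be illustrated with a simplified classical analogy (this is a repetition code with majority vote, not an actual quantum error-correcting code, which must also handle phase errors):

```python
# Classical analogy for error correction: encode one logical bit into
# n noisy copies and decode by majority vote. If each copy flips
# independently with probability p < 0.5, the majority vote fails far
# less often than any single copy.
import random

def logical_error_rate(p, n_copies, trials=100_000, seed=0):
    """Estimate the decoding failure rate by Monte Carlo simulation."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(n_copies))
        if flips > n_copies // 2:   # majority corrupted -> decoding fails
            failures += 1
    return failures / trials

p = 0.05
for n in (1, 3, 9):
    print(f"{n} copies: logical error rate ~ {logical_error_rate(p, n):.5f}")
```

With a 5% physical error rate, three copies already cut the logical error rate to under 1%, and nine copies push it near zero; real quantum codes need far more physical qubits per logical qubit because quantum errors are richer than bit flips.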

Another problem is isolating the system. These qubits rely on superconductors, and a superconductor must be maintained at around 15 millikelvin to keep superconducting. Most of the giant apparatus you see in pictures of a quantum computer exists to maintain this temperature and to deliver the microwave pulses that control the qubits, which is a very hard engineering task.

### Where does Machine Learning, AI come into Play?

Quantum devices can be used to accelerate machine learning. Current quantum technology resembles special-purpose hardware like ASICs rather than a general-purpose CPU: it is hard-wired to implement a limited class of quantum algorithms. More advanced devices can be programmed to run simple quantum circuits, much like FPGAs. We know that ASICs and FPGAs offer benefits in machine learning, as do GPUs, CPUs, and TPUs. A quantum processor could therefore be added to this mix of specialized AI hardware, giving machine learning an entirely new tool and helping us advance toward more general AI, just as GPUs fueled the deep learning renaissance around a decade ago.

**Some Tasks that Machine Learning Can Leverage from this Amazing System:**

- **Linear Algebra:** One bottleneck in using a quantum computer as a kind of super-fast linear algebra solver for large matrix multiplications and eigendecompositions (not unlike TPUs) is data encoding: we first have to "load" the large matrix onto the quantum device, a highly non-trivial procedure. Once loaded, however, a quantum gate effectively multiplies the state by an exponentially large matrix in a single operation. A quantum gate can therefore be treated as a highly structured linear layer of a neural network.
- **Optimization:** A hybrid quantum-classical technique based on variational circuits has been proposed. There, a quantum device is used to evaluate a hard-to-compute cost function, while a classical device performs an optimization based on this information.
- **Sampling:** Every quantum computer is fundamentally a sampler: it starts with a simple probability distribution over all possible measurement outcomes, computes a more complicated distribution, and samples an outcome via a measurement. Quantum devices are therefore interesting assistants for sampling-based training, for example with Boltzmann machines. In short, a very promising avenue is to explore how samples from quantum devices can be used to train machine learning models.
- **Kernel Evaluation:** The idea of "quantum kernels" is to use the quantum device only to compute kernels of data points, by estimating the inner product of two very high-dimensional quantum states. The kernel estimates can then be fed into a standard classical kernel method, such as a support vector machine, for training and prediction. Inference and training are done purely classically, augmented by the quantum special-purpose device, which can estimate certain kernels that are difficult to compute classically.
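The optimization bullet above can be sketched concretely. In this toy version the "quantum device" is simulated classically: a single qubit rotated by an angle theta, whose measured expectation value <Z> = cos(theta) stands in for the hard-to-compute cost, while the classical side runs gradient descent using the parameter-shift rule (the specific circuit and learning rate are illustrative choices, not a prescribed setup).

```python
# Minimal hybrid quantum-classical loop behind variational circuits.
import math

def circuit_expectation(theta):
    """Stand-in for running RY(theta)|0> on hardware and measuring <Z>."""
    return math.cos(theta)

def parameter_shift_grad(theta):
    # The parameter-shift rule: the exact gradient from just two extra
    # circuit evaluations shifted by +/- pi/2.
    return 0.5 * (circuit_expectation(theta + math.pi / 2)
                  - circuit_expectation(theta - math.pi / 2))

theta, lr = 0.4, 0.5
for _ in range(60):
    theta -= lr * parameter_shift_grad(theta)   # classical update step

print(round(circuit_expectation(theta), 3))     # converges to the minimum, -1
```

The division of labor is the point: the device only ever evaluates the circuit, and all bookkeeping and parameter updates stay on the classical side.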
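The quantum-kernel bullet can likewise be sketched with the device simulated classically (the single-qubit angle encoding and the sample points are illustrative assumptions): each data point x is encoded into a quantum state, and the kernel is the squared overlap of two such states. A real device would estimate this overlap by sampling; everything downstream stays classical.

```python
# Toy "quantum kernel": angle-encode scalars into single-qubit states
# and use the squared state overlap (fidelity) as the kernel value.
import math

def encode(x):
    """Angle-encode a scalar as single-qubit amplitudes (cos, sin)."""
    return (math.cos(x / 2), math.sin(x / 2))

def quantum_kernel(x1, x2):
    a, b = encode(x1), encode(x2)
    overlap = a[0] * b[0] + a[1] * b[1]   # <phi(x1)|phi(x2)>
    return overlap ** 2                    # squared fidelity

xs = [0.1, 0.2, 3.0, 3.1]                  # two loose clusters
gram = [[quantum_kernel(a, b) for b in xs] for a in xs]

for row in gram:                           # Gram matrix for an SVM etc.
    print([round(v, 2) for v in row])
```

The resulting Gram matrix has a clear block structure separating the two clusters, and that matrix is exactly what a classical kernel method such as a support vector machine would consume as a precomputed kernel.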

This is what Sundar Pichai had to say about AI and quantum computing working together, in an exclusive interview with MIT Technology Review: "I think it'll be a very powerful symbiotic thing. Both fields are in the early phases. There is exciting work in AI in terms of building larger models, more generalizable models, and what kind of computing resources you need to get there. We think AI can accelerate quantum computing and quantum computing can accelerate AI. And collectively, we think it's what we would need to, down the line, solve some of the most intractable problems we face, like climate change."

**Notions:**

- People say that blockchain will be destroyed and its security will become a myth. That is not completely true. Some widely used algorithms will indeed become crackable, but there are quantum-resistant (post-quantum) algorithms that can encrypt data in a way even a quantum computer cannot break. So you are safe as long as there are people implementing those algorithms.
- There is also the notion that quantum supremacy is the beginning of the end because we will easily be able to develop a general-purpose AI. In reality, we are years away from building general AI even if we assume every problem facing quantum computers gets solved; the area needs research in its own right. A quantum computer may accelerate things considerably, but general-purpose AI is still decades away.

**The Possibilities?**

A company seeking the ideal route for retail deliveries could split the problem into two parts, leveraging a quantum and a classical computer each for its strengths. As data scientists, we can study these algorithms and contribute more of our own. The possibilities are virtually endless.

*TL;DR* Quantum computing should be reserved for the problems that genuinely need it: the big machine tackles only the hardest problems, while the trivial ones are handled by classical computing methods. In other words, quantum computers will never replace classical computers but will work alongside them, making our lives better by solving problems considered intractable until now.

