Qubit Quantum Computing
The Present and Future of Quantum Computing for AI
Quantum computing is still in its infancy, and no universal architecture for quantum computers exists right now. However, prototypes are already here and showing promising results in cryptography, logistics, modelling and optimization tasks. For AI researchers, optimization and sampling are particularly important, because they make it possible to train Machine Learning models much faster and with higher accuracy.
At present, the Canadian company D-Wave is the leader in quantum computing. Their latest machine, the D-Wave 2000Q, contains 2000 qubits operating at a temperature of 0.015 K, just above absolute zero. They are not aiming to build a Universal Quantum Computer in the near future, but there is one thing their devices do quite well: Quantum Annealing.
The D-Wave 2000Q needs only milliseconds to load input data, find a solution and read it out. The whole process can easily be repeated many times to obtain different solutions, and it runs up to thousands of times faster than modern GPU implementations of Simulated Annealing.
Quantum Annealing is well suited to training and sampling from energy-based models like Boltzmann Machines. Notably, Unsupervised Learning is a big challenge for AI researchers, and quantum computing may be the key. However, there are problems with numerical precision: it is hard to handle even half-precision floats, so at the moment most of the work is done with binary variables.
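To see why binary variables are natural here, consider a tiny energy-based model. The sketch below is plain Python with made-up weights and biases (no quantum hardware involved); it enumerates the exact Boltzmann distribution over binary states, which is only feasible because the model is so small:

```python
import itertools
import math

# Toy energy-based model over binary variables s_i in {0, 1}:
#   E(s) = -sum_{i<j} W[i][j] * s_i * s_j  -  sum_i b[i] * s_i
# The couplings W and biases b are illustrative, not from a real model.
W = [[0.0, 1.5, -0.5],
     [0.0, 0.0,  2.0],
     [0.0, 0.0,  0.0]]   # upper-triangular couplings
b = [0.5, -1.0, 0.5]     # biases

def energy(s):
    e = -sum(b[i] * s[i] for i in range(len(s)))
    for i in range(len(s)):
        for j in range(i + 1, len(s)):
            e -= W[i][j] * s[i] * s[j]
    return e

# Exact Boltzmann distribution p(s) = exp(-E(s)) / Z at temperature 1.
states = list(itertools.product([0, 1], repeat=3))
weights = [math.exp(-energy(s)) for s in states]
Z = sum(weights)
probs = {s: w / Z for s, w in zip(states, weights)}

# The lowest-energy configuration is also the most probable sample.
best = min(states, key=energy)
print(best, round(probs[best], 3))
```

For realistically sized models the partition function Z is intractable, which is exactly why hardware that samples low-energy binary configurations directly is attractive.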
D-Wave is planning to create industry-ready hybrid quantum/classical computers for Machine Learning by 2019. Additionally, 1QBit is already developing specialized software for their machines.
How QA Works
Briefly, Quantum Annealing is a method of controlled energy reduction of a quantum system that moves qubits from superposition to a classical, low-energy configuration. The task is encoded as an energy function in the connections between qubits, and during annealing the qubits move toward an optimal configuration.
If the transition is carried out slowly enough, the algorithm finds a ground state (i.e., an optimal solution) with high probability:
During the annealing process, the probability of the qubits ending up in the minimum-energy state increases
Quantum Coupling allows qubits to explore all potential solutions simultaneously, while Quantum Tunneling allows them to move through high energy barriers toward "better" states. These two effects let quantum computers solve many hard optimization problems much faster than classical computers. This video by D-Wave explains QA in more detail:
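Quantum hardware aside, the annealing idea is easiest to grasp through its classical cousin. The sketch below (plain Python, with illustrative QUBO coefficients rather than a real D-Wave problem) runs Simulated Annealing on a small binary optimization problem and checks the answer against brute force:

```python
import itertools
import math
import random

random.seed(0)

# A small QUBO: minimize E(x) = sum_{i<=j} Q[i][j] * x_i * x_j
# over binary vectors x. Coefficients are made up for illustration.
Q = [[-2.0,  1.0,  1.0,  0.0],
     [ 0.0, -3.0,  2.0,  0.0],
     [ 0.0,  0.0, -1.0,  1.5],
     [ 0.0,  0.0,  0.0, -2.0]]
n = len(Q)

def energy(x):
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(i, n))

def simulated_annealing(steps=5000, t_start=2.0, t_end=0.01):
    x = [random.randint(0, 1) for _ in range(n)]
    e = energy(x)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = random.randrange(n)              # propose a single-bit flip
        x[i] ^= 1
        e_new = energy(x)
        # Metropolis rule: always accept improvements,
        # sometimes accept worse states (more often while t is high).
        if e_new <= e or random.random() < math.exp((e - e_new) / t):
            e = e_new
        else:
            x[i] ^= 1                        # reject: undo the flip
    return x, e

# Brute force over all 2^n states confirms the annealer's answer.
ground = min(itertools.product([0, 1], repeat=n), key=energy)
x_sa, e_sa = simulated_annealing()
print(x_sa, e_sa, list(ground), energy(ground))
```

A quantum annealer attacks the same kind of energy landscape, but uses superposition and tunneling instead of thermal fluctuations to escape local minima.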
Another major player is IBM Q. Big Blue is working with gate-model quantum computing, and their machines are Universal Quantum Computers. They have a much broader set of applications, but at the same time they are much harder to control. State-of-the-art processors from IBM have 16 and 17 qubits, and it is really hard to scale further.
The more general architecture of IBM's processors allows them to run arbitrary quantum algorithms. For example, Grover's algorithm can find an input for a black-box function that produces a specified output in only O(√N) evaluations of the function. Not to mention Shor's algorithm for integer factorization, which created a lot of worries about the security of many classical encryption algorithms.
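For intuition, Grover's algorithm can be simulated classically on a handful of qubits. The sketch below is a pure-Python statevector simulation over N = 8 items with an arbitrarily chosen marked index (no real quantum hardware or the O(√N) speedup, of course, since the classical loop touches every amplitude):

```python
import math

# Statevector simulation of Grover's search over N = 2^3 = 8 items.
N = 8
marked = 5  # arbitrary marked index, chosen for illustration

# Start in the uniform superposition: amplitude 1/sqrt(N) on every state.
amp = [1.0 / math.sqrt(N)] * N

# The optimal number of Grover iterations is about (pi/4) * sqrt(N).
iterations = int(round(math.pi / 4 * math.sqrt(N)))  # 2 iterations for N = 8

for _ in range(iterations):
    # Oracle: flip the sign of the marked state's amplitude.
    amp[marked] = -amp[marked]
    # Diffusion operator: reflect every amplitude about the mean.
    mean = sum(amp) / N
    amp = [2 * mean - a for a in amp]

# Measurement probability of each state is the squared amplitude.
probs = [a * a for a in amp]
print(round(probs[marked], 3))
```

After just two iterations the marked state carries roughly 94% of the measurement probability, versus 1/8 for a blind guess.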
By the way, the 16-qubit version is publicly available through the IBM Q Experience program. The reputation of IBM Watson's cognitive services in the AI community is quite bad now; maybe IBM Q will be able to change that.
One more thing at the intersection of the quantum world and AI: the Quantum Neural Network, an inherently stochastic modification of a classical artificial neural network. It is an interesting research direction, but nothing substantial has been accomplished yet, only theoretical research and simulations on toy problems.
Overall, quantum computing looks like a promising direction for stochastic models in Machine Learning. Given the recent progress from D-Wave and IBM, I think we can expect real-world applications of quantum computers in AI by 2020.
Originally published at Cognitive Chaos.