Quantum Optimisation and Machine Learning
Optimisation problems occur frequently in real-world settings, for example in logistics, where the most efficient route between a set of locations must be found, and they are a likely application of quantum computing technology. NQIT is expanding its scope to include the solution of optimisation problems, to better position us as a source of information on practical applications of quantum information technology and as a trusted authority on quantum technologies in general.
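As a toy illustration only (the locations and distances below are invented, and this is not NQIT code), the following sketch finds the shortest closed route through four locations by exhaustive search; the number of candidate routes grows factorially with the number of locations, which is what makes larger instances hard.

```python
# Toy route-optimisation sketch: brute-force search over all closed tours.
from itertools import permutations

# Invented pairwise distances between four locations.
distances = {
    ("A", "B"): 4, ("A", "C"): 7, ("A", "D"): 3,
    ("B", "C"): 2, ("B", "D"): 5, ("C", "D"): 6,
}

def leg(a, b):
    """Distance between two locations (the table stores each pair once)."""
    return distances[(a, b)] if (a, b) in distances else distances[(b, a)]

def route_length(route):
    """Total length of a closed tour visiting every location once."""
    return sum(leg(route[i], route[(i + 1) % len(route)]) for i in range(len(route)))

# Exhaustive search is fine for 4 locations, but the number of routes
# grows factorially, which is why large routing problems are hard.
best = min(permutations(["A", "B", "C", "D"]), key=route_length)
print(" -> ".join(best), "length:", route_length(best))
```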
Machine learning is one approach to solving complex problems that recent developments suggest can benefit from quantum technology. To investigate quantum applications to machine learning more thoroughly, we have recruited Dr Michael Gutmann, Senior Lecturer in Machine Learning at the University of Edinburgh and an expert in classical machine learning, who will lead the new work package on Quantum Optimisation and Machine Learning. Michael provides the background and expertise needed to clarify the objectives that quantum technologies should target in machine learning, while others within NQIT provide the required expertise in quantum technologies.
This new work on machine learning will broadly explore whether quantum computers can make machine learning systems more powerful. It fits into our existing quantum computing work in three main ways:
- Looking at whether certain quantum architectures are better than others for machine learning applications (WP0)
- Looking at potential applications of quantum machine learning to digital quantum simulation, and whether they could improve how we simulate the behaviour of other physical systems (WP7)
- Using quantum machine learning as an early application of our work on quantum/classical interfacing, since quantum machine learning will initially combine small quantum devices with large conventional computers (WP8); a sketch of such a hybrid loop follows this list
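To give a flavour of this interfacing, here is a minimal sketch of a hybrid loop in which a classical optimiser repeatedly queries a small quantum device and adjusts a circuit parameter. Everything in it is an illustrative assumption: the `device_expectation` function is a one-line classical stand-in for the device (the expectation value of a single rotated qubit), not any particular NQIT hardware or algorithm.

```python
# Minimal hybrid quantum/classical loop (the "device" is simulated classically).
import math

def device_expectation(theta):
    """Stand-in for a small quantum device: the expectation value <Z>
    measured after an RY(theta) rotation of a single qubit is cos(theta)."""
    return math.cos(theta)

# Classical outer loop: adjust the circuit parameter to minimise the measured
# expectation value, using a finite-difference gradient built from device queries.
theta, lr = 0.1, 0.2
for _ in range(100):
    grad = (device_expectation(theta + 1e-4) - device_expectation(theta - 1e-4)) / 2e-4
    theta -= lr * grad

# The loop drives theta towards pi, where <Z> reaches its minimum of -1.
print(f"theta = {theta:.4f}, <Z> = {device_expectation(theta):.4f}")
```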
Machine learning is often subject to a cost-quality or time-quality trade-off. Certain learning principles yield provably accurate results but are computationally infeasible: even on modern computer clusters the wait for a result is too long, so in practice we settle for an approximate but computationally feasible solution. Our hope is that quantum computers may allow us to strike a better trade-off than conventional computers.
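The toy sketch below, with invented weights and sample counts, illustrates that trade-off: computing a simple model's normalising constant exactly means summing over every configuration, a cost that doubles with each added variable, whereas a Monte Carlo estimate is cheap but only approximate.

```python
# Exact but exponential-cost computation versus a cheap approximate estimate.
import itertools, math, random

n = 12                                   # number of binary variables (toy choice)
weights = [0.3 * i for i in range(n)]    # arbitrary toy parameters

def unnorm_prob(x):
    """Unnormalised probability of a binary configuration x."""
    return math.exp(sum(w for w, xi in zip(weights, x) if xi))

# Exact normalising constant: sums over all 2**n configurations,
# so the cost doubles every time a variable is added.
Z_exact = sum(unnorm_prob(x) for x in itertools.product([0, 1], repeat=n))

# Approximate estimate: average over a few thousand uniform random samples.
samples = 5000
Z_approx = (2 ** n) * sum(
    unnorm_prob([random.randint(0, 1) for _ in range(n)]) for _ in range(samples)
) / samples

print(f"exact Z = {Z_exact:.3g}, Monte Carlo estimate = {Z_approx:.3g}")
```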
One area where current machine learning faces limits is uncertainty quantification: we can use machine learning to make predictions from data, but these predictions are never clear-cut. A variety of different predictions may be consistent with the data, and in many settings, such as a medical context, it is important to know how probable each of them is.
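As a minimal sketch with made-up data (7 successes in 10 trials), the code below contrasts a single best-guess success rate with the posterior distribution over rates, showing that a wide range of values remains consistent with only a handful of observations.

```python
# Uncertainty quantification on toy data: a best guess versus a credible range.
successes, trials = 7, 10                    # toy observed data
grid = [i / 1000 for i in range(1, 1000)]    # candidate success rates

# Unnormalised posterior under a flat prior (Bernoulli likelihood).
post = [p ** successes * (1 - p) ** (trials - successes) for p in grid]
total = sum(post)
post = [w / total for w in post]

# The single most probable rate ...
best = grid[post.index(max(post))]

# ... versus the range of rates that are also well supported by the data
# (the central 90% of the posterior mass).
cdf, lo, hi = 0.0, None, None
for p, w in zip(grid, post):
    cdf += w
    if lo is None and cdf >= 0.05:
        lo = p
    if hi is None and cdf >= 0.95:
        hi = p

print(f"best estimate {best:.2f}, but 90% of posterior mass lies in [{lo:.2f}, {hi:.2f}]")
```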