Quantum computers are becoming available, which raises the question: what are we going to use them for? Machine learning is a promising candidate. In this course we will introduce several quantum machine learning algorithms and implement them in Python.
The pace of development in quantum computing mirrors the rapid advances made in machine learning and artificial intelligence. It is natural to ask whether quantum technologies could boost learning algorithms: this field of inquiry is called quantum-enhanced machine learning. The goal of this course is to show what benefits current and future quantum technologies can provide to machine learning, focusing on algorithms that are challenging with classical digital computers. We put a strong emphasis on implementing the protocols, using open source frameworks in Python. Prominent researchers in the field will give guest lectures to provide extra depth to each major topic. These guest lecturers include Alán Aspuru-Guzik, Seth Lloyd, Roger Melko, and Maria Schuld.

In particular, we will address the following objectives:

1) Understand the basics of quantum states as a generalization of classical probability distributions, their evolution in closed and open systems, and measurements as a form of sampling. Describe elementary classical and quantum many-body systems.

2) Contrast quantum computing paradigms and implementations. Recognize the limitations of current and near-future quantum technologies and the kinds of tasks where they outperform, or are expected to outperform, classical computers. Explain variational circuits.

3) Describe and implement classical-quantum hybrid learning algorithms. Encode classical information in quantum systems. Perform discrete optimization in ensembles and unsupervised machine learning with different quantum computing paradigms. Sample quantum states for probabilistic models. Experiment with unusual kernel functions on quantum computers.

4) Demonstrate coherent quantum machine learning protocols and estimate their resource requirements. Summarize the quantum Fourier transform, quantum phase estimation, and quantum matrix inversion, and implement these algorithms. Implement general linear algebra subroutines with quantum algorithms. Implement Gaussian processes on a quantum computer.

Minimal Python sketches of a few of these ideas follow.
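As a taste of the first objective, here is a minimal sketch, using plain NumPy only (the variable names are ours, not course material), of how a qubit state generalizes a classical probability distribution and how measurement reduces to sampling from the squared amplitudes.

```python
# Sketch: a qubit state is a complex amplitude vector; measurement samples
# outcomes with probabilities given by the squared amplitudes (Born rule).
import numpy as np

rng = np.random.default_rng(seed=42)

# A classical probability distribution over two outcomes: non-negative, sums to 1.
p_classical = np.array([0.5, 0.5])
classical_samples = rng.choice([0, 1], size=1000, p=p_classical)

# A quantum state is a complex amplitude vector with unit 2-norm.
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Closed-system evolution is unitary; a Hadamard gate interferes the amplitudes.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ psi  # amplitudes cancel: the state is now exactly |0>

# Measurement in the computational basis samples from the squared amplitudes.
quantum_samples = rng.choice([0, 1], size=1000, p=np.abs(psi) ** 2)

print("classical frequencies:", np.bincount(classical_samples, minlength=2) / 1000)
print("quantum frequencies:  ", np.bincount(quantum_samples, minlength=2) / 1000)
```

The classical samples come out roughly 50/50, while the interfered quantum state yields outcome 0 with certainty, something no classical probability distribution over the same two outcomes can reproduce by stochastic mixing alone.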
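For the variational circuits and hybrid learning algorithms mentioned in objectives 2 and 3, a minimal simulated example might look as follows. This is a plain-NumPy sketch of the general pattern, not code from the course; frameworks such as Qiskit or PennyLane wrap the same classical-quantum loop around real devices.

```python
# Sketch of a hybrid loop: a one-parameter circuit RY(theta)|0>, an expectation
# value <Z> evaluated by the "quantum" part, and a classical optimizer updating theta.
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)

def ry(theta):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def expectation_z(theta):
    """Prepare RY(theta)|0> and return <psi|Z|psi> (the 'quantum' evaluation)."""
    psi = ry(theta) @ np.array([1, 0], dtype=complex)
    return float(np.real(psi.conj() @ Z @ psi))

# Classical optimization loop; the gradient uses the parameter-shift rule.
theta, lr = 0.1, 0.4
for step in range(60):
    grad = 0.5 * (expectation_z(theta + np.pi / 2) - expectation_z(theta - np.pi / 2))
    theta -= lr * grad

print(f"theta ~ {theta:.3f}, <Z> = {expectation_z(theta):.3f}")  # <Z> approaches -1 near theta = pi
```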
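Objective 4 mentions the quantum Fourier transform. The sketch below, again an illustrative NumPy construction rather than course code, builds the QFT matrix explicitly and checks that it is unitary and agrees with the normalized inverse discrete Fourier transform up to convention.

```python
# Sketch: the QFT on n qubits is the 2^n x 2^n unitary with entries omega^(jk)/sqrt(N).
import numpy as np

def qft_matrix(n_qubits):
    """Return the 2^n x 2^n QFT matrix with entries omega^(jk) / sqrt(N)."""
    N = 2 ** n_qubits
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    omega = np.exp(2j * np.pi / N)
    return omega ** (j * k) / np.sqrt(N)

F = qft_matrix(3)  # 8 x 8 unitary acting on 3 qubits

# Unitarity: F^dagger F = identity.
assert np.allclose(F.conj().T @ F, np.eye(8))

# Agreement with NumPy's DFT convention (opposite exponent sign, 1/sqrt(N) normalization).
x = np.arange(8, dtype=complex)
assert np.allclose(F @ x, np.fft.ifft(x, norm="ortho"))
print("QFT matrix is unitary and matches the normalized inverse DFT")
```

On a quantum computer the same transform is implemented with a number of gates polynomial in the number of qubits, rather than by storing this exponentially large matrix, which is what makes quantum phase estimation and matrix inversion interesting in the first place.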
Quantum Machine Learning

Course Topic: Quantum Machine Learning
University, College, Institution:
Course Language:
Place of class: Online, self-paced (see curriculum for more information)
Degree: Certificate