Machine Learning (ML) aims to systematically devise algorithms that allow machines to infer the input-output relationship of an unknown function from historical data. The key elements in ML are an input space X, an output space Y, and a collection of functions, denoted as the function class F. The learning machine seeks to select a function f : X → Y from F that approximates the unknown function, given a training data set {(X_i, Y_i)} whose inputs are drawn independently and identically from some measure μ on X. Two central concerns in learning theory are: (i) sample complexity, which quantifies the "effective size" of the function class, i.e., the information-theoretic efficiency of learning an element of F; and (ii) computational complexity, which measures the efficiency, e.g., the running time, of a learning algorithm.
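To make this concrete, a minimal formalization of the setup (standard statistical learning theory; the loss ℓ and target function g below are generic symbols, not taken from any of the papers listed later) reads
\[
R(f) \;=\; \mathbb{E}_{X \sim \mu}\bigl[\ell\bigl(f(X), g(X)\bigr)\bigr],
\qquad
\hat{R}_n(f) \;=\; \frac{1}{n}\sum_{i=1}^{n} \ell\bigl(f(X_i), Y_i\bigr),
\qquad
\hat{f}_n \;=\; \operatorname*{arg\,min}_{f \in F} \hat{R}_n(f),
\]
where g is the unknown target function with Y_i = g(X_i). The sample complexity is then the smallest n for which R(\hat{f}_n) - \inf_{f \in F} R(f) \le \epsilon holds with probability at least 1 - \delta over the draw of the training set.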
In the area of quantum machine learning, my research focuses on the following three tasks (see Figure 1).
Task 1. Determine the optimal size of the training data: This task requires developing a powerful matrix concentration inequality to bound the size of a training data set generated by unknown matrix-valued functions; an illustrative bound of this kind is given below the task list. Its outcome will be of independent interest in random matrix theory, probability, and statistics.
Task 2. Design efficient quantum learning algorithms: Quantum computing is a fundamentally new framework of computation. As a result, we need to identify learning problems that it can solve efficiently and invent efficient quantum learning algorithms for these problems.
Task 3. Identify quantum advantages: This task aims to identify how two quantum resources, entanglement and superposition, can yield improvements in quantum learning settings.
The first two tasks are natural extensions of their classical counterparts, while the third aims to understand how quantum mechanics can facilitate machine learning.
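As an illustration of the type of tool Task 1 builds on (and not a result claimed here), the standard matrix Bernstein inequality bounds the deviation of a sum of independent random matrices: if A_1, …, A_n are independent d × d Hermitian random matrices with E[A_k] = 0, ||A_k|| ≤ L almost surely, and σ² = ||∑_k E[A_k²]||, then
\[
\Pr\!\left[\, \Bigl\| \sum_{k=1}^{n} A_k \Bigr\| \ge t \right]
\;\le\; 2d \,\exp\!\left( \frac{-t^2/2}{\sigma^2 + Lt/3} \right),
\]
where ||·|| denotes the operator norm. Task 1 asks for stronger or more specialized bounds of this kind, adapted to training data generated by unknown matrix-valued functions.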
The following are my research outcomes on this topic.
Papers
Kaining Zhang, Min-Hsiu Hsieh, Liu Liu, Dacheng Tao. Efficient State Read-out for Quantum Machine Learning Algorithms. arXiv:2004.06421 (2020).
Yuxuan Du, Min-Hsiu Hsieh, Tongliang Liu, Dacheng Tao, Nana Liu. Quantum noise protects quantum classifiers against adversaries. arXiv:2003.09416 (2020).
Xi He, Chufan Lyu, Min-Hsiu Hsieh, Xiaoting Wang. Quantum transfer component analysis for domain adaptation. arXiv:1912.09113 (2019).
Kaining Zhang, Min-Hsiu Hsieh, Liu Liu, Dacheng Tao. Quantum algorithm for finding the negative curvature direction in non-convex optimization. arXiv:1909.07622 (2019).
Sathyawageeswar Subramanian, Min-Hsiu Hsieh. Quantum algorithm for estimating Renyi entropies of quantum states. arXiv:1908.05251 (2019).
Yuxuan Du, Min-Hsiu Hsieh, Tongliang Liu, Dacheng Tao. A Quantum-inspired Algorithm for General Minimum Conical Hull Problems. arXiv:1907.06814 (2019).
Yuxuan Du, Min-Hsiu Hsieh, Dacheng Tao. Efficient Online Quantum Generative Adversarial Learning Algorithms with Applications. arXiv:1904.09602 (2019).
Ximing Wang, Yuechi Ma, Min-Hsiu Hsieh, Manhong Yung. Quantum Speedup in Adaptive Boosting of Binary Classification. arXiv:1902.00869 (2019).
Yuxuan Du, Min-Hsiu Hsieh, Tongliang Liu, Dacheng Tao. The Expressive Power of Parameterized Quantum Circuits. Accepted in Physical Review Research on May 18, 2020 [arXiv:1810.11922].
Yuxuan Du, Min-Hsiu Hsieh, Tongliang Liu, Dacheng Tao. Implementable Quantum Classifier for Nonlinear Data. arXiv:1809.06056 (2018).
Hao-Chung Cheng, Min-Hsiu Hsieh, Ping-Cheng Yeh. The learnability of unknown quantum measurements. Quantum Information & Computation, vol. 16, no. 7&8, pp. 615–656 (2016).