*Machine Learning (ML)* aims to systematically devise algorithms that allow machines to infer the input-output relationship of an unknown function from historical data. The key elements in ML are an input space *X*, an output space *Y*, and a collection of functions, the function class **F**. The learning machine seeks to select a function f : X → Y from **F** that approximates the unknown function, using a training data set {(X_i, Y_i)} whose samples are drawn independently and identically from some measure μ on X. The two main branches of ML are: (i) *sample complexity*, which quantifies the "effective size" of the function class; in other words, sample complexity determines the information-theoretic efficiency of learning an element of **F**. (ii) *Computational complexity*, which measures the efficiency, e.g., the running time, of a learning algorithm.
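The setup above can be sketched in a few lines of code. This is a minimal illustration only, with all specific choices (the target function, the sampling measure, and a polynomial function class) assumed for the example: the learner sees i.i.d. samples and selects the empirical-risk minimizer within **F**.

```python
import numpy as np

# Illustrative ML setup (all concrete choices are assumptions for
# this sketch, not taken from the text above):
#   - unknown target function f_true : X -> Y
#   - inputs X_i drawn i.i.d. from a measure mu on X (uniform here)
#   - function class F = polynomials of degree <= 5
rng = np.random.default_rng(0)

def f_true(x):
    # The "unknown" input-output relationship the learner must infer.
    return np.sin(x)

X = rng.uniform(-np.pi, np.pi, 200)   # i.i.d. samples from mu
Y = f_true(X)                         # training labels

# Least-squares fitting selects the empirical-risk minimizer in F.
coeffs = np.polyfit(X, Y, deg=5)
f_hat = np.poly1d(coeffs)

mse = np.mean((f_hat(X) - Y) ** 2)    # empirical (training) error
```

Sample complexity asks how large the training set must be for `f_hat` to generalize; computational complexity asks how efficiently the minimizer can be found.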

In the area of quantum machine learning, my research focuses on the following three tasks (see Figure 1).

**Task 1. Determine the optimal size of the training data:** This task requires developing a powerful matrix concentration inequality to bound the size of a training data set generated by unknown matrix-valued functions. Its outcome will be of independent interest in random matrix theory, probability, and statistics.

**Task 2. Design efficient quantum learning algorithms:** Quantum computing is a brand-new framework of computation. As a result, we need to identify learning problems that it can solve efficiently, and invent efficient quantum learning algorithms for these problems.

**Task 3. Identify quantum advantages:** This task aims to identify how two quantum resources, entanglement and superposition, can yield improvements in quantum learning settings.

The first two tasks are natural extensions of their classical counterparts, while the third aims to understand how quantum mechanics can facilitate machine learning.
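To make the matrix concentration idea behind Task 1 concrete, the following sketch numerically evaluates the standard matrix Bernstein tail bound for a sum of independent, mean-zero, bounded random Hermitian matrices. All parameters (dimension, sample count, the Rademacher-weighted construction) are illustrative assumptions, not the specific inequality developed in the research.

```python
import numpy as np

# Matrix Bernstein (Tropp): for independent mean-zero Hermitian Z_i
# with ||Z_i|| <= R, and variance proxy v = ||sum_i E[Z_i^2]||,
#   P(||sum_i Z_i|| >= t) <= 2d * exp(-t^2 / (2*(v + R*t/3)))
rng = np.random.default_rng(1)
d, n = 8, 500

def random_unit_hermitian():
    # Fixed Hermitian A with spectral norm 1.
    A = rng.standard_normal((d, d))
    A = (A + A.T) / 2
    return A / np.linalg.norm(A, 2)

As = [random_unit_hermitian() for _ in range(n)]
eps = rng.choice([-1.0, 1.0], size=n)       # Rademacher signs
S = sum(e * A for e, A in zip(eps, As))     # Z_i = eps_i * A_i, mean zero

v = np.linalg.norm(sum(A @ A for A in As), 2)  # since E[Z_i^2] = A_i^2
R = 1.0
t = np.linalg.norm(S, 2)                    # observed deviation

tail = 2 * d * np.exp(-t**2 / (2 * (v + R * t / 3)))
print(f"||S|| = {t:.2f}, Bernstein tail at observed t: {tail:.3f}")
```

The bound is dimension-dependent (the factor 2d), which is exactly why sharper concentration inequalities for matrix-valued functions translate into smaller required training sets.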

The following are my research outcomes in this topic.

**Papers**

- __Kaining Zhang__, **Min-Hsiu Hsieh**, Liu Liu, Dacheng Tao. Efficient State Read-out for Quantum Machine Learning Algorithms. __arXiv:2004.06421__ (2020).
- __Yuxuan Du__, **Min-Hsiu Hsieh**, Tongliang Liu, Dacheng Tao, Nana Liu. Quantum noise protects quantum classifiers against adversaries. __arXiv:2003.09416__ (2020).
- __Xi He__, Chufan Lyu, **Min-Hsiu Hsieh**, Xiaoting Wang. Quantum transfer component analysis for domain adaptation. __arXiv:1912.09113__ (2019).
- __Kaining Zhang__, **Min-Hsiu Hsieh**, Liu Liu, Dacheng Tao. Quantum algorithm for finding the negative curvature direction in non-convex optimization. __arXiv:1909.07622__ (2019).
- __Sathyawageeswar Subramanian__, **Min-Hsiu Hsieh**. Quantum algorithm for estimating Renyi entropies of quantum states. __arXiv:1908.05251__ (2019).
- __Yuxuan Du__, **Min-Hsiu Hsieh**, Tongliang Liu, Dacheng Tao. A Quantum-inspired Algorithm for General Minimum Conical Hull Problems. __arXiv:1907.06814__ (2019).
- __Yuxuan Du__, **Min-Hsiu Hsieh**, Dacheng Tao. Efficient Online Quantum Generative Adversarial Learning Algorithms with Applications. __arXiv:1904.09602__ (2019).
- __Ximing Wang__, __Yuechi Ma__, **Min-Hsiu Hsieh**, Manhong Yung. Quantum Speedup in Adaptive Boosting of Binary Classification. __arXiv:1902.00869__ (2019).
- __Yuxuan Du__, **Min-Hsiu Hsieh**, Tongliang Liu, Dacheng Tao. The Expressive Power of Parameterized Quantum Circuits. Accepted in *Physical Review Research* on May 18, 2020 [arXiv:1810.11922].
- __Yuxuan Du__, **Min-Hsiu Hsieh**, Tongliang Liu, Dacheng Tao. Implementable Quantum Classifier for Nonlinear Data. __arXiv:1809.06056__ (2018).
- __Hao-Chung Cheng__, **Min-Hsiu Hsieh**, Ping-Cheng Yeh. The learnability of unknown quantum measurements. *Quantum Information & Computation*, vol. 16, no. 7&8, pp. 615–656 (2016).
