Qianxiao Li

Assistant Professor at National University of Singapore

Schools

  • National University of Singapore

Biography

Qianxiao Li is an assistant professor in the Department of Mathematics, National University of Singapore. He graduated with a BA in mathematics from the University of Cambridge and a PhD in applied mathematics from Princeton University. His research interests include the interplay of machine learning and dynamical systems, stochastic gradient algorithms, and the application of data-driven methods to scientific problems. He is a recipient of the NRF Fellowship, Class of 2021.

Research Areas

  • Machine Learning
  • Deep Learning
  • Numerical Analysis
  • Optimization
  • Control

Education

  • Doctor of Philosophy (PhD), Princeton University (2011–2016)
  • Bachelor of Arts (BA), University of Cambridge (2007–2010)

Research Description

My research is on theoretical machine learning and its connections with numerical analysis, dynamical systems, and optimization/optimal control. I am also interested in developing novel applications of data-driven methods for scientific discovery.
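As a rough illustration of the dynamical-systems viewpoint mentioned above and developed in the stochastic modified equations papers listed below (a sketch in generic notation, not the papers' exact statement): SGD with learning rate $\eta$ iterates

$$x_{k+1} = x_k - \eta \, \nabla f_{\gamma_k}(x_k),$$

where $\gamma_k$ indexes the sampled mini-batch, and these iterates are approximated in the weak sense, to leading order in $\eta$, by the stochastic differential equation

$$dX_t = -\nabla f(X_t)\, dt + \sqrt{\eta \, \Sigma(X_t)}\, dW_t,$$

with $f$ the full objective and $\Sigma$ the covariance of the stochastic gradients. This continuous-time model is what allows tools from dynamical systems and optimal control to be brought to bear on the analysis and design of stochastic gradient algorithms.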

Selected Publications

Li, Qianxiao, and Shuji Hao. “An Optimal Control Approach to Deep Learning and Applications to Discrete-Weight Neural Networks.” In Proceedings of the 35th International Conference on Machine Learning (ICML), 2018.

Li, Qianxiao, Cheng Tai, and Weinan E. “Stochastic Modified Equations and Adaptive Stochastic Gradient Algorithms.” In Proceedings of the 34th International Conference on Machine Learning (ICML), 2017.

Li, Qianxiao, Long Chen, Cheng Tai, and Weinan E. “Maximum Principle Based Algorithms for Deep Learning.” Journal of Machine Learning Research 18, no. 1 (2018): 5998–6026.

Li, Qianxiao, Cheng Tai, and Weinan E. “Stochastic Modified Equations and Dynamics of Stochastic Gradient Algorithms I: Mathematical Foundations.” Journal of Machine Learning Research 20, no. 40 (2019): 1–47.

Cai, Yongqiang, Qianxiao Li, and Zuowei Shen. “A Quantitative Analysis of the Effect of Batch Normalization on Gradient Descent.” In Proceedings of the 36th International Conference on Machine Learning (ICML), 882–890, 2019.
