I am looking for self-motivated students who are interested in data science, machine learning, and signal processing. If you are interested in joining my lab, please send me your CV, transcripts, and any other materials that demonstrate your background.
Our group works on theoretical and computational aspects of data science and machine learning, with a focus on developing efficient methods for extracting useful information from large-scale and high-dimensional data, proving their correctness, and applying them to solve real-world problems. A few current projects include:
Machine learning: representation learning; generalization properties and implicit regularization; model (neural network) compression; deep neural networks for unsupervised learning and inverse problems;
Quantum information: statistical and algorithmic aspects of quantum information;
Optimization: nonconvex geometric analysis; distributed optimization; the design, analysis, and implementation of large-scale optimization algorithms for engineering problems.
News:
[Sep 2020] Our paper, which characterizes the implicit bias of discrepant learning rates and builds connections between over-parameterization, RPCA, and deep neural networks, has been accepted at NeurIPS as a spotlight (top 4%).
[Jun 2020] Two papers on over-parameterization are on arXiv: one studies the benefit of over-realized models in dictionary learning; the other characterizes the implicit bias of discrepant learning rates and builds connections between over-parameterization, RPCA, and deep neural networks.
[Jan 2020] Our two-session mini-symposium "Recent Advances in Optimization Methods for Signal Processing and Machine Learning," co-organized with Qing and Shuyang, has been accepted at the inaugural SIAM Conference on Mathematics of Data Science. See you in Cincinnati, Ohio, in May!
[Nov 2019] Our paper (with Xiao, Shixiang, Zengde, Qing, and Anthony) "Nonsmooth Optimization over Stiefel Manifold: Riemannian Subgradient Methods" is on arXiv. This work provides the first explicit convergence rate guarantees for a family of Riemannian subgradient methods used to optimize nonsmooth functions (that are weakly convex in the Euclidean space) over the Stiefel manifold.
[Oct 2019] Attended the Computational Imaging workshop at the IMA, University of Minnesota, and presented our work "A Linearly Convergent Method for Non-smooth Non-convex Optimization on Grassmannian with Applications to Robust Subspace and Dictionary Learning".
[Aug 2019] Our paper (with Xiao, Anthony, and Jason) "Incremental Methods for Weakly Convex Optimization" is on arXiv. This work provides the first convergence guarantees for incremental algorithms and their random shuffling versions (including the incremental subgradient method, which is the workhorse of deep learning) for solving weakly convex optimization problems, which can be both nonconvex and nonsmooth.