
Zhihui Zhu
Assistant Professor
Computer Science and Engineering
The Ohio State University


583 Dreese Lab
2015 Neil Avenue
Columbus, OH 43210


Email: zhu.3440@osu.edu


We are seeking Ph.D. students in Machine Learning (particularly analyses and practical techniques for deep learning, generative models, and LLMs), Signal Processing, and Quantum Information and Computing. We welcome candidates from any background, e.g., EECS, math, or physics.

Call for papers: Conference on the Mathematical Theory of Deep Neural Networks, November 14-15, 2024, Philadelphia

Call for papers: Conference on Parsimony and Learning (CPAL), March 2025, Stanford

News:

  • [May 2024] Thrilled to receive the ORAU Ralph E. Powe Junior Faculty Enhancement Award.

  • [May 2024] Two papers accepted to ICML 2024.

  • [Jun 2023] Our collaborative proposal (with Jere at JHU and Qing at UMich) on Deep Neural Collapse has been awarded by NSF!

  • [Jan 2023] Co-organized and gave a tutorial at the 3rd SLowDNN Workshop at MBZUAI, Abu Dhabi.

  • [Dec 2022] Received a CQISE Partnership Seed Award (PSA) and will collaborate with Brian Kirby (ARL) on quantum networks.

  • [Dec 2022] Invited to serve as an area chair at ICML 2023.

  • [Nov 2022] Elected to serve on the Machine Learning for Signal Processing Technical Committee (MLSP TC) of the IEEE Signal Processing Society.

  • [Aug 2022] Our group joined the Department of Computer Science and Engineering at The Ohio State University.

  • [May 2022] On May 26-27, together with Jere's group, we had our annual Deep & Sparse Team meeting at the University of Denver to discuss project accomplishments and plans.

  • [May 2021] Invited to serve as a TPC member (area chair) at NeurIPS 2022.

  • [May 2021] One ICML’21 paper on robust subspace learning accepted. New paper released on understanding the behavior of the classifiers in modern deep neural networks.

  • [May 2021] Our collaborative proposal (with Mike and Gongguo at CSM) on Structured Inference and Adaptive Measurement Design has been awarded by NSF!

  • [Mar 2021] Invited to serve as a TPC member (area chair) at NeurIPS 2021.

  • [Sep 2020] Our paper characterizing implicit bias with discrepant learning rates and building connections between over-parameterization, RPCA, and deep neural networks has been accepted at NeurIPS as a spotlight (top 4%).

  • [Jun 2020] Our proposal (with Jere at JHU) "Collaborative Research: CIF: Small: Deep Sparse Models: Analysis and Algorithms" has been awarded by NSF!

  • [Jun 2020] Two papers about over-parameterization are on arXiv: one studies the benefit of over-realized models in dictionary learning, and the other characterizes implicit bias with discrepant learning rates and builds connections between over-parameterization, RPCA, and deep neural networks.

  • [Feb 2020] Our paper on robust homography estimation has been accepted to CVPR 2020.

  • [Jan 2020] Co-organized with Qing and Shuyang, our two-session mini-symposium "Recent Advances in Optimization Methods for Signal Processing and Machine Learning" has been accepted by the inaugural SIAM Conference on Mathematics of Data Science. See you in Cincinnati, Ohio, in May!

  • [Jan 2020] Invited talk at Colorado School of Mines.

  • [Nov 2019] Our paper (with Xiao, Shixiang, Zengde, Qing, and Anthony) "Nonsmooth Optimization over Stiefel Manifold: Riemannian Subgradient Methods" is on arXiv. This work provides the first explicit convergence rate guarantees for a family of Riemannian subgradient methods used to optimize nonsmooth functions (that are weakly convex in the Euclidean space) over the Stiefel manifold; a generic update of this kind is sketched at the end of this list.

  • [Oct 2019] Attended the Northrop Grumman University Research Symposium and presented our work on "Object Identification with Less Supervision".

  • [Oct 2019] Attended the Computational Imaging workshop at IMA, University of Minnesota, and presented our work on "A Linearly Convergent Method for Non-smooth Non-convex Optimization on Grassmannian with Applications to Robust Subspace and Dictionary Learning".

  • [Aug 2019] Our paper (with Xiao, Anthony, and Jason) "Incremental Methods for Weakly Convex Optimization" is on arXiv. This work provides the first convergence guarantees for incremental algorithms and their random-shuffling versions (including the incremental subgradient method, a workhorse of deep learning) for solving weakly convex optimization problems, which can be nonconvex and nonsmooth; a brief sketch follows below.
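
For the Aug 2019 item, here is a minimal sketch of one incremental pass, in illustrative notation (not necessarily the paper's), for an objective $f(x) = \sum_{i=1}^n f_i(x)$:

\[
x_{k,i} = x_{k,i-1} - \gamma_k\, g_{k,i}, \qquad g_{k,i} \in \partial f_{\pi_k(i)}(x_{k,i-1}), \qquad i = 1, \dots, n,
\]

where $x_{k,0} = x_{k-1,n}$, $\gamma_k$ is the step size, and $\pi_k$ is either the fixed cyclic order (incremental) or a random shuffling of $\{1, \dots, n\}$.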
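
Similarly, for the Nov 2019 item, a generic Riemannian subgradient step of the kind analyzed there can be sketched, again in illustrative notation, as

\[
x_{k+1} = \mathrm{Retr}_{x_k}\!\left(-\gamma_k\, \mathrm{Proj}_{T_{x_k}\mathcal{M}}(g_k)\right), \qquad g_k \in \partial f(x_k),
\]

where $\mathcal{M}$ is the Stiefel manifold, $\mathrm{Retr}$ is a retraction, $\mathrm{Proj}_{T_{x_k}\mathcal{M}}$ projects onto the tangent space at $x_k$, and $\partial f$ is the Euclidean subdifferential of the weakly convex objective.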