Zhou Lu(陆洲)


CS PhD student
Computer Science Department
Princeton University
E-mail: zhoul@princeton.edu

About me

I am a second-year PhD student in the Computer Science Department at Princeton University, advised by Prof. Elad Hazan. Before Princeton, I received my B.S. in Statistics from Peking University in 2018, where I was fortunate to work with Prof. Liwei Wang.

Research

My research interests lie in machine learning theory, especially the expressive power of neural networks and online learning. I am particularly fond of constructive proofs for establishing bounds.

Preprints

  1. Zhou Lu, Wenhan Xia, Sanjeev Arora, and Elad Hazan. "Adaptive Gradient Methods with Local Guarantees". [pdf]

  2. Daogao Liu and Zhou Lu. "The Convergence Rate of SGD's Final Iterate: Analysis on Dimension Dependence". [pdf] (Partially resolved a COLT 2020 open problem)

  3. Daogao Liu and Zhou Lu. "Curse of Dimensionality in Unconstrained Private Convex ERM". [pdf]

  4. Qinghua Liu and Zhou Lu. "A Tight Lower Bound for Uniformly Stable Algorithms". [pdf]

Publications

  1. Bohang Zhang, Tianle Cai, Zhou Lu, Di He, and Liwei Wang. "Towards Certifying ℓ∞ Robustness using Neural Networks with ℓ∞-dist Neurons", in ICML 2021. [pdf]

  2. Naman Agarwal, Nataly Brukhim, Elad Hazan, and Zhou Lu. "Boosting for Control of Dynamical Systems", in ICML 2020. [pdf]

  3. Zhou Lu, Hongming Pu, Feicheng Wang, Zhiqiang Hu, and Liwei Wang. "The Expressive Power of Neural Networks: A View from the Width", in NIPS 2017. [pdf]

Notes (Not Intended for Publication)

  1. Zhou Lu. "A Note on the Representation Power of GHHs". [pdf]

  2. Zhou Lu. "A Note on John Simplex with Positive Dilation". [pdf]