Zhou Lu (陆洲)
About me
I am a second-year PhD student in the CS Department at Princeton University, advised by Prof. Elad Hazan. Before Princeton, I received my B.S. in Statistics from Peking University in 2018, where I was fortunate to work with Prof. Liwei Wang.
Research
My research interests lie in machine learning theory, especially the expressivity of neural networks and online learning. I am particularly fond of constructive proofs for establishing bounds.
Preprints
Zhou Lu, Wenhan Xia, Sanjeev Arora, and Elad Hazan. "Adaptive Gradient Methods with Local Guarantees". [pdf]
Daogao Liu and Zhou Lu. "The Convergence Rate of SGD's Final Iterate: Analysis on Dimension Dependence". [pdf] (Partially resolved a COLT 2020 open problem)
Daogao Liu and Zhou Lu. "Curse of Dimensionality in Unconstrained Private Convex ERM". [pdf]
Qinghua Liu and Zhou Lu. "A Tight Lower Bound for Uniformly Stable Algorithms". [pdf]
Publications
Bohang Zhang, Tianle Cai, Zhou Lu, Di He, and Liwei Wang. "Towards Certifying ℓ∞ Robustness using Neural Networks with ℓ∞-dist Neurons", in ICML 2021. [pdf]
Naman Agarwal, Nataly Brukhim, Elad Hazan, and Zhou Lu. "Boosting for Control of Dynamical Systems", in ICML 2020. [pdf]
Zhou Lu, Hongming Pu, Feicheng Wang, Zhiqiang Hu, and Liwei Wang. "The Expressive Power of Neural Networks: A View from the Width", in NIPS 2017. [pdf]
Notes (Not Intended for Publication)
Zhou Lu. "A Note on the Representation Power of GHHs". [pdf]
Zhou Lu. "A Note on John Simplex with Positive Dilation". [pdf]