Here's a link to my Google Scholar page.
Ph.D. Thesis
Saddle Avoidance, Asymptotic Normality, and Exponential Acceleration in Nonsmooth Optimization
Ph.D. Thesis, Cornell University, 2024 [pdf]
Preprints
Gradient descent with adaptive stepsize converges (nearly) linearly under fourth-order growth
Damek Davis*, Dmitriy Drusvyatskiy*, Liwei Jiang*
preprint [arxiv]
A validation approach to over-parameterized matrix and image recovery
Lijun Ding, Zhen Qin, Liwei Jiang, Jinxin Zhou, Zhihui Zhu
preprint [arxiv]
Journal Publications
Active manifolds, stratifications, and convergence to local minima in nonsmooth optimization
Damek Davis*, Dmitriy Drusvyatskiy*, Liwei Jiang*
Foundations of Computational Mathematics, to appear [arxiv]
Asymptotic normality and optimality in nonsmooth stochastic approximation
Damek Davis*, Dmitriy Drusvyatskiy*, Liwei Jiang*
The Annals of Statistics, 2024 [arxiv] [journal]
INFORMS Optimization Society Student Paper Prize, Second Place [link]
A local nearly linearly convergent first-order method for nonsmooth functions with quadratic growth
Damek Davis*, Liwei Jiang*
Foundations of Computational Mathematics, 2024 [arxiv] [journal]
Algorithmic regularization in model-free overparametrized asymmetric matrix factorization
Liwei Jiang, Yudong Chen, Lijun Ding
SIAM Journal on Mathematics of Data Science (SIMODS), 2023 [arxiv] [journal]
On the translates of general dyadic systems on R
Theresa C. Anderson*, Bingyang Hu*, Liwei Jiang*, Connor Olson*, Zeyu Wei*
Conference Publications
Rank overspecified robust matrix recovery: subgradient method and exact recovery
Lijun Ding*, Liwei Jiang*, Yudong Chen, Qing Qu, Zhihui Zhu
Conference on Neural Information Processing Systems (NeurIPS), 2021 [arxiv] [conference]
* indicates alphabetical author ordering.