Tianyu Wang

I'm an assistant professor at the Shanghai Center for Mathematical Sciences (SCMS), Fudan University. Before joining Fudan, I obtained my doctoral degree from Duke University and my bachelor's degree from the Hong Kong University of Science and Technology (HKUST).

Email: wangtianyu AT fudan DOT edu DOT cn

Research interests (on a very coarse grid; I only work on a small subset of each of the following): applied harmonic analysis, machine learning, mathematical statistics, operations research, and optimization.

Research interests (on a fine grid): I mainly work on bandit learning problems in metric spaces and on zeroth-order optimization problems. My research also intersects with compressed sensing and smooth convex optimization algorithms, and touches upon random matrix theory.

Students: I am fortunate to work with Yasong Feng, Yu Liu, Yunlu Shu, and Zicheng Wang.

My publication list is here.

Some Recent Papers/Preprints

Zeroth-order Low-rank Hessian Estimation via Matrix Recovery, Preprint

Tianyu Wang, Zicheng Wang, Jiajia Yu

The Anytime Convergence of Stochastic Gradient Descent with Momentum: From a Continuous-Time Perspective, Preprint

Yasong Feng, Yifang Jiang, Tianyu Wang, Zhiliang Ying

A Lipschitz Bandits Approach for Continuous Hyperparameter Optimization, Preprint

Yasong Feng, Weijian Luo, Yimin Huang, Tianyu Wang

Convergence Rates of Zeroth-order Gradient Descent for Łojasiewicz Functions, Accepted to INFORMS Journal on Computing

Tianyu Wang and Yasong Feng

Yasong Feng and Tianyu Wang, “Stochastic zeroth-order gradient and Hessian estimators: variance reduction and refined bias bounds,” Information and Inference: A Journal of the IMA, vol. 12, no. 3, 2023.
Tianyu Wang, “On sharp stochastic zeroth-order Hessian estimators over Riemannian manifolds,” Information and Inference: A Journal of the IMA, vol. 12, no. 2, pp. 787–813, 2023.
Yasong Feng, Zengfeng Huang, and Tianyu Wang, “Lipschitz bandits with batched feedback,” in Advances in Neural Information Processing Systems, 2022, pp. 19836–19848.

-- Spotlight presentation; the long version has been accepted to IEEE Transactions on Information Theory.