Optimization over the Probability Space via KL Divergence Gradient Descent: Theory and Applications in Statistics

Fri, 25 October 2024, 2:00pm - 3:00pm

Speaker: Rentian Yao, The University of British Columbia

Title: Optimization over the Probability Space via KL Divergence Gradient Descent: Theory and Applications in Statistics

Abstract: 

Optimization over probability spaces is fundamental to computational statistics. While recent approaches primarily focus on numerically implementing the Wasserstein gradient flow, the convergence of such algorithms relies heavily on displacement convexity, a condition that may not always be met in practice. Without this condition, the algorithms may require an exponential number of iterations to converge in worst-case scenarios. In this talk, I will present an alternative framework based on discretizing the Kullback-Leibler (KL) divergence gradient flow. This approach requires only standard convexity of the objective functional to ensure convergence. I will demonstrate the framework's utility through two applications. First, I will present the computation of the nonparametric maximum likelihood estimator using a novel implicit KL proximal descent algorithm, implemented via normalizing flows. Second, I will address the more complex problem of trajectory inference using a coordinate KL divergence gradient descent algorithm, which achieves computational tractability through particle methods. Both algorithms are proven to converge to global minima within a polynomial number of iterations. If time permits, I will discuss the statistical properties of trajectory inference, providing practical insights for the design of biological experiments in this context.
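A rough sketch of the kind of update the abstract refers to (not taken from the talk itself): an implicit KL proximal descent step can be written as a proximal-point scheme in which the transport distance of the JKO scheme is replaced by the KL divergence. Here the objective functional F, step size tau, and iterate mu_k are illustrative symbols, not notation from the speaker's work:

\[
\mu_{k+1} \in \operatorname*{arg\,min}_{\mu} \; \Big\{ F(\mu) + \tfrac{1}{\tau}\, \mathrm{KL}(\mu \,\|\, \mu_k) \Big\},
\]

so each iterate stays close to the previous one in KL divergence rather than in Wasserstein distance, which is why ordinary convexity of F (rather than displacement convexity) is the natural assumption for convergence.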
 

Where
Duques Hall, School of Business, 2201 G Street NW, Washington, DC 20052
Room: 152

Admission
Open to everyone.
