EVENT CANCELLED 

Event Details

CSIP Seminar: Normalizing flow neural networks by JKO scheme

Time: Friday, Jan 19th, 3:00 PM 

Location: In-person, CSIP Library (Room 5126), 5th floor, Centergy One

Zoom link: https://gatech.zoom.us/j/99964329744

Speaker: Chen Xu

 

Normalizing flow neural networks by JKO scheme

Abstract

Normalizing flow is a class of deep generative models for efficient sampling and likelihood estimation, which achieves attractive performance, particularly in high dimensions. The flow is often implemented using a sequence of invertible residual blocks. Existing works adopt special network architectures and regularization of flow trajectories. In this paper, we develop a neural ODE flow network called JKO-iFlow, inspired by the Jordan-Kinderlehrer-Otto (JKO) scheme, which unfolds the discrete-time dynamics of the Wasserstein gradient flow. The proposed method stacks residual blocks one after another, allowing efficient block-wise training that avoids sampling SDE trajectories, score matching, and variational learning, thus reducing memory load and the difficulty of end-to-end training. We also develop an adaptive time reparameterization of the flow network with a progressive refinement of the induced trajectory in probability space to further improve model accuracy. Experiments with synthetic and real data show that the proposed JKO-iFlow network achieves competitive performance compared with existing flow and diffusion models at a significantly reduced computational and memory cost.
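To give a flavor of the block-wise training idea, below is a minimal, illustrative sketch, not the authors' code. It assumes PyTorch and 2-D toy data, and trains each residual block T(x) = x + h*v(x) on a JKO-style objective: a Wasserstein movement penalty plus the free energy of KL against a standard Gaussian. The log-determinant term uses a first-order approximation h*tr(dv/dx) with a Hutchinson trace estimator, a simplification of the more careful ODE-based computation in the paper; all function names here are hypothetical.

```python
# Illustrative sketch of block-wise JKO training (assumptions noted above).
# Per-block objective (up to constants independent of T):
#   E[ ||T(x) - x||^2 / (2h) + ||T(x)||^2 / 2 - log|det dT/dx| ]
import torch
import torch.nn as nn

def make_block(dim=2, width=64):
    # A small velocity field v(x); the block itself is x + h * v(x).
    return nn.Sequential(nn.Linear(dim, width), nn.Tanh(), nn.Linear(width, dim))

def hutchinson_div(v, x, n_probes=1):
    # Unbiased estimate of tr(dv/dx) via E[e^T (dv/dx) e], e ~ Rademacher.
    out = v(x)
    div = 0.0
    for _ in range(n_probes):
        e = torch.randint(0, 2, x.shape, device=x.device).float() * 2 - 1
        (vjp,) = torch.autograd.grad(out, x, grad_outputs=e,
                                     create_graph=True, retain_graph=True)
        div = div + (vjp * e).sum(dim=1)
    return div / n_probes

def train_block(block, x_in, h=0.5, steps=500, lr=1e-3):
    # Block-wise training: earlier blocks stay frozen, so only one block's
    # activations and parameters are in memory at a time.
    opt = torch.optim.Adam(block.parameters(), lr=lr)
    for _ in range(steps):
        x = x_in.clone().requires_grad_(True)
        y = x + h * block(x)                           # residual (Euler) step
        move = ((y - x) ** 2).sum(dim=1) / (2 * h)     # W2 movement penalty
        potential = (y ** 2).sum(dim=1) / 2            # -log N(0, I) + const.
        logdet = h * hutchinson_div(block, x)          # first-order log|det|
        loss = (move + potential - logdet).mean()
        opt.zero_grad(); loss.backward(); opt.step()
    return block

if __name__ == "__main__":
    torch.manual_seed(0)
    x = torch.randn(2048, 2) * 0.5 + torch.tensor([3.0, 0.0])  # shifted toy data
    blocks, h = [], 0.5
    for k in range(4):
        b = train_block(make_block(), x, h=h)
        with torch.no_grad():
            x = x + h * b(x)      # push samples forward; train the next block on them
        blocks.append(b)
        print(f"block {k}: mean sample norm {x.norm(dim=1).mean():.3f}")
```

The key design point the sketch illustrates is that each block is fit against the pushforward of the data through the already-trained blocks, so no end-to-end backpropagation through the full stack is needed.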

 

Bio 

Chen Xu is currently a fourth-year Operations Research PhD student in ISyE, supervised by Prof. Yao Xie. His current research interests are twofold: (1) uncertainty quantification for machine learning models, specifically advancing conformal prediction as a distribution-free method for arbitrarily complex deep models, especially in the context of time-series modeling; and (2) generative modeling through flow-based neural networks, specifically developing scalable computational tools for problems at the intersection of statistics and optimization, including extensions to high-dimensional optimal transport, distributionally robust optimization, and differential privacy. His work has appeared in top machine learning conferences (e.g., ICML 2021 oral, NeurIPS 2023 spotlight) and journals (e.g., IEEE TPAMI 2023, IEEE JSAIT).