Jaemoo Choi
Postdoc @ Georgia Tech
Hi! I’m a postdoctoral researcher at Georgia Tech, working in the FLAIR Lab with Prof. Yongxin Chen. I am very fortunate to be part of his group and to work with a supportive and inspiring community of researchers. I also have the pleasure of actively collaborating with Prof. Joonseok Lee and his research group. Prior to that, I received my Ph.D. in Mathematical Sciences and B.S. in Mathematics Education from Seoul National University.
My research focuses on fundamental algorithms for generative AI (GenAI) and their applications across vision, language, and scientific domains (e.g., molecular generation). I am particularly interested in (discrete) diffusion models, flow-based methods, and large language models (LLMs) and their applications, as well as their connections to control and dynamical systems. I am currently seeking full-time opportunities.
Contact:
jchoi843 [at] gatech [dot] edu
Follow:
Google Scholar | jaemoo-choi
Updates
- [02/2026] New preprints on fine-tuning diffusion models (Rethinking RL4DM) and LLMs (QUATRO) have been released!
- [02/2026] New preprints on discrete diffusion models, DASBS and GSBoG, have been released!
- [02/2026] A new preprint on training and distillation of diffusion models, titled ASBM, has been released!
- [01/2026] PDNS is accepted to ICLR.
- [09/2025] Adjoint Schrödinger Bridge Sampler (ASBS) is accepted to NeurIPS as an Oral (top 0.3%).
- [09/2025] Three papers, ASBS, NAAS, and MDNS, are accepted to NeurIPS.
- [05/2025] Two papers, OTP and UOT-UPC, are accepted to ICML.
- [01/2025] Two papers, U-NOTB and DIOTM, are accepted to ICLR.
- [09/2024] I have joined FLAIR at Georgia Tech as a Postdoc.
- [05/2024] Scalable Wasserstein Gradient Flow for Generative Modeling through Unbalanced Optimal Transport is accepted to ICML.
- [01/2024] Analyzing and Improving Optimal-Transport-based Adversarial Networks is accepted to ICLR.
- [09/2023] Unbalanced Optimal Transport Model (UOTM) is accepted to NeurIPS.
Selected Publications
- Rethinking the Design Space of Reinforcement Learning for Diffusion Models: On the Importance of Likelihood Estimation Beyond Loss Design. Preprint, 2026.
- Efficient Generative Modeling beyond Memoryless Diffusion via Adjoint Schrödinger Bridge Matching. Preprint, 2026.
- Proximal Diffusion Neural Sampler. International Conference on Learning Representations (ICLR), 2026.
- Adjoint Schrödinger Bridge Sampler. Advances in Neural Information Processing Systems (NeurIPS), 2025. [Oral, top 0.3%]
- Non-equilibrium Annealed Adjoint Sampler. Advances in Neural Information Processing Systems (NeurIPS), 2025.
- Overcoming Spurious Solutions in Semi-Dual Neural Optimal Transport: A Smoothing Approach for Learning the Optimal Transport Plan. International Conference on Machine Learning (ICML), 2025.
- Improving Neural Optimal Transport via Displacement Interpolation. International Conference on Learning Representations (ICLR), 2025.