I am a postdoctoral researcher in the Empirical Inference Department at the Max Planck Institute for Intelligent Systems, working with Bernhard Schölkopf. I obtained my Ph.D. in Computer Science from Tsinghua University in December 2020, advised by Jun Zhu and Bo Zhang. My research interests are in data-efficient machine learning. I have worked on topics including semi-supervised learning, deep generative models, and approximate inference. I received my B.E. degree from the Department of Electronic Engineering at Tsinghua University in 2015. I also received a Bachelor's degree in Economics from the National School of Development at Peking University.

I was a research scientist intern at DeepMind in 2021, working with Silvia Chiappa on conformal inference of individual treatment effects. In 2019, I was a visiting student at Princeton University, working with Ryan P. Adams. I also spent time at RIKEN-AIP in Tokyo and Google Cloud AI in Beijing as a research intern during my Ph.D.

Learning Counterfactually Invariant Predictors
Francesco Quinzan, Cecilia Casolo, Krikamol Muandet, Niki Kilbertus, Yucen Luo
Improving Generative Moment Matching Networks with Distribution Partition
Yong Ren, Yucen Luo, Jun Zhu
Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI), 2021.
SUMO: Unbiased Estimation of Log Marginal Probability for Latent Variable Models
Yucen Luo*, Alex Beatson, Mohammad Norouzi, Jun Zhu, David Duvenaud, Ryan P. Adams, Ricky T. Q. Chen*
International Conference on Learning Representations (ICLR), 2020 (Spotlight).
DBSN: Measuring Uncertainty through Bayesian Learning of Deep Neural Network Structures
Zhijie Deng, Yucen Luo, Jun Zhu, Bo Zhang
2nd Workshop on Neural Architecture Search at ICLR 2021.
Cluster Alignment with a Teacher for Unsupervised Domain Adaptation [code]
Zhijie Deng, Yucen Luo, Jun Zhu
International Conference on Computer Vision (ICCV), 2019.
Semi-crowdsourced Clustering with Deep Generative Models [code, poster]
Yucen Luo, Tian Tian, Jiaxin Shi, Jun Zhu, Bo Zhang
Advances in Neural Information Processing Systems (NeurIPS), 2018.
Initial version in ICML 2018 Workshop on Theoretical Foundations and Applications of Deep Generative Models.
Smooth Neighbors on Teacher Graphs for Semi-supervised Learning [code, poster]
Yucen Luo, Jun Zhu, Mengxi Li, Yong Ren, Bo Zhang
IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2018 (Spotlight).
Best paper award at NeurIPS 2017 Workshop on Learning with Limited Labeled Data.
ZhuSuan: A Library for Bayesian Deep Learning [code]
Jiaxin Shi, Jianfei Chen, Jun Zhu, Shengyang Sun, Yucen Luo, Yihong Gu, Yuhao Zhou
arXiv:1709.05870, 2017.
Conditional Generative Moment-Matching Networks [code]
Yong Ren, Jialian Li, Yucen Luo, Jun Zhu
Advances in Neural Information Processing Systems (NeurIPS), 2016.