I received my Ph.D. in Computer Science from Stanford University, where I was advised by Stefano Ermon and affiliated with the SAIL and StatML groups. My research centers on machine learning with limited labeled supervision, and is currently focused on developing techniques for better adaptation and controllability in deep generative models. I ground my methodological work in societal applications motivated by problems in personalization and fairness.

My research was supported by the NSF GRFP, the Stanford Graduate Fellowship, the Qualcomm Innovation Fellowship, and the Two Sigma Diversity PhD Fellowship. I completed my undergraduate studies in computer science and statistics at Columbia, where I worked on problems in computational biology as part of the Pe'er lab.

I previously interned at Google Brain in 2019 as part of the Magenta project. In my free time, I'm an avid tennis player, runner, and food enthusiast!

Preprints

LMPriors: Pre-Trained Language Models as Task-Specific Priors.
Kristy Choi*, Chris Cundy*, Sanjari Srivastava, Stefano Ermon
Foundation Models for Decision Making Workshop, NeurIPS 2022.
[arXiv][code soon]

Publications

Neural Network Compression for Noisy Storage Devices.
Berivan Isik, Kristy Choi, Xin Zheng, Tsachy Weissman, Stefano Ermon, H.-S. Philip Wong, Armin Alaghi
ACM Transactions on Embedded Computing Systems, 2023.
[arXiv][code soon]
Concrete Score Matching: Generalized Score Matching for Discrete Data.
Chenlin Meng*, Kristy Choi*, Jiaming Song, Stefano Ermon
Neural Information Processing Systems (NeurIPS), 2022.
[pdf][code soon]
ButterflyFlow: Building Invertible Layers with Butterfly Matrices.
Chenlin Meng*, Linqi Zhou*, Kristy Choi*, Tri Dao, Stefano Ermon
International Conference on Machine Learning (ICML), 2022.
[pdf][code soon]
Density Ratio Estimation via Infinitesimal Classification.
Kristy Choi*, Chenlin Meng*, Yang Song, Stefano Ermon
International Conference on Artificial Intelligence and Statistics (AISTATS), 2022.
Oral presentation [Top 2.6%]
[arXiv][code]
Featurized Density Ratio Estimation.
Kristy Choi*, Madeline Liao*, Stefano Ermon
Uncertainty in Artificial Intelligence (UAI), 2021.
[arXiv][code]
Robust Representation Learning via Perceptual Similarity Metrics.
Saeid Taghanaki*, Kristy Choi*, Amir Khasahmadi, Anirudh Goyal
International Conference on Machine Learning (ICML), 2021.
[arXiv][code soon]
Encoding Musical Style with Transformer Autoencoders.
Kristy Choi, Curtis Hawthorne, Ian Simon, Monica Dinculescu, Jesse Engel
International Conference on Machine Learning (ICML), 2020.
[arXiv][code]
Fair Generative Modeling via Weak Supervision.
Kristy Choi*, Aditya Grover*, Trisha Singh, Rui Shu, Stefano Ermon
International Conference on Machine Learning (ICML), 2020.
[arXiv][code][press]
Meta-Amortized Variational Inference and Learning.
Mike Wu*, Kristy Choi*, Noah Goodman, Stefano Ermon
AAAI Conference on Artificial Intelligence (AAAI), 2020.
[arXiv]
Neural Joint-Source Channel Coding.
Kristy Choi, Kedar Tatwawadi, Aditya Grover, Tsachy Weissman, Stefano Ermon
International Conference on Machine Learning (ICML), 2019.
Oral presentation
[pdf][code]
Single-cell map of diverse immune phenotypes in the breast tumor microenvironment.
Elham Azizi, Ambrose Carr, George Plitas, Andrew Cornish, Catherine Konopacki, Sandhya Prabhakaran, Juozas Nainys, Kenmin Wu, Vaidotas Kiseliovas,
Manu Setty, Kristy Choi, Rachel Fromme, Phuong Dao, Peter McKenney, Ruby Wasti, Krishna Kadaveru, Linas Mazutis, Alexander Rudensky, Dana Pe'er.
Cell, 174(5), 1293-1308, 2018.
[pdf]
Wishbone identifies bifurcating developmental trajectories from single-cell data.
Manu Setty, Michelle Tadmor, Shlomit Reich-Zeliger, Omer Angel, Tomer Salame, Pooja Kathail, Kristy Choi, Sean Bendall, Nir Friedman, Dana Pe'er.
Nature Biotechnology, 34(6), 637-645, 2016.
[pdf]

Workshop Papers

Tensor Decomposition for Single-cell RNA-seq Data.
Kristy Choi*, Ambrose J. Carr*, Sandhya Prabhakaran, Dana Pe'er
Practical Bayesian Nonparametrics Workshop, NeurIPS 2016.
[pdf]

Teaching

Fall 2019: Head Teaching Assistant for CS236: Deep Generative Models at Stanford

Fall 2018: Teaching Assistant for CS236: Deep Generative Models at Stanford

Spring 2017: Head Teaching Assistant for COMS4117: Machine Learning at Columbia

Service

Reviewer: TMLR, ICLR (2019-2023), ICML (2020-2022), NeurIPS (2019-2022), AISTATS (2022), UAI (2020), AAAI (2020).

Workshop Co-Organizer: Women in Machine Learning (NeurIPS 2020); Information Theory & Machine Learning (NeurIPS 2019)

Leadership: Women in Machine Learning, Board of Directors (2022-2023)