Abstract
Join Zoom Meeting
https://riken-jp.zoom.us/j/9456523132?pwd=SVVvMDNCM0JJL2hUWi8rL0NQWUlydz09
Meeting ID: 945 652 3132
Passcode: 117764
- Title
Out-of-distribution generalization
- Abstract
Supervised learning typically assumes that test data are drawn from the same distribution as the training data. This assumption does not hold in many real-world settings: consider, for example, a model trained in one hospital and deployed in another. In such a setting, the test distribution is related to the training distribution but not identical to it, so models need to generalize out of distribution. In this talk, I will discuss our work on representation learning for out-of-distribution generalization. I will construct a family of representations that generalize under changing spurious correlations, with applications to images and chest X-rays. I will also touch on some geometric aspects of generalization.
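For readers new to this setting, the toy sketch below (not taken from the talk or the referenced papers; the feature names, noise levels, and logistic-regression setup are illustrative assumptions) simulates a spurious feature whose correlation with the label flips between the training and test environments, so a model that leans on it does well in distribution but fails out of distribution.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_data(n, p_spurious):
    # Binary label y with one stable feature and one spurious feature.
    # The spurious feature agrees with y with probability p_spurious.
    y = rng.integers(0, 2, size=n)
    stable = y + 0.5 * rng.normal(size=n)              # weakly predictive, same in train and test
    agree = rng.random(n) < p_spurious
    spurious = np.where(agree, y, 1 - y) + 0.1 * rng.normal(size=n)
    return np.column_stack([stable, spurious]), y

# Training environment: the spurious feature almost always agrees with the label.
X_train, y_train = make_data(5000, p_spurious=0.95)
# Test environment: the correlation flips, mimicking a new hospital or population.
X_test, y_test = make_data(5000, p_spurious=0.05)

model = LogisticRegression().fit(X_train, y_train)
print("in-distribution accuracy:", model.score(X_train, y_train))
print("out-of-distribution accuracy:", model.score(X_test, y_test))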
References:
https://arxiv.org/pdf/2107.00520.pdf
https://arxiv.org/pdf/2210.01302.pdf
- Bio
Rajesh Ranganath is an assistant professor at NYU’s Courant Institute
of Mathematical Sciences and the Center for Data Science. He is also
affiliate faculty in the Department of Population Health at NYUMC. His
research focuses on approximate inference, causal inference,
probabilistic models, and machine learning for healthcare. Rajesh
completed his PhD at Princeton University and his BS and MS at Stanford
University. He has won several awards, including the NDSEG graduate
fellowship; the Porter Ogden Jacobus Fellowship, given to the top four
doctoral students at Princeton University; the Savage Award in Theory
and Methods; and an NSF CAREER Award.
More Information
Date: August 25, 2023 (Fri) 11:00 - 12:30
URL: https://c5dc59ed978213830355fc8978.doorkeeper.jp/events/160000