Workshops Period II.

June 15, 16, 17

II.4: Foundations of Data Science and Machine Learning

Room 108 (corridor 44-45)


Preliminary program

This schedule is preliminary and may be updated.

Thursday, June 15
14:30 ~ 15:00 On structured linear measurements for tensor data recovery
Liza Rebrova - Princeton University, USA
15:00 ~ 15:30 Dimension-free limits of stochastic gradient descent for two-layer neural networks
Bruno Loureiro - École Normale Supérieure, France
15:30 ~ 16:00 To split or not to split that is the question: From cross validation to debiased machine learning
Morgane Austern - Harvard University, United States
16:30 ~ 17:00 Spectral methods for clustering signed and directed networks and heterogeneous group synchronization
Mihai Cucuringu - University of Oxford, United Kingdom
17:00 ~ 17:30 Bilipschitz invariants
Dustin Mixon - The Ohio State University, USA
17:30 ~ 18:00 Near-Optimal Bounds for Generalized Orthogonal Procrustes Problem via Generalized Power Method
Shuyang Ling - New York University Shanghai, China
Friday, June 16
14:30 ~ 15:00 The Monge Gap: A Regularizer to Learn Optimal Transport Maps
Marco Cuturi - Apple / CREST-ENSAE, France
15:00 ~ 15:30 Shifted divergences for sampling, privacy, and beyond
Jason Altschuler - NYU / UPenn, USA
15:30 ~ 16:00 Bilevel optimization for machine learning
Pierre Ablin - Apple, France
16:30 ~ 17:30 Information theory through kernel methods
Francis Bach - Inria / École Normale Supérieure, France
Saturday, June 17
15:00 ~ 15:30 Neural Networks as Sparse Local Lipschitz Functions: Robustness and Generalization
Jeremias Sulam - Johns Hopkins University, United States
15:30 ~ 16:00 Understanding Deep Representation Learning via Neural Collapse
Qing Qu - University of Michigan, United States
16:30 ~ 17:30 Robust regression revisited
Po-Ling Loh - University of Cambridge, United Kingdom
