Session II.4 - Foundations of Data Science and Machine Learning

Friday, June 16, 16:30 ~ 17:30

Information theory through kernel methods

Francis Bach

Inria - Ecole Normale Supérieure, France

Estimating and computing entropies of probability distributions are key computational tasks throughout data science. In many situations, the underlying distributions are only known through the expectation of some feature vectors, which has led to a series of works within kernel methods. In this talk, I will explore the particular situation where the feature vector is a rank-one positive definite matrix, and show how the associated expectation (a covariance matrix) can be used with information divergences from quantum information theory to draw direct links with the classical notion of Shannon entropy.
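
To make the setting concrete, the following is a minimal numerical sketch in Python, not taken from the talk: it assumes a rank-one feature x -> phi(x) phi(x)^T built from (illustrative) random Fourier features, forms the empirical covariance Sigma = E[phi(x) phi(x)^T], and evaluates the von Neumann entropy -tr(Sigma log Sigma) from quantum information theory as a kernel-based entropy-like quantity. The choice of kernel, feature dimension, and normalization are assumptions for illustration only.

    import numpy as np

    rng = np.random.default_rng(0)

    # Samples from an (unknown) distribution on the real line.
    x = rng.normal(size=1000)

    # Random Fourier features approximating a Gaussian kernel (illustrative choice).
    d = 200                                  # feature dimension (assumed)
    w = rng.normal(size=d)                   # random frequencies
    b = rng.uniform(0.0, 2.0 * np.pi, size=d)
    phi = np.sqrt(2.0 / d) * np.cos(np.outer(x, w) + b)   # shape (n, d)

    # Empirical covariance Sigma = (1/n) sum_i phi(x_i) phi(x_i)^T,
    # the expectation of the rank-one positive definite feature matrix.
    sigma = phi.T @ phi / len(x)

    # Normalize to unit trace so the eigenvalues form a probability
    # distribution, as for a quantum density matrix.
    sigma /= np.trace(sigma)

    # Von Neumann entropy -tr(Sigma log Sigma), computed via the spectrum.
    eigs = np.linalg.eigvalsh(sigma)
    eigs = eigs[eigs > 1e-12]                # drop numerically zero eigenvalues
    vn_entropy = -np.sum(eigs * np.log(eigs))
    print(f"von Neumann entropy of the kernel covariance: {vn_entropy:.3f}")

The point of the sketch is only the pipeline (samples -> feature covariance -> spectral entropy); how such quantities relate to classical Shannon entropies is the subject of the talk.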
