Session III.2 - Approximation Theory

Monday, June 19, 17:30 ~ 18:30

Efficient Neural Group Invariant Embeddings

Nadav Dym

Technion, Israel

In many machine learning tasks, the goal is to learn an unknown function which has some known group symmetries. Equivariant machine learning algorithms exploit this by devising architectures (=function spaces) which have these symmetries by construction. Examples include convolutional neural networks which respect translation symmetries, neural networks for graphs or sets which respect their permutation symmetries, or neural networks for 3D point sets which additionally respect Euclidean symmetries.
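To make the "symmetries by construction" idea concrete, here is a minimal illustrative sketch (not taken from the talk) of a set network in the DeepSets style: because the per-point features are pooled by a sum, the output is invariant to any permutation of the input points. The function name and weight shapes are assumptions made for illustration.

```python
import numpy as np

def set_network_forward(X, W1, W2):
    """Toy permutation-invariant set network.

    X  : (n, d) array of n points in R^d
    W1 : (d, h) per-point feature weights
    W2 : (h, k) readout weights applied after pooling
    """
    H = np.maximum(X @ W1, 0.0)   # per-point ReLU features, shape (n, h)
    pooled = H.sum(axis=0)        # sum pooling kills the permutation action, (h,)
    return pooled @ W2            # final readout, (k,)
```

Permuting the rows of X permutes the rows of H, and the sum over rows is unchanged, so the output is invariant by construction.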

A common theoretical requirement for symmetry-based architectures is that they be able to approximate any continuous function with the same symmetries. By the Stone–Weierstrass theorem, this boils down to the ability of a function class to separate any two objects which are not related by a group symmetry. We will review results showing that, under very general assumptions, such a symmetry-preserving separating mapping f exists, and that the embedding dimension m can be taken to be roughly twice the dimension of the data. We will then propose a general methodology for efficient computation of such an f using random invariants. This methodology generalizes the algebraic-geometry argument used in the well-known proof of phase retrieval injectivity. Finally, we will show how this result can be used to achieve efficient separation for point sets with respect to permutation and/or rotation actions.
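As a flavor of how random invariants can separate point sets under the permutation action, here is a minimal sketch in the spirit of [1] (the exact construction and parameter choices are assumptions for illustration, not the authors' precise method): project the points onto random directions, sort each list of projections to remove the permutation, and mix the sorted values with random weights.

```python
import numpy as np

def random_sort_embedding(X, m, seed=None):
    """Illustrative random permutation-invariant embedding of a point set.

    X    : (n, d) array, a set of n points in R^d
    m    : embedding dimension (the abstract's regime suggests roughly
           twice the dimension of the data suffices for separation)
    seed : optional seed for reproducibility
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.standard_normal((m, d))   # one random direction per feature
    B = rng.standard_normal((m, n))   # random weights applied after sorting
    proj = X @ W.T                    # projections of all points, (n, m)
    sorted_proj = np.sort(proj, axis=0)  # sorting removes the row order
    # Each output feature is a random linear functional of one sorted column.
    return np.einsum('mn,nm->m', B, sorted_proj)
```

Since sorting is invariant to row permutations, f(X) = f(PX) for any permutation matrix P; the interesting part, addressed in the talk, is that with high probability such random features also separate non-equivalent point sets.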

Based on the papers:

[1] Nadav Dym and Steven J. Gortler, Low Dimensional Invariant Embeddings for Universal Geometric Learning.

[2] Snir Hordan, Tal Amir, Steven J. Gortler and Nadav Dym, Complete Neural Networks for Euclidean Graphs.

Joint work with Steven J. Gortler (Harvard University, USA), Snir Hordan (Technion, Israel) and Tal Amir (Technion, Israel).
