Learning invariant weights in neural networks

Tycho F.A. van der Ouderaa and Mark van der Wilk

Abstract

Assumptions about invariances or symmetries in data can significantly increase the predictive power of statistical models. Many commonly used machine learning models are constrained to respect certain symmetries, such as translation equivariance in convolutional neural networks, and incorporating other symmetry types is actively being studied. Yet, learning invariances from the data itself remains an open research problem. It has been shown that the marginal likelihood offers a principled way to learn invariances in Gaussian processes. We propose a weight-space equivalent to this approach: by maximizing a lower bound on the marginal likelihood, we learn invariances in neural networks, resulting in naturally higher performing models.
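The core idea can be illustrated with a toy sketch (not the authors' implementation; all names, the 1-D "shift" invariance, and the Jensen-style bound below are illustrative assumptions): a learnable parameter `theta` controls the range of input transformations, and a Monte Carlo average over sampled transformations gives a tractable lower bound on the log marginal likelihood, so `theta` can be trained alongside the model weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(w, x):
    # Toy model: a single logistic unit. Stands in for a neural network.
    return 1.0 / (1.0 + np.exp(-w * x))

def lower_bound(w, theta, xs, ys, n_samples=64):
    """Monte Carlo estimate of a Jensen-style lower bound on the log
    marginal likelihood under a learnable shift-invariance of width theta:
        E_g[log p(y | g(x))] <= log E_g[p(y | g(x))].
    (Illustrative objective, not the paper's exact bound.)"""
    shifts = rng.uniform(-theta, theta, size=n_samples)  # sampled transformations
    total = 0.0
    for x, y in zip(xs, ys):
        p = predict(w, x + shifts)              # predictions under each shift
        p = np.clip(p, 1e-9, 1 - 1e-9)          # numerical safety for the logs
        total += np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
    return total / len(xs)
```

If the data truly are shift-invariant, the bound favors larger `theta` (more averaging, same fit); if not, widening `theta` hurts the likelihood, so the learned invariance shrinks. This trade-off is what lets the marginal-likelihood objective select the invariance from data.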

Book Title
Proceedings of the Thirty-Eighth Conference on Uncertainty in Artificial Intelligence (UAI)
Editor
Cussens, James and Zhang, Kun
Month
Aug
Pages
1992–2001
Publisher
PMLR
Series
Proceedings of Machine Learning Research
Volume
180
Year
2022