Virtual presentation / poster accept

Simple initialization and parametrization of sinusoidal networks via their kernel bandwidth

Filipe Belbute-Peres · Zico Kolter

Keywords: [ Deep Learning and representational learning ] [ neural tangent kernel ] [ implicit models ] [ periodic ] [ physics informed ] [ sinusoidal ]


Abstract:

Neural networks with sinusoidal activations have been proposed as an alternative to networks with traditional activation functions. Despite their promise, particularly for learning implicit models, their training behavior is not yet fully understood, leading to a number of empirical design choices that are not well justified. In this work, we first propose a simplified version of such sinusoidal neural networks, which allows both for easier practical implementation and simpler theoretical analysis. We then analyze the behavior of these networks from the neural tangent kernel perspective and demonstrate that their kernel approximates a low-pass filter with an adjustable bandwidth. Finally, we utilize these insights to inform the sinusoidal network initialization, optimizing their performance for each of a series of tasks, including learning implicit models and solving differential equations.
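To make the abstract's central idea concrete, the following is a minimal sketch of a sinusoidal (SIREN-style) network in which a single frequency scale `omega` on the first layer acts as the bandwidth knob described above. The parametrization here (uniform weight bounds, `omega` applied only to the first layer's pre-activations) is an illustrative assumption based on common sinusoidal-network conventions, not the paper's exact scheme.

```python
import numpy as np

def init_sinusoidal_net(layer_sizes, omega=30.0, seed=0):
    """Initialize a sinusoidal MLP.

    `omega` scales the first-layer frequencies; larger values widen the
    effective (low-pass) kernel bandwidth. The bounds used here are a
    hypothetical choice, not the paper's exact initialization.
    """
    rng = np.random.default_rng(seed)
    params = []
    for i, (n_in, n_out) in enumerate(zip(layer_sizes[:-1], layer_sizes[1:])):
        # First layer: small uniform bound; hidden layers: rescale by omega.
        bound = 1.0 / n_in if i == 0 else np.sqrt(6.0 / n_in) / omega
        W = rng.uniform(-bound, bound, size=(n_out, n_in))
        b = np.zeros(n_out)
        params.append((W, b))
    return params

def forward(params, x, omega=30.0):
    """Forward pass: sine activations on hidden layers, linear output."""
    h = x
    for i, (W, b) in enumerate(params[:-1]):
        pre = h @ W.T + b
        # omega multiplies only the first layer, setting the base frequency.
        h = np.sin(omega * pre) if i == 0 else np.sin(pre)
    W, b = params[-1]
    return h @ W.T + b
```

In this sketch, tuning `omega` per task plays the role of the bandwidth-informed initialization the abstract refers to: a low `omega` biases the network toward smooth functions, while a high `omega` lets it fit high-frequency signal content.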
