

Poster

Butterfly Effects of SGD Noise: Error Amplification in Behavior Cloning and Autoregression

Adam Block · Dylan Foster · Akshay Krishnamurthy · Max Simchowitz · Cyril Zhang

Halle B #58
Wed 8 May 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

This work studies training instabilities of behavior cloning with deep neural networks. We observe that minibatch SGD updates to the policy network during training result in sharp oscillations in long-horizon rewards, despite negligibly affecting the behavior cloning loss. We empirically disentangle the statistical and computational causes of these oscillations, and find them to stem from the chaotic propagation of minibatch SGD noise through unstable closed-loop dynamics. While SGD noise is benign in the single-step action prediction objective, it results in catastrophic error accumulation over long horizons, an effect we term gradient variance amplification (GVA). We demonstrate that many standard mitigation techniques do not alleviate GVA, but that taking an exponential moving average (EMA) of iterates is surprisingly effective at doing so. Furthermore, we illustrate the generality of the phenomenon by showing both the existence of GVA and its amelioration by EMA in autoregressive language generation. Finally, we provide theoretical vignettes both exhibiting the benefits of EMA in alleviating GVA and illustrating the extent to which classical convex models help in understanding the benefits of iterate averaging in deep learning.
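To make the EMA-of-iterates mitigation concrete, here is a minimal sketch of maintaining an exponential moving average of the policy network's parameters alongside ordinary minibatch SGD and evaluating with the averaged iterate. The toy network, synthetic behavior-cloning data, and the decay value `ema_decay = 0.999` are illustrative assumptions, not details taken from the paper.

```python
# Sketch: exponential moving average (EMA) of SGD iterates as a smoother of training noise.
import copy
import torch
import torch.nn as nn

def update_ema(ema_model: nn.Module, model: nn.Module, decay: float) -> None:
    """In-place EMA update: ema_param <- decay * ema_param + (1 - decay) * param."""
    with torch.no_grad():
        for ema_p, p in zip(ema_model.parameters(), model.parameters()):
            ema_p.mul_(decay).add_(p, alpha=1.0 - decay)

# Toy policy network and synthetic behavior-cloning data (state -> action).
model = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 2))
ema_model = copy.deepcopy(model)              # shadow copy holding the averaged iterate
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
ema_decay = 0.999                             # illustrative decay rate (assumption)

for step in range(1000):
    states = torch.randn(32, 8)               # stand-in for a minibatch of expert states
    actions = torch.randn(32, 2)              # stand-in for the expert's actions
    loss = nn.functional.mse_loss(model(states), actions)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    update_ema(ema_model, model, ema_decay)   # smooth the iterate after every SGD step

# Long-horizon rollouts / evaluation would then use ema_model rather than the raw SGD iterate.
```

The key design point, per the abstract, is that the averaging is applied to the parameters across SGD steps (not to actions or rewards), so minibatch noise that would otherwise propagate through unstable closed-loop dynamics is damped before the policy is rolled out.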
