Generalized Teacher Forcing for Learning Chaotic Dynamics

Jul 24, 2023


About

Chaotic dynamical systems (DS) are ubiquitous in nature and society. Often we are interested in reconstructing such systems from observed time series for prediction or mechanistic insight, where by reconstruction we mean learning the geometrical and invariant temporal properties of the system in question. However, training reconstruction algorithms like recurrent neural networks (RNNs) on such systems by gradient-descent-based techniques faces severe challenges, mainly due to the exploding gradients caused by the exponential divergence of trajectories in chaotic systems. Moreover, for (scientific) interpretability we want reconstructions that are as low-dimensional as possible, preferably in a model which is mathematically tractable. Here we report that a surprisingly simple modification of teacher forcing leads to provably strictly all-time bounded gradients in training on chaotic systems, while still learning to faithfully represent their dynamics. Furthermore, we observed that a simple architectural rearrangement of a tractable RNN design, piecewise-linear RNNs (PLRNNs), allows the reconstruction dimension to be reduced to at most that of the observed system (or less). We show on several DS benchmarks and real-world data that with these amendments we can reconstruct DS much better than current state-of-the-art (SOTA) models, in much lower dimensions. Performance differences were particularly compelling on real-world data, with which most other methods severely struggled. This work thus yields a simple yet powerful DS reconstruction algorithm which is highly interpretable at the same time.
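The abstract describes interpolating between freely generated model states and data-derived states during training, so that trajectories cannot diverge and gradients stay bounded. The sketch below illustrates that general idea on a toy piecewise-linear RNN; the specific parameterization, the dimensions, the interpolation form `z = alpha * d + (1 - alpha) * z_hat`, and all variable names are illustrative assumptions, not the authors' exact method (in particular, how the forcing strength is chosen is part of the paper and not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes (not from the talk): latent dim M, data length T
M, T = 3, 50

# One common PLRNN formulation: z_t = A z_{t-1} + W relu(z_{t-1}) + h
A = np.diag(rng.uniform(0.5, 0.9, M))   # diagonal linear part
W = 0.1 * rng.standard_normal((M, M))   # piecewise-linear coupling
h = rng.standard_normal(M)

def plrnn_step(z):
    # ReLU makes the map piecewise linear, hence mathematically tractable
    return A @ z + W @ np.maximum(z, 0.0) + h

def gtf_rollout(z0, data_states, alpha):
    """Generalized-teacher-forcing-style rollout: after each free-running
    step, pull the latent state toward a data-derived state by a fixed
    interpolation factor alpha. alpha = 1 recovers classical teacher
    forcing; alpha = 0 is a fully free-running (unforced) trajectory."""
    zs, z = [], z0
    for d in data_states:
        z_hat = plrnn_step(z)                  # model prediction
        z = alpha * d + (1.0 - alpha) * z_hat  # interpolate with data state
        zs.append(z)
    return np.stack(zs)

# Stand-in for latent states inferred from observations
data_states = rng.standard_normal((T, M))
traj = gtf_rollout(np.zeros(M), data_states, alpha=0.3)
```

Because every step is contracted toward a bounded data state, the forced trajectory cannot blow up even if the free-running model is chaotic, which is the intuition behind the bounded-gradient result.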


ICML 2023