Dec 6, 2022
Predicting the evolution of the Earth system has been studied for centuries because of its significant impact on human lives. Conventionally, Earth system (e.g., weather and climate) forecasting relies on numerical simulation of complex physical models and is therefore expensive in both computational resources and domain expertise. With the explosive growth of Earth observation data in the past decade, data-driven models that apply Deep Learning (DL) have demonstrated impressive potential for various Earth system forecasting tasks. So far, these DL models have mainly used Convolutional Neural Networks (CNNs) or Recurrent Neural Networks (RNNs) as their basic building blocks. The Transformer architecture, despite its broad success in other domains, has seen limited adoption for Earth system forecasting. In this paper, we propose Earthformer, a space-time Transformer for Earth system forecasting. Earthformer is based on a generic, flexible, and efficient space-time attention block, named Cuboid Attention, which decomposes the data into cuboids and applies cuboid-level self-attention in parallel; these cuboids are further connected through a collection of global vectors. We conduct experiments on the MovingMNIST dataset and a newly proposed chaotic N-body MNIST dataset to verify the effectiveness of cuboid attention and to identify the best design for Earthformer. Experiments on two real-world benchmarks, precipitation nowcasting and El Niño/Southern Oscillation (ENSO) forecasting, show that Earthformer achieves state-of-the-art performance.
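To make the cuboid-attention idea in the abstract concrete, below is a minimal sketch in PyTorch. The (B, T, H, W, C) tensor layout, the (bt, bh, bw) cuboid size, the use of `nn.MultiheadAttention`, and the way the global vectors are simply appended to every cuboid are illustrative assumptions chosen for exposition, not the paper's exact design or its official implementation.

```python
# Minimal sketch of cuboid-decomposed space-time self-attention.
# Assumptions (not from the paper): fixed non-overlapping cuboids with no
# padding/shifting, a single nn.MultiheadAttention pass per cuboid, and
# global vectors that are merely appended to each cuboid's tokens.
import torch
import torch.nn as nn


class CuboidSelfAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4,
                 cuboid=(2, 4, 4), num_global: int = 8):
        super().__init__()
        self.cuboid = cuboid  # (bt, bh, bw) local block size along (T, H, W)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Learned global vectors shared by all cuboids (simplified handling).
        self.global_vectors = nn.Parameter(torch.randn(num_global, dim) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, T, H, W, C); in this sketch T, H, W must be divisible
        # by the cuboid size (no padding logic).
        B, T, H, W, C = x.shape
        bt, bh, bw = self.cuboid
        nt, nh, nw = T // bt, H // bh, W // bw

        # Decompose into non-overlapping cuboids: (B * #cuboids, bt*bh*bw, C).
        x = x.view(B, nt, bt, nh, bh, nw, bw, C)
        x = x.permute(0, 1, 3, 5, 2, 4, 6, 7)
        x = x.reshape(B * nt * nh * nw, bt * bh * bw, C)

        # Append the shared global vectors so every cuboid can exchange
        # information with them, then attend within each cuboid in parallel.
        g = self.global_vectors.unsqueeze(0).expand(x.size(0), -1, -1)
        tokens = torch.cat([x, g], dim=1)
        out, _ = self.attn(tokens, tokens, tokens)
        x = out[:, : bt * bh * bw]  # drop the global slots again

        # Merge the cuboids back into the original (B, T, H, W, C) layout.
        x = x.view(B, nt, nh, nw, bt, bh, bw, C)
        x = x.permute(0, 1, 4, 2, 5, 3, 6, 7).reshape(B, T, H, W, C)
        return x


if __name__ == "__main__":
    layer = CuboidSelfAttention(dim=32)
    video = torch.randn(2, 4, 16, 16, 32)  # (batch, time, height, width, channels)
    print(layer(video).shape)              # torch.Size([2, 4, 16, 16, 32])
```

Because attention is restricted to each cuboid (plus a handful of global tokens), the cost grows with the cuboid volume rather than with the full T*H*W sequence length, which is the efficiency argument behind decomposing space-time attention this way.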