A Hierarchical Spatial Transformer for Massive Point Samples in Continuous Space

Dec 10, 2023

Abstract
Given a set of point samples in continuous space with explanatory features and target response variables, the problem is to learn a spatially structured representation of the continuous space and to infer the target variable at any new point location. The problem is fundamental, with broad applications such as coastal water quality monitoring, air quality monitoring, and operator learning for physical simulations. However, the problem is challenging due to the implicit dependency structure over irregular point locations in continuous space, the potentially high computational cost of modeling long-range interactions between a large number of points, and the risk of over-confident predictions due to local point sparsity. Existing works either assume a given spatial neighbor structure or infer unknown spatial dependency under a rigid distributional assumption. In recent years, the transformer architecture has been widely used to learn spatial representations, but existing methods are primarily designed for regular grids or graphs and thus cannot be directly applied to irregular point samples in continuous space. There are also works on operator learning for numerical simulations in continuous space, but these methods often do not address hierarchical spatial representation on irregular points. To fill this gap, this paper proposes a new hierarchical spatial transformer model for a large number of irregular point samples in continuous space. Our key idea is to learn a multi-scale spatial representation through a quad-tree hierarchy and to conduct efficient attention operations by using a coarser representation for distant points. We also design an uncertainty quantification component to measure spatial prediction confidence. Evaluations on several datasets confirm that our method outperforms multiple baselines from existing works.
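To make the key idea concrete, here is a minimal sketch (not the paper's implementation) of the quad-tree coarsening strategy the abstract describes: irregular 2D points are recursively partitioned into a quad-tree, each node keeps a coarse summary (centroid and mean value), and a query attends to nearby points individually while distant subtrees are replaced by their coarse summaries. All function names, the Barnes-Hut-style opening criterion `theta`, the kernel-attention weighting, and the parameter values are illustrative assumptions, not the authors' design.

```python
import numpy as np

def build_quadtree(points, values, bbox, depth=0, max_depth=6, leaf_size=4):
    """Recursively partition 2D points; each node stores a coarse summary
    (centroid, mean value) of all points it covers. bbox = (x0, y0, x1, y1)."""
    node = {
        "centroid": points.mean(axis=0),
        "mean_value": values.mean(),
        "size": max(bbox[2] - bbox[0], bbox[3] - bbox[1]),
        "points": points,
        "values": values,
        "children": [],
    }
    if depth < max_depth and len(points) > leaf_size:
        x0, y0, x1, y1 = bbox
        xm, ym = (x0 + x1) / 2, (y0 + y1) / 2
        for cb in [(x0, y0, xm, ym), (xm, y0, x1, ym),
                   (x0, ym, xm, y1), (xm, ym, x1, y1)]:
            m = ((points[:, 0] >= cb[0]) & (points[:, 0] < cb[2]) &
                 (points[:, 1] >= cb[1]) & (points[:, 1] < cb[3]))
            if m.any():
                node["children"].append(build_quadtree(
                    points[m], values[m], cb, depth + 1, max_depth, leaf_size))
    return node

def gather(node, query, theta=0.5):
    """Collect attention entries for a query location: individual points
    nearby, one coarse centroid per sufficiently distant subtree
    (a Barnes-Hut-style size/distance opening rule, assumed here)."""
    dist = np.linalg.norm(query - node["centroid"])
    if not node["children"]:
        return list(zip(node["points"], node["values"]))
    if node["size"] < theta * dist:  # far away -> use the coarse summary
        return [(node["centroid"], node["mean_value"])]
    out = []
    for child in node["children"]:
        out.extend(gather(child, query, theta))
    return out

def predict(root, query, tau=0.1):
    """Distance-kernel attention over the gathered multi-scale entries
    (a stand-in for the learned attention in the actual model)."""
    query = np.asarray(query, dtype=float)
    locs, vals = zip(*gather(root, query))
    d2 = ((np.array(locs) - query) ** 2).sum(axis=1)
    w = np.exp(-d2 / tau)
    w /= w.sum()
    return float(w @ np.array(vals))
```

The point of the sketch is the complexity argument: a query touches each nearby leaf point but only one summary per distant subtree, so the number of attention entries grows far more slowly than the total point count, which is what makes attention over massive irregular point sets tractable.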
