Source: arXiv

Scalable Spatiotemporal Inference with Biased Scan Attention Transformer Neural Processes

  • Neural Processes (NPs) are models that learn the posterior predictive distribution of a stochastic process.
  • Modern NPs are applied to complex domains such as geology, epidemiology, climate, and robotics.
  • Because these applications are data-hungry, the scalability of NPs has become crucial.
  • A new architecture, the Biased Scan Attention Transformer Neural Process (BSA-TNP), is proposed.
  • BSA-TNP introduces Kernel Regression Blocks (KRBlocks) and group-invariant attention biases (a minimal sketch of such a bias follows this list).
  • For scalability, BSA-TNP uses memory-efficient Biased Scan Attention (BSA).
  • BSA-TNP matches or exceeds the accuracy of the best models while training faster.
  • It exhibits translation invariance and can learn at multiple resolutions simultaneously.
  • BSA-TNP can model processes that evolve in both space and time, and it supports high-dimensional fixed effects.
  • The model can run inference with over 1M test points and 100K context points in under a minute on a single 24GB GPU.
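
The paper's KRBlocks and Biased Scan Attention kernels are not reproduced here, but the group-invariant bias idea can be illustrated compactly. Below is a minimal, hypothetical PyTorch sketch (names like `TranslationInvariantBiasedAttention` and `bias_mlp` are my own, not the authors' API): the attention logits receive a learned bias that depends only on the difference between query and key coordinates, so shifting every input location by the same offset leaves the attention weights, and hence the output, unchanged.

```python
# Minimal sketch (an assumption, not the paper's implementation) of a
# translation-invariant attention bias: logits get a learned bias that
# depends only on coordinate offsets (x_q - x_k), so translating all
# locations by the same amount does not change the attention pattern.
import torch
import torch.nn as nn


class TranslationInvariantBiasedAttention(nn.Module):
    def __init__(self, dim: int, coord_dim: int, hidden: int = 32):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        # Maps each pairwise coordinate offset to a scalar logit bias.
        self.bias_mlp = nn.Sequential(
            nn.Linear(coord_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )
        self.scale = dim ** -0.5

    def forward(self, feats_q, coords_q, feats_k, coords_k):
        # feats_*: (B, N, dim); coords_*: (B, N, coord_dim)
        q, k, v = self.q_proj(feats_q), self.k_proj(feats_k), self.v_proj(feats_k)
        logits = torch.einsum("bqd,bkd->bqk", q, k) * self.scale
        # Pairwise offsets: (B, Nq, Nk, coord_dim). Only differences enter
        # the bias, which is what makes it translation invariant.
        offsets = coords_q.unsqueeze(2) - coords_k.unsqueeze(1)
        logits = logits + self.bias_mlp(offsets).squeeze(-1)
        attn = logits.softmax(dim=-1)
        return torch.einsum("bqk,bkd->bqd", attn, v)


# Shifting queries and keys by the same offset leaves the output unchanged.
attn = TranslationInvariantBiasedAttention(dim=64, coord_dim=2)
f_q, x_q = torch.randn(1, 8, 64), torch.randn(1, 8, 2)
f_k, x_k = torch.randn(1, 16, 64), torch.randn(1, 16, 2)
out = attn(f_q, x_q, f_k, x_k)
shifted = attn(f_q, x_q + 5.0, f_k, x_k + 5.0)
assert torch.allclose(out, shifted, atol=1e-5)
```

Note that this sketch materializes the dense (Nq x Nk) bias tensor; that is precisely the memory bottleneck the paper's memory-efficient Biased Scan Attention is designed to avoid.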

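The headline scale in the last bullet is what motivates a memory-efficient attention. A back-of-envelope check (my arithmetic, not a figure from the paper):

```python
# Back-of-envelope: a dense cross-attention matrix between 1M test
# points and 100K context points cannot be materialized on a 24GB GPU,
# which is why a blockwise/scan formulation is needed. (Assumes fp16
# logits, one head, batch size 1; real footprints are larger.)
n_test, n_context = 1_000_000, 100_000
bytes_per_logit = 2                              # fp16
dense_gb = n_test * n_context * bytes_per_logit / 1e9
print(f"{dense_gb:.0f} GB")                      # 200 GB >> 24 GB
```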