CodeBrain: Bridging Decoupled Tokenizer and Multi-Scale Architecture for EEG Foundation Model

  • Researchers introduce CodeBrain, an efficient EEG foundation model for capturing multi-scale brain dependencies.
  • CodeBrain aims to address a key limitation of traditional EEG models: their dependence on fixed channel configurations and task-specific objectives, which limits generalization across datasets.
  • CodeBrain is trained in two stages: a TFDual-Tokenizer that separately tokenizes heterogeneous temporal and frequency components of the EEG signal, followed by EEGSSM, which models multi-scale dependencies.
  • By pairing temporal and frequency tokens, the TFDual-Tokenizer enables a quadratic expansion of the discrete representation space and offers interpretability through cross-domain token analysis (a minimal sketch of this idea follows the list).
  • EEGSSM combines a global convolution architecture with sliding-window attention to capture long-range and local dependencies efficiently (also sketched after the list).
  • EEGSSM better reflects the brain's small-world topology compared to fully connected Transformer models.
  • CodeBrain is pretrained with a masked self-supervised learning objective that predicts the discrete token indices of masked patches (see the sketch following the list).
  • Experiments on 10 public EEG datasets show CodeBrain's generalizability via linear probing.
  • CodeBrain offers biologically informed and interpretable EEG modeling, laying the foundation for future neuroscience research.
  • Both code and pretraining weights for CodeBrain will be released in a future version.
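
The dual-codebook idea behind the TFDual-Tokenizer can be illustrated with a small toy. The code below is a sketch under assumptions, not the paper's implementation: the `DualVQTokenizer` name, the encoders, and the codebook sizes are all hypothetical. It only shows how quantizing a temporal view and a frequency view of each EEG patch against separate codebooks yields two discrete indices per patch, so the joint vocabulary grows as k_time × k_freq (the quadratic expansion mentioned above).

```python
# Toy sketch of dual (temporal + frequency) tokenization -- illustrative only.
import torch
import torch.nn as nn

class DualVQTokenizer(nn.Module):  # hypothetical name and layout
    def __init__(self, patch_len=200, dim=64, k_time=512, k_freq=512):
        super().__init__()
        self.time_enc = nn.Linear(patch_len, dim)            # temporal-view encoder
        self.freq_enc = nn.Linear(patch_len // 2 + 1, dim)   # rFFT-magnitude encoder
        self.time_codebook = nn.Embedding(k_time, dim)
        self.freq_codebook = nn.Embedding(k_freq, dim)

    @staticmethod
    def nearest_code(z, codebook):
        # index of the closest codebook vector for each patch embedding
        return torch.cdist(z, codebook.weight).argmin(dim=-1)

    def forward(self, patches):
        # patches: (batch, patch_len) single-channel EEG patches
        t_idx = self.nearest_code(self.time_enc(patches), self.time_codebook)
        f_mag = torch.fft.rfft(patches, dim=-1).abs()         # frequency view
        f_idx = self.nearest_code(self.freq_enc(f_mag), self.freq_codebook)
        return t_idx, f_idx   # two ids per patch -> k_time * k_freq joint tokens

tok = DualVQTokenizer()
t_idx, f_idx = tok(torch.randn(8, 200))   # 8 fake EEG patches
print(t_idx.shape, f_idx.shape)           # torch.Size([8]) torch.Size([8])
```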

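Similarly, the pairing of a global convolution with sliding-window attention in EEGSSM can be sketched as a single layer. This too is a rough sketch under stated assumptions (the `GlobalConvLocalAttn` name, the sizes, and the FFT-based circular convolution are placeholders, not the paper's EEGSSM): one long learned kernel spans the whole token sequence for long-range structure, while attention is masked to a local window for nearby structure.

```python
# Rough sketch: global convolution for long-range structure + windowed attention
# for local structure. Names and sizes are illustrative assumptions.
import torch
import torch.nn as nn

class GlobalConvLocalAttn(nn.Module):
    def __init__(self, dim=64, seq_len=256, window=16, heads=4):
        super().__init__()
        # one learned depthwise kernel as long as the token sequence
        self.global_kernel = nn.Parameter(torch.randn(dim, seq_len) * 0.02)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.window = window

    def forward(self, x):                                     # x: (batch, seq, dim)
        b, n, d = x.shape
        # global convolution via FFT (circular here; a real model would pad)
        xf = torch.fft.rfft(x.transpose(1, 2), n=n)
        kf = torch.fft.rfft(self.global_kernel, n=n)
        g = torch.fft.irfft(xf * kf, n=n).transpose(1, 2)
        # sliding-window attention: block pairs farther apart than `window`
        idx = torch.arange(n)
        mask = (idx[None, :] - idx[:, None]).abs() > self.window
        local, _ = self.attn(x, x, x, attn_mask=mask)
        return g + local

layer = GlobalConvLocalAttn()
print(layer(torch.randn(2, 256, 64)).shape)   # torch.Size([2, 256, 64])
```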
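Finally, the masked pretraining objective can be sketched as predicting, at masked positions, the discrete indices produced by the frozen tokenizer. The helper below, its masking scheme, and the stand-in encoder are hypothetical; CodeBrain's actual objective and backbone may differ.

```python
# Toy masked token-index prediction loss -- illustrative, not CodeBrain's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

def masked_token_prediction_loss(embeddings, token_ids, encoder, head,
                                 mask_token, mask_ratio=0.5):
    # embeddings: (batch, seq, dim) patch embeddings; token_ids: (batch, seq)
    # discrete indices assigned by the frozen tokenizer.
    b, n, d = embeddings.shape
    mask = torch.rand(b, n) < mask_ratio                     # positions to mask
    x = torch.where(mask.unsqueeze(-1), mask_token.expand(b, n, d), embeddings)
    logits = head(encoder(x))                                # (batch, seq, vocab)
    return F.cross_entropy(logits[mask], token_ids[mask])    # masked positions only

# stand-in modules (any sequence encoder, e.g. an EEGSSM-style layer, fits here)
dim, vocab = 64, 512
encoder = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
head = nn.Linear(dim, vocab)
mask_token = nn.Parameter(torch.zeros(1, 1, dim))
loss = masked_token_prediction_loss(torch.randn(2, 32, dim),
                                    torch.randint(0, vocab, (2, 32)),
                                    encoder, head, mask_token)
loss.backward()
```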