techminis (a naukri.com initiative)


Image Credit: Arxiv

TabFlex: Scaling Tabular Learning to Millions with Linear Attention

  • Recent work on tabular classification leverages the in-context learning ability of Large Language Models (LLMs); TabFlex is a new model in this line that improves efficiency and scalability for larger datasets.
  • TabFlex replaces quadratic attention with linear attention mechanisms, letting it scale to tabular datasets with thousands of features and hundreds of classes and process over a million samples in just 5 seconds.
  • In extensive evaluations across diverse datasets, TabFlex achieves over a 2x speedup compared to TabPFN and a 1.5x speedup over XGBoost, outperforming 25 tested baselines in efficiency.
  • Combined with data-efficient techniques such as dimensionality reduction and data sampling, TabFlex maintains strong performance on large-scale datasets at reduced computational cost.
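The scalability claim rests on a standard property of linear attention: with a positive feature map, attention can be computed as `phi(Q) @ (phi(K).T @ V)` instead of `(phi(Q) @ phi(K).T) @ V`, dropping the cost from quadratic to linear in the number of samples. Below is a minimal numpy sketch of that idea, assuming an elu+1 feature map as used in common linear-attention work; it is an illustration, not TabFlex's actual implementation.

```python
import numpy as np

def linear_attention(Q, K, V, eps=1e-6):
    """O(n) attention via the kernel trick; Q, K: (n, d), V: (n, d_v)."""
    def phi(x):
        # elu(x) + 1: a common positive feature map in linear attention
        return np.where(x > 0, x + 1.0, np.exp(x))

    Qp, Kp = phi(Q), phi(K)
    # Associativity: Qp @ (Kp.T @ V) avoids forming the (n, n) matrix,
    # so cost is O(n * d * d_v) instead of O(n^2 * d).
    KV = Kp.T @ V                       # (d, d_v) summary of keys/values
    Z = Qp @ Kp.sum(axis=0)             # (n,) per-row normalizer
    return (Qp @ KV) / (Z[:, None] + eps)

# Toy run: cost grows linearly with the number of rows n.
rng = np.random.default_rng(0)
n, d = 1000, 16
Q = rng.standard_normal((n, d))
K = rng.standard_normal((n, d))
V = rng.standard_normal((n, d))
out = linear_attention(Q, K, V)
print(out.shape)  # (1000, 16)
```

The same reordering is what lets linear-attention models keep a fixed-size `(d, d_v)` state while streaming through arbitrarily many rows, which is why the paper can report processing over a million samples in seconds.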

Read Full Article
