- Tabular in-context learning has advanced with models such as TabPFN and TabICL extending to larger datasets.
- Current table-native ICL architectures are trained on synthetic data and do not exploit the semantics present in real-world tables.
- Models like TabuLa-8B use large language models for semantic understanding but are limited in how much context they can process.
- ConTextTab combines semantic understanding with a table-native ICL framework for improved performance.
- The model employs specialized embeddings for different data modalities and is trained on real-world tabular data (see the sketch below).
- ConTextTab achieves results competitive with state-of-the-art models across various benchmarks.
- The model sets a new standard on the CARTE benchmark, which is known for its semantic richness.
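
To make the idea of modality-specific embeddings concrete, here is a minimal sketch of how table cells of different types (text, numeric, datetime) could each be mapped into a shared model dimension. This is an illustrative assumption, not ConTextTab's actual implementation; all class names, dimensions, and encoding choices below are hypothetical.

```python
# A minimal sketch (NOT ConTextTab's actual code) of modality-specific
# cell embeddings for tabular in-context learning. All names, dimensions,
# and encoding choices are illustrative assumptions.
import math
import torch
import torch.nn as nn

class CellEmbedder(nn.Module):
    """Embed a table cell according to its modality (text, numeric, datetime)."""

    def __init__(self, d_model: int = 256, vocab_size: int = 30522):
        super().__init__()
        # Text cells: token embeddings mean-pooled into one vector.
        # A real system might instead reuse a pretrained language-model encoder.
        self.text_embed = nn.Embedding(vocab_size, d_model)
        # Numeric cells: scalar value projected into the model dimension.
        self.num_proj = nn.Linear(1, d_model)
        # Datetime cells: cyclical (sin/cos) features for month, day, hour.
        self.date_proj = nn.Linear(6, d_model)

    def embed_text(self, token_ids: torch.Tensor) -> torch.Tensor:
        # Mean-pool over the token dimension to get one vector per cell.
        return self.text_embed(token_ids).mean(dim=-2)

    def embed_number(self, value: torch.Tensor) -> torch.Tensor:
        # Signed-log scaling keeps very large magnitudes in a usable range.
        scaled = torch.sign(value) * torch.log1p(value.abs())
        return self.num_proj(scaled.unsqueeze(-1))

    def embed_datetime(self, month, day, hour) -> torch.Tensor:
        # Encode each periodic component as a (sin, cos) pair.
        feats = []
        for val, period in ((month, 12.0), (day, 31.0), (hour, 24.0)):
            angle = 2 * math.pi * val / period
            feats += [torch.sin(angle), torch.cos(angle)]
        return self.date_proj(torch.stack(feats, dim=-1))


embedder = CellEmbedder()
text_vec = embedder.embed_text(torch.tensor([101, 2054, 102]))  # tokenized cell
num_vec = embedder.embed_number(torch.tensor(1234.5))
date_vec = embedder.embed_datetime(
    torch.tensor(7.0), torch.tensor(15.0), torch.tensor(9.0)
)
print(text_vec.shape, num_vec.shape, date_vec.shape)  # all torch.Size([256])
```

Routing each modality through its own encoder before a shared table-native transformer is one plausible way to let a model use cell semantics (names, dates, quantities) rather than treating every value as an opaque number.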