- Prior-data fitted networks (PFNs) are transformers pre-trained on synthetic data to enable Bayesian inference through in-context learning.
- CausalFM is introduced as a framework for training PFN-based foundation models in causal inference settings.
- It formalizes Bayesian priors for causal inference based on structural causal models (SCMs) and derives criteria that such priors must satisfy to be valid.
- CausalFM proposes a new family of prior distributions built on causality-inspired Bayesian neural networks.
- CausalFM supports Bayesian causal inference in back-door, front-door, and instrumental variable adjustment settings.
- Using this framework, a foundation model is explicitly trained to estimate conditional average treatment effects (CATEs) via back-door adjustment (see the sketch after this list).
- The resulting model achieves competitive CATE-estimation performance across various benchmarks.
- The framework can serve as a general recipe for training foundation models for causal inference across disciplines.
- CausalFM offers a new paradigm that could fundamentally change how causal inference is conducted in fields such as medicine and economics.
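To make the estimand concrete, here is a minimal sketch of back-door adjustment for CATE estimation on simulated data. This is a classical plug-in T-learner baseline, not the CausalFM model itself; the helper names (`simulate_data`, `t_learner_cate`) and the data-generating process are illustrative assumptions, not from the paper.

```python
# Sketch: CATE estimation via back-door adjustment (condition on observed
# confounders X), using a simple T-learner as a stand-in estimator.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

def simulate_data(n=2000, d=5):
    """Synthetic observational data where treatment A is confounded by X."""
    X = rng.normal(size=(n, d))
    propensity = 1.0 / (1.0 + np.exp(-X[:, 0]))   # treatment depends on X
    A = rng.binomial(1, propensity)                # binary treatment
    tau = 1.0 + 0.5 * X[:, 1]                      # true CATE (heterogeneous)
    Y = X[:, 0] + tau * A + rng.normal(scale=0.5, size=n)
    return X, A, Y, tau

def t_learner_cate(X, A, Y):
    """Back-door adjustment via a T-learner: fit E[Y | X, A=a] separately for
    a in {0, 1}, then take the difference of the two regression predictions."""
    mu0 = GradientBoostingRegressor().fit(X[A == 0], Y[A == 0])
    mu1 = GradientBoostingRegressor().fit(X[A == 1], Y[A == 1])
    return mu1.predict(X) - mu0.predict(X)

X, A, Y, tau = simulate_data()
cate_hat = t_learner_cate(X, A, Y)
print("RMSE of plug-in CATE estimate:", np.sqrt(np.mean((cate_hat - tau) ** 2)))
```

A PFN-based foundation model such as CausalFM targets the same quantity but replaces the per-dataset regression fits with in-context inference by a transformer pre-trained on datasets sampled from an SCM-based prior.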