Meta’s ad recommendation engine, powered by deep learning recommendation models (DLRMs), has been instrumental in delivering personalized ads to people.
Key to its success are the thousands of human-engineered signals, or features, that feed the DLRM-based recommendation system. Two foundational transformations have addressed the limitations of traditional DLRMs: event-based learning and learning from sequences.
Meta's new ads recommendation system has sequence learning at its core. Adopting it required a complete redesign of the system across data storage, feature input formats, and model architecture.
Event-based features (EBFs) are the building blocks of the new sequence learning models. An upgrade to traditional features, EBFs standardize heterogeneous inputs to the sequence learning models.
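As a rough illustration, an EBF can be thought of as a typed record bundling an event's type, timestamp, and attribute IDs, so that very different signals all arrive at the model in one shape. The field names and values below are hypothetical and only meant to convey that standardized form, not Meta's actual schema.

```python
from dataclasses import dataclass
from typing import Dict

# Hypothetical sketch of an event-based feature (EBF): one standardized record
# per event, regardless of which surface or signal produced it.
@dataclass
class EventBasedFeature:
    event_type: int              # e.g., an ID for "ad click", "page view", ...
    timestamp: int               # Unix time of the event
    attributes: Dict[str, int]   # attribute name -> categorical ID (e.g., ad category)

# A user's input to the sequence model is simply a time-ordered list of EBFs.
user_sequence = [
    EventBasedFeature(event_type=3, timestamp=1717000000, attributes={"ad_category": 17}),
    EventBasedFeature(event_type=1, timestamp=1717003600, attributes={"ad_category": 42}),
]
```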
An event model synthesizes event embeddings from event attributes: it learns an embedding for each attribute and uses linear compression to summarize them into a single attribute-based event embedding.
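A minimal PyTorch sketch of that idea is shown below: each categorical attribute gets its own embedding table, the attribute embeddings are concatenated, and a linear layer compresses them into one event embedding. The dimensions, attribute names, and module structure are illustrative assumptions rather than Meta's actual model.

```python
import torch
import torch.nn as nn

class EventModel(nn.Module):
    """Synthesizes a single event embedding from an event's attributes (sketch)."""

    def __init__(self, attribute_cardinalities: dict, attr_dim: int = 32, event_dim: int = 64):
        super().__init__()
        # One learned embedding table per attribute (e.g., event type, ad category).
        self.attr_embeddings = nn.ModuleDict({
            name: nn.Embedding(cardinality, attr_dim)
            for name, cardinality in attribute_cardinalities.items()
        })
        # Linear compression: concatenated attribute embeddings -> one event embedding.
        self.compress = nn.Linear(attr_dim * len(attribute_cardinalities), event_dim)

    def forward(self, attr_ids: dict) -> torch.Tensor:
        # attr_ids maps attribute name -> LongTensor of shape [batch, seq_len].
        parts = [emb(attr_ids[name]) for name, emb in self.attr_embeddings.items()]
        return self.compress(torch.cat(parts, dim=-1))  # [batch, seq_len, event_dim]
```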
Following the redesign that shifted from sparse feature learning to event-based sequence learning, the next focus was scaling across two domains: scaling the sequence learning architecture, and scaling event sequences to be longer and richer.
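To ground what "scaling the sequence learning architecture" can look like, the sketch below runs a generic transformer encoder over a sequence of event embeddings (such as those produced by the event model above) and pools it into a single user representation. The layer counts, dimensions, and pooling choice are illustrative assumptions, not the production architecture.

```python
import torch
import torch.nn as nn

class EventSequenceEncoder(nn.Module):
    """Illustrative transformer over event embeddings, pooled into a user representation."""

    def __init__(self, event_dim: int = 64, num_heads: int = 4, num_layers: int = 2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=event_dim, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, event_embeddings: torch.Tensor) -> torch.Tensor:
        # event_embeddings: [batch, seq_len, event_dim], ordered by event timestamp.
        encoded = self.encoder(event_embeddings)
        return encoded.mean(dim=1)  # simple pooling into one user embedding

# Scaling here means longer sequences (larger seq_len) and larger, deeper encoders.
```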
Longer sequences and richer event attributes further enhance the next-generation recommendation system's ability to learn directly from event sequences and better understand people's preferences.
Sequence learning has been widely adopted across Meta's ads systems, delivering gains in ad relevance and performance, more efficient infrastructure, and accelerated research velocity.
Going forward, the focus will be on scaling event sequences by a further 100x, developing more efficient sequence modeling architectures such as linear attention and state space models, optimizing the key-value (KV) cache, and enriching event sequences with multimodal content.
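For context on why linear attention matters at this scale, the toy comparison below contrasts standard softmax attention, whose cost grows quadratically with sequence length, against a kernelized linear-attention variant that computes phi(Q)(phi(K)^T V) in linear time. This is a generic textbook formulation of linear attention, not Meta's implementation.

```python
import torch
import torch.nn.functional as F

def softmax_attention(q, k, v):
    # Standard attention: materializes an [n, n] score matrix, O(n^2) in sequence length n.
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
    return F.softmax(scores, dim=-1) @ v

def linear_attention(q, k, v, eps=1e-6):
    # Kernelized attention with phi(x) = elu(x) + 1: summarizes keys/values into a
    # [d, d] state, so cost is O(n) in sequence length.
    q, k = F.elu(q) + 1, F.elu(k) + 1
    kv = k.transpose(-2, -1) @ v                                   # [d, d] summary
    normalizer = q @ k.sum(dim=-2, keepdim=True).transpose(-2, -1) + eps
    return (q @ kv) / normalizer
```

Both functions return an output of the same shape, but the linear variant never builds the n-by-n attention matrix, which is what makes very long event sequences tractable.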