This talk explores how foundation models, originally developed for unstructured data such as text and images, are now enabling in-context learning on structured relational data. We will examine how recent developments allow these models to generalize across diverse tabular prediction tasks without retraining by leveraging schema-aware representations and attention mechanisms over multi-table structures. The session will highlight emerging research directions at the intersection of deep learning, graph-based transformer architectures, and multi-modal relational datasets. Throughout the presentation, we will see how these innovations, namely predictive models that operate directly on the raw database, allow practitioners to reduce the time to prediction from months to seconds.
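To make the idea of in-context learning on tabular data concrete, here is a minimal, self-contained sketch of the kind of interface such a model might expose: a transformer attends jointly over labeled context rows and unlabeled query rows, so new prediction tasks need no gradient updates at inference time. All class, variable, and hyperparameter names below are hypothetical illustrations, not the speaker's actual model or any specific library API.

```python
import torch
import torch.nn as nn

class InContextTabularModel(nn.Module):
    """Toy transformer that predicts labels for query rows by attending
    over labeled context rows -- no retraining at prediction time."""
    def __init__(self, num_features: int, d_model: int = 64, num_classes: int = 2):
        super().__init__()
        self.feat_proj = nn.Linear(num_features, d_model)
        # One extra label id marks query rows whose label is unknown.
        self.label_embed = nn.Embedding(num_classes + 1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, num_classes)
        self.unknown_id = num_classes

    def forward(self, x_ctx, y_ctx, x_qry):
        # Context rows carry their labels; query rows carry the "unknown" token.
        ctx = self.feat_proj(x_ctx) + self.label_embed(y_ctx)
        qry = self.feat_proj(x_qry) + self.label_embed(
            torch.full(x_qry.shape[:-1], self.unknown_id, dtype=torch.long))
        tokens = torch.cat([ctx, qry], dim=1)      # (batch, n_ctx + n_qry, d_model)
        out = self.encoder(tokens)
        return self.head(out[:, x_ctx.size(1):])   # logits for the query rows only

# Usage: 32 labeled context rows, 8 query rows, 10 numeric features.
model = InContextTabularModel(num_features=10)
x_ctx, y_ctx = torch.randn(1, 32, 10), torch.randint(0, 2, (1, 32))
x_qry = torch.randn(1, 8, 10)
logits = model(x_ctx, y_ctx, x_qry)                # shape: (1, 8, 2)
```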
Speaker: Matthias Fey (2 talks)
Matthias Fey is the creator of PyTorch Geometric (PyG), a leading library for representation learning on graphs. At Kumo.ai, he heads the Research team, driving innovation in foundation models, graph-based architectures, and scalable ML systems. Prior to this, he completed his PhD on Message Passing for Learning over Graph-Structured Data at TU Dortmund University. His work bridges early-stage research with real-world outcomes across a wide range of industry applications.
Talks & appearances
Learn how to build and analyze heterogeneous graphs using PyG, a graph machine learning library in Python. This workshop will provide a practical introduction to the concept of heterogeneous graphs and their applications, including their ability to capture the complexity and diversity of real-world systems. Participants will gain experience in creating a heterogeneous graph from multiple data tables, preparing a dataset, and implementing and training a model using PyG.
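As a small taste of the workshop material, the sketch below builds a two-table heterogeneous graph with PyG's HeteroData and converts a simple GNN to the heterogeneous setting with to_hetero. The table names, relation name, feature sizes, and random data are made up for illustration and are not the workshop's actual dataset.

```python
import torch
import torch_geometric.transforms as T
from torch_geometric.data import HeteroData
from torch_geometric.nn import SAGEConv, to_hetero

# Two "tables" become two node types; a foreign-key relation becomes an edge type.
data = HeteroData()
data['customer'].x = torch.randn(100, 16)            # 100 customers, 16 features each
data['product'].x = torch.randn(50, 32)              # 50 products, 32 features each
data['customer', 'buys', 'product'].edge_index = torch.stack([
    torch.randint(0, 100, (500,)),                   # source: customer indices
    torch.randint(0, 50, (500,)),                    # target: product indices
])
# Add reverse edges so messages also flow from products back to customers.
data = T.ToUndirected()(data)

class GNN(torch.nn.Module):
    def __init__(self, hidden_channels: int, out_channels: int):
        super().__init__()
        # Lazy (-1, -1) input sizes let each node type keep its own feature width.
        self.conv1 = SAGEConv((-1, -1), hidden_channels)
        self.conv2 = SAGEConv((-1, -1), out_channels)

    def forward(self, x, edge_index):
        x = self.conv1(x, edge_index).relu()
        return self.conv2(x, edge_index)

# Convert the homogeneous GNN into one message-passing module per edge type.
model = to_hetero(GNN(hidden_channels=64, out_channels=2), data.metadata(), aggr='sum')
out = model(data.x_dict, data.edge_index_dict)       # dict of per-node-type embeddings
```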