JAX is an ML framework that has been widely adopted by foundation model builders because of advantages such as high performance, scalability, composability, and ease of programming. In this session, we will showcase the entire ecosystem supported by JAX for end-to-end foundation model building, from data loading to training and inference, on both TPUs and GPUs. We'll highlight the entire JAX stack, including high-performance implementations for large language models and diffusion models in MaxText and MaxDiffusion. Learn how customers such as Assembly AI, Cohere, Anthropic, MidJourney, and Stability AI, and partners like Nvidia, have adopted JAX for building foundation models on Google Cloud and beyond.
Speaker: Nitin Nitin (Manager, DL Training Performance, NVIDIA)
with William Pispico Ferreira (Assembly AI), Rajesh Anantharaman (Google Cloud), Dwarak Talupuru (Cohere), Domenic Donato (Assembly AI), and Nitin Nitin (NVIDIA)