talk-data.com

Sharon Zhou

Speaker · 2 talks

CEO & Co-founder, Lamini

Dr. Sharon Zhou is the co-founder and CEO of Lamini, which won this year’s VentureBeat Gen AI Startup Award and is a Forbes Cloud 100 Rising Star. A former Stanford faculty member, she led a generative AI research group of more than 50 people, published award-winning work in generative AI, and teaches Coursera courses on AI, including Fine-tuning LLMs. She earned a PhD in AI from Stanford under Andrew Ng and previously worked as an AI product manager at Google. She holds a bachelor’s degree in computer science and Classics from Harvard, has served as an AI advisor in Washington, D.C., and has been recognized on MIT Technology Review’s 35 Under 35 list.

Bio from: Data + AI Summit 2025

Talks & appearances

2 activities

Composing High-Accuracy AI Systems With SLMs and Mini-Agents

For most companies, building compound AI systems remains aspirational. LLMs are powerful but imperfect, and their non-deterministic nature makes steering them to high accuracy a challenge. In this session, we’ll demonstrate how to build compound AI systems from SLMs and highly accurate mini-agents that can be integrated into agentic workflows. You’ll learn about two breakthrough techniques: memory RAG, an embedding algorithm that reduces hallucinations by using embed-time compute to generate contextual embeddings that improve indexing and retrieval, and memory tuning, a fine-tuning algorithm that reduces hallucinations by using a Mixture of Memory Experts (MoME) to specialize models with proprietary data. We’ll also share real-world examples (text-to-SQL, factual reasoning, function calling, code analysis, and more) across various industries. With these building blocks, we’ll show how to create high-accuracy mini-agents that can be composed into larger AI systems.
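As a rough illustration of the composition pattern the abstract describes, here is a minimal Python sketch. It is not Lamini’s API: `call_slm` and `retrieve` are hypothetical placeholders standing in for an SLM endpoint and a retriever over contextual embeddings built at embed time, and each mini-agent is a narrowly scoped function that can be evaluated and tuned on its own before being routed into a larger workflow.

```python
from typing import Callable, Dict


def call_slm(prompt: str) -> str:
    """Placeholder for a call to a small language model (SLM) endpoint."""
    return f"<model output for: {prompt[:40]}...>"


def retrieve(query: str) -> str:
    """Placeholder retriever over an index of contextual embeddings built at embed time."""
    return "<top-k passages relevant to the query>"


def text_to_sql_agent(question: str) -> str:
    # Narrow, verifiable task: translate a natural-language question into SQL only.
    return call_slm(f"Translate to SQL.\nQuestion: {question}\nSQL:")


def factual_qa_agent(question: str) -> str:
    # Narrow task: answer strictly from retrieved context to limit hallucination.
    context = retrieve(question)
    return call_slm(f"Context:\n{context}\nQuestion: {question}\nAnswer from the context only:")


# The compound system is explicit composition: a router dispatches each task
# to the mini-agent that has been tuned and evaluated for it.
ROUTES: Dict[str, Callable[[str], str]] = {
    "sql": text_to_sql_agent,
    "factual_qa": factual_qa_agent,
}


def run(task: str, payload: str) -> str:
    return ROUTES[task](payload)


if __name__ == "__main__":
    print(run("sql", "How many orders shipped last week?"))
```

Keeping each agent’s task narrow is what makes its accuracy measurable, which in turn makes the behavior of the composed system easier to steer.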

Your LLM, Your Data, Your Infrastructure

Lamini, the most powerful LLM engine, is the platform for any software engineer to ship an LLM into production as rapidly and easily as possible. In this session, learn how to train your LLM on your own data and infrastructure with a few lines of code using the Lamini library, and get early access to a playground for training any open-source LLM. With Lamini, your own LLM comes with better performance, better data privacy, lower cost, lower latency, and more.
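As a sketch of the “few lines of code” workflow described above, the snippet below uses hypothetical placeholder names (`TunableLLM`, `finetune`, `generate`), not the actual Lamini library API: pick an open-source base model, hand it your own (prompt, response) pairs, and run tuning on infrastructure you control.

```python
# Hypothetical sketch only -- TunableLLM, finetune, and generate are
# illustrative placeholders, not the Lamini API.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class TunableLLM:
    model_name: str  # any open-source base model identifier

    def finetune(self, pairs: List[Tuple[str, str]]) -> None:
        # In a real engine this would launch a training job on your own
        # infrastructure; here it only validates the data shape.
        assert all(len(p) == 2 for p in pairs), "expect (prompt, response) pairs"
        print(f"Tuning {self.model_name} on {len(pairs)} private examples")

    def generate(self, prompt: str) -> str:
        return f"<output from tuned {self.model_name}>"


llm = TunableLLM(model_name="an-open-source-llm")
llm.finetune([("What is our refund policy?", "Refunds are issued within 30 days.")])
print(llm.generate("What is our refund policy?"))
```

The point of the shape, as the session describes it, is that the training data never has to leave your own environment.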

Talk by: Sharon Zhou

Here’s more to explore:
LLM Compact Guide: https://dbricks.co/43WuQyb
Big Book of MLOps: https://dbricks.co/3r0Pqiz
