talk-data.com
Speaker
Harrison Chase
12 talks
CEO and Co-founder of LangChain
Bio from: Databricks DATA + AI Summit 2023
Talks & appearances
12 activities
Keynote speaker from LangChain
Cloud Run is an ideal platform for hosting AI applications – for example, you can use Cloud Run with AI frameworks like LangChain or Firebase Genkit to orchestrate calls to AI models on Vertex AI, vector databases, and other APIs. In this session, we’ll dive deep into building AI agents on Cloud Run to solve complex tasks and explore several techniques, including tool calling, multi-agent systems, memory state management, and code execution. We’ll showcase interactive examples using popular frameworks.
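To make the tool-calling portion of this concrete, here is a minimal sketch of a LangChain/LangGraph agent backed by a Vertex AI model that could be containerized and served over HTTP on Cloud Run. It covers only the tool-calling technique (not multi-agent systems, memory, or code execution), and the tool, its data, and the model name are illustrative assumptions rather than anything from the session itself.

```python
# Minimal tool-calling agent sketch; the get_order_status tool and its data
# are hypothetical placeholders, and the model name is an assumption.
from langchain_core.tools import tool
from langchain_google_vertexai import ChatVertexAI
from langgraph.prebuilt import create_react_agent

@tool
def get_order_status(order_id: str) -> str:
    """Look up the status of an order (stubbed for illustration)."""
    return f"Order {order_id} is in transit."

# Gemini model served from Vertex AI; the agent decides when to call the tool.
llm = ChatVertexAI(model_name="gemini-1.5-pro")
agent = create_react_agent(llm, [get_order_status])

result = agent.invoke({"messages": [("user", "Where is order 42?")]})
print(result["messages"][-1].content)
```

Packaged behind a small HTTP server in a container image, an agent like this deploys to Cloud Run like any other stateless service.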
Generative AI agents have emerged as the leading architecture for implementing complex application functionality. Tools are the way that agents access the data and systems they need. But building and deploying tools at scale brings new challenges. Learn how MCP Toolbox for Databases, an open source server for gen AI tool management, enables platforms like LangGraph and Vertex AI to easily connect to enterprise databases.
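As a rough sketch of the pattern described here, the snippet below loads database tools from a running MCP Toolbox server into a LangGraph agent. The client usage follows the toolbox-langchain package, but treat the exact method names, the toolset name "my-toolset", and the server URL as assumptions to verify against the MCP Toolbox for Databases documentation.

```python
# Rough sketch: MCP Toolbox-managed database tools wired into a LangGraph agent.
# ToolboxClient usage and the toolset name are assumptions based on the
# toolbox-langchain package; verify against the MCP Toolbox docs.
from toolbox_langchain import ToolboxClient
from langchain_google_vertexai import ChatVertexAI
from langgraph.prebuilt import create_react_agent

# The Toolbox server runs alongside the app and holds the SQL tool definitions.
client = ToolboxClient("http://127.0.0.1:5000")
tools = client.load_toolset("my-toolset")

agent = create_react_agent(ChatVertexAI(model_name="gemini-1.5-pro"), tools)
result = agent.invoke({"messages": [("user", "Which hotels have availability this weekend?")]})
print(result["messages"][-1].content)
```

Keeping tool definitions in the Toolbox server rather than in application code is what lets the same database tools be shared across frameworks such as LangGraph and Vertex AI.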
Learn how trailblazing startups use AI agents to transform operations and drive growth. Discover actionable strategies to streamline processes, boost productivity, and unlock groundbreaking solutions with AI. And gain invaluable insights directly from industry leaders.
AI is revolutionizing the startup landscape, unlocking new possibilities for innovation, efficiency, and growth. Join industry leaders as they share bold predictions, critical challenges, and practical strategies to help you harness AI for your business in 2025 and beyond.
LangChain is the most popular open-source framework for building LLM-based apps, and Google Cloud is the easiest place to deploy LangChain apps to production. In this session, technical practitioners will learn how to combine LangChain on Cloud Run with Cloud SQL's pgvector extension for vector storage and Vertex AI endpoints to create generative AI applications.
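A minimal sketch of that combination follows: Vertex AI embeddings feeding a pgvector store, with the retrieved context passed to a Vertex AI chat model. It uses the generic langchain-postgres PGVector class pointed at a Cloud SQL connection string; the connection details, collection name, model names, and document content are hypothetical placeholders.

```python
# RAG-style sketch: Vertex AI embeddings + a pgvector store on Cloud SQL for PostgreSQL.
# Connection string, collection name, and content are illustrative assumptions.
from langchain_core.documents import Document
from langchain_google_vertexai import ChatVertexAI, VertexAIEmbeddings
from langchain_postgres import PGVector

embeddings = VertexAIEmbeddings(model_name="text-embedding-004")
store = PGVector(
    embeddings=embeddings,
    collection_name="product_docs",
    connection="postgresql+psycopg://user:pass@<cloud-sql-host>:5432/appdb",
)
store.add_documents([Document(page_content="Cloud Run autoscales containers, down to zero when idle.")])

# Retrieve the most relevant chunk and hand it to a Vertex AI model as context.
docs = store.similarity_search("How does Cloud Run scale?", k=2)
llm = ChatVertexAI(model_name="gemini-1.5-pro")
answer = llm.invoke(
    f"Answer using this context:\n{docs[0].page_content}\n\nQuestion: How does Cloud Run scale?"
)
print(answer.content)
```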
Dive into the world of Google Cloud databases and discover how to transform the way you build and deploy AI-powered applications. This spotlight session reveals cutting-edge innovations that will help you modernize your database estate to easily build enterprise generative AI apps, unify your analytical and transactional workloads, and simplify database management with assistive AI. Join us to hear our vision for the future of Google Cloud databases and see how we're pushing the boundaries alongside the AI ecosystem. Expect exciting product announcements, insightful demos, and real-world customer success stories highlighting the transformative value of Google's Data Cloud.
The advent of generative AI has ushered in an unprecedented era of innovation, marked by the transformative potential of large language models (LLMs). The immense capabilities of LLMs open up vast possibilities for revolutionizing business operations and customer interactions. However, integrating them into production environments presents unique orchestration challenges. Successful orchestration of LLMs for retrieval-augmented generation (RAG) depends on addressing statelessness and providing access to the most relevant, up-to-date information. This session will dive into how to leverage LangChain and Google Cloud databases to build context-aware applications that harness the power of LLMs. Please note: seating is limited and on a first-come, first-served basis; standing areas are available.
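To illustrate the statelessness point, the sketch below attaches per-session chat history to a Vertex AI model so earlier turns survive across calls. An in-memory history stands in for the database-backed store (for example, one kept in a Google Cloud database) that a production deployment would use; the session id, model name, and prompts are illustrative.

```python
# Sketch of working around LLM statelessness with per-session message history.
# InMemoryChatMessageHistory is a stand-in for a persistent, database-backed store.
from langchain_core.chat_history import InMemoryChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_google_vertexai import ChatVertexAI

histories: dict[str, InMemoryChatMessageHistory] = {}

def get_history(session_id: str) -> InMemoryChatMessageHistory:
    # In production this would read and write a persistent store instead of a dict.
    return histories.setdefault(session_id, InMemoryChatMessageHistory())

chat = RunnableWithMessageHistory(ChatVertexAI(model_name="gemini-1.5-pro"), get_history)

cfg = {"configurable": {"session_id": "user-123"}}
chat.invoke("My name is Harrison.", config=cfg)
print(chat.invoke("What is my name?", config=cfg).content)  # history supplies the answer
```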
Harrison Chase is the CEO and Co-founder of LangChain, a company formed around the popular open-source Python/TypeScript packages. After studying statistics and computer science at Harvard, Harrison went on to lead the machine learning team at Robust Intelligence (an MLOps company) and the entity linking team at Kensho (a fintech startup).
In this fireside chat, he will discuss how LangChain is making it easier to use large language models (LLMs) to develop context-aware reasoning applications. Leveraging the Google ecosystem, the LangChain team is testing, evaluating, and observing common patterns for building more complex state machines and agents.
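For readers unfamiliar with the state-machine framing, here is a minimal LangGraph-style sketch: a typed state passed through two nodes wired with explicit edges. The state fields, node logic, and routing are hypothetical and purely illustrative of the pattern.

```python
# Minimal LangGraph state machine: two nodes over a shared typed state.
# Field names and node behavior are illustrative assumptions.
from typing import TypedDict
from langgraph.graph import StateGraph, START, END

class State(TypedDict):
    question: str
    answer: str

def draft(state: State) -> dict:
    # Produce an initial answer from the question.
    return {"answer": f"Draft answer to: {state['question']}"}

def review(state: State) -> dict:
    # Refine the draft before returning it.
    return {"answer": state["answer"] + " (reviewed)"}

graph = StateGraph(State)
graph.add_node("draft", draft)
graph.add_node("review", review)
graph.add_edge(START, "draft")
graph.add_edge("draft", "review")
graph.add_edge("review", END)
app = graph.compile()

print(app.invoke({"question": "What is LangGraph?"})["answer"])
```

Replacing the fixed draft-to-review edge with a conditional edge is how such graphs grow into the more complex agent loops mentioned above.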
Hear from three guests: Harrison Chase (CEO, LangChain) and Nicolas Palaez (Sr. Technical Marketing Manager, Databricks) discuss LLMs and generative AI, while the third guest, Drew Banin (co-founder, dbt Labs), covers the analytics engineering workflow at dbt Labs, how he started the company, and how it delivers value through the Databricks partnership. Hosted by Ari Kaplan (Head of Evangelism, Databricks) and Pearl Ubaru (Sr. Technical Marketing Engineer, Databricks).
Chapters: 0:00 Open · 6:08 Ali Ghodsi & Marc Andreessen · 32:06 Reynold Xin · 48:09 Michael Armbrust · 1:00:00 Matei Zaharia & Panel · 1:27:10 Hannes Muhleisen · 1:37:43 Harrison Chase · 1:49:15 Lin Qiao · 2:05:03 Jitendra Malik · 2:21:15 Arsalan & Eric Schmidt