talk-data.com


Speaker

Harrison Chase

4 talks

CEO and Co-founder of LangChain

Bio from: Databricks DATA + AI Summit 2023


Filtering by: Google Cloud Next '24


Talks & appearances

Showing 4 of 12 activities


LangChain is the most popular open-source framework for building LLM-based apps, and Google Cloud is the easiest place to deploy LangChain apps to production. In this session, technical practitioners will learn how to combine LangChain on Cloud Run with Cloud SQL's pgvector extension for vector storage and Vertex AI endpoints to create generative AI applications.
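As a rough sketch of that stack (not code from the session), the snippet below wires LangChain to Vertex AI models and a Cloud SQL for PostgreSQL instance with pgvector; the connection string, model names, and collection name are placeholder assumptions.

```python
# Minimal sketch: LangChain + Vertex AI models + pgvector on Cloud SQL.
# Connection string, model names, and collection name are placeholders.
from langchain_google_vertexai import ChatVertexAI, VertexAIEmbeddings
from langchain_postgres import PGVector

embeddings = VertexAIEmbeddings(model_name="text-embedding-004")

# Cloud SQL for PostgreSQL with the pgvector extension enabled; from Cloud Run
# you would typically connect through the Cloud SQL connector instead.
vectorstore = PGVector(
    embeddings=embeddings,
    collection_name="product_docs",
    connection="postgresql+psycopg://user:password@127.0.0.1:5432/appdb",
)

vectorstore.add_texts(["LangChain integrates with Google Cloud databases."])

llm = ChatVertexAI(model_name="gemini-1.5-pro")
docs = vectorstore.similarity_search("How does LangChain talk to Cloud SQL?", k=2)
context = "\n".join(d.page_content for d in docs)
print(llm.invoke("Answer using this context:\n" + context).content)
```

Wrapped in a small HTTP handler, a service along these lines is what would typically be containerized and deployed to Cloud Run.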


session
with Hari Ramamurthy (The Home Depot), Sandy Ghai (Google Cloud), Andi Gutmans (Google), Gabe Weiss (Google Cloud), Harrison Chase (LangChain), Anita Kibunguchy-Grant (Google Cloud)

Dive into the world of Google Cloud databases and discover how to transform the way you build and deploy AI-powered applications. This spotlight session reveals cutting-edge innovations that will help you modernize your database estate to easily build enterprise generative AI apps, unify your analytical and transactional workloads, and simplify database management with assistive AI. Join us to hear our vision for the future of Google Cloud databases and see how we're pushing the boundaries alongside the AI ecosystem. Expect exciting product announcements, insightful demos, and real-world customer success stories highlighting the transformative value of Google's Data Cloud.


The advent of generative AI has ushered in an unprecedented era of innovation, marked by the transformative potential of large language models (LLMs). The immense capabilities of LLMs open up vast possibilities for revolutionizing business operations and customer interactions. However, integrating them into production environments presents unique orchestration challenges. Successful orchestration of LLMs for Retrieval Augmented Generation (RAG) depends on addressing their statelessness and providing access to the most relevant, up-to-date information. This session will dive into how to leverage LangChain and Google Cloud databases to build context-aware applications that harness the power of LLMs. Please note: seating is limited and on a first-come, first-served basis; standing areas are available.
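To make the statelessness point concrete, here is a hedged sketch (not the session's code) of a RAG chain that pulls context from a vector store and keeps conversation state outside the model; the `vectorstore` is assumed to be the pgvector store from the earlier sketch, and the in-memory session store is a stand-in for a real database table.

```python
# Sketch: RAG chain where conversation state lives outside the stateless LLM.
from langchain_core.chat_history import InMemoryChatMessageHistory
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables import RunnablePassthrough
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_google_vertexai import ChatVertexAI

retriever = vectorstore.as_retriever(search_kwargs={"k": 4})
llm = ChatVertexAI(model_name="gemini-1.5-pro")

prompt = ChatPromptTemplate.from_messages([
    ("system", "Answer using only this context:\n{context}"),
    MessagesPlaceholder("history"),
    ("human", "{question}"),
])

# Fetch fresh context for every turn, then generate.
rag_chain = (
    RunnablePassthrough.assign(
        context=lambda x: "\n\n".join(
            d.page_content for d in retriever.invoke(x["question"])
        )
    )
    | prompt
    | llm
    | StrOutputParser()
)

# Per-session histories; in production these would live in a database, not memory.
histories = {}

def get_history(session_id: str) -> InMemoryChatMessageHistory:
    return histories.setdefault(session_id, InMemoryChatMessageHistory())

chat = RunnableWithMessageHistory(
    rag_chain,
    get_history,
    input_messages_key="question",
    history_messages_key="history",
)

print(chat.invoke({"question": "What is RAG?"},
                  config={"configurable": {"session_id": "demo"}}))
```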


Harrison Chase is the CEO and Co-founder of LangChain, a company formed around the popular open-source Python/TypeScript packages of the same name. After studying statistics and computer science at Harvard, Harrison went on to lead the machine learning team at Robust Intelligence (an MLOps company) and the entity-linking team at Kensho (a fintech startup) before founding LangChain.

In this fireside chat, he will discuss how LangChain is making it easier to use large language models (LLMs) to develop context-aware reasoning applications. Leveraging the Google ecosystem, the LangChain team is testing, evaluating, and observing common patterns for building more complex state machines and agents.
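For readers unfamiliar with the "state machine" framing, here is a minimal hedged sketch using LangGraph (LangChain's library for building agents as graphs); the node functions are stubs standing in for real model calls.

```python
# Sketch: an agent modeled as an explicit state machine with LangGraph.
from typing import TypedDict
from langgraph.graph import StateGraph, END

class AgentState(TypedDict):
    question: str
    draft: str
    approved: bool

def generate(state: AgentState) -> dict:
    # A real node would call an LLM here; stubbed for the sketch.
    return {"draft": f"Draft answer to: {state['question']}"}

def review(state: AgentState) -> dict:
    # A second model call (or heuristic) would grade the draft.
    return {"approved": len(state["draft"]) > 0}

def route(state: AgentState) -> str:
    return "done" if state["approved"] else "retry"

graph = StateGraph(AgentState)
graph.add_node("generate", generate)
graph.add_node("review", review)
graph.set_entry_point("generate")
graph.add_edge("generate", "review")
graph.add_conditional_edges("review", route, {"done": END, "retry": "generate"})
app = graph.compile()

print(app.invoke({"question": "What is RAG?", "draft": "", "approved": False}))
```

Making the loop explicit as a graph is what makes each step straightforward to test, evaluate, and observe.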
