talk-data.com

Topic: Large Language Models (LLM)
Tags: nlp, ai, machine_learning
165 tagged activities

Activity Trend: 158 peak/qtr (2020-Q1 to 2026-Q1)

Activities

Showing filtered results
Filtering by: Google Cloud Next '24

Retrieval Augmented Generation (RAG) is a powerful technique for providing real-time, domain-specific context to an LLM to improve the accuracy of its responses. RAG doesn't require adding sensitive data to the model, but it still requires application developers to address the security and privacy of user and company data. In this session, you will learn about the security implications of RAG workloads and how to architect your applications to handle user identity and control data access.
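The identity and data-access concern can be sketched in a few lines: a toy retriever that filters documents by the requesting user's groups before any text reaches the model. Everything here (the corpus shape, the `acl` field, the keyword-overlap scoring) is a hypothetical illustration, not Google Cloud API code.

```python
# A minimal sketch of identity-aware RAG, assuming a toy in-memory corpus with
# per-document access-control lists and naive keyword-overlap retrieval.
# The corpus shape, the "acl" field, and the scoring are all hypothetical.

def retrieve(query, corpus, user_groups, top_k=1):
    """Return the best-matching docs the user is actually allowed to read."""
    q_terms = set(query.lower().split())
    scored = []
    for doc in corpus:
        if not (set(doc["acl"]) & user_groups):
            continue  # enforce access control *before* any text reaches the LLM
        overlap = len(q_terms & set(doc["text"].lower().split()))
        scored.append((overlap, doc["text"]))
    scored.sort(reverse=True)
    return [text for _, text in scored[:top_k]]

def build_prompt(query, corpus, user_groups):
    """Ground the model with only the context this user may see."""
    context = "\n".join(retrieve(query, corpus, user_groups))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    {"text": "Q3 revenue grew 12 percent", "acl": ["finance"]},
    {"text": "The cafeteria menu changes on Monday", "acl": ["everyone"]},
]

print(build_prompt("What was revenue growth?", corpus, {"finance"}))
```

The key design point is that access control runs in the retriever, so a user's prompt can never be grounded with documents outside their entitlements.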

Click the blue “Learn more” button above to tap into special offers designed to help you implement what you are learning at Google Cloud Next 25.

Ready to deploy lightning-fast AI apps to the web? Angular v18 brings a set of forward-thinking features to the web, setting new standards for performance and developer experience. Learn how Firebase Hosting with Angular allows you to efficiently build and deploy with Google’s Gemini generative AI.

We are bringing Google’s research and innovations in artificial intelligence (AI) directly to your data in BigQuery. Join this session to learn about BigQuery’s built-in ML capabilities, such as model inference, and how to use Gemini, Google's most capable and flexible AI model yet, directly within BigQuery to simplify advanced use cases such as sentiment analysis, entity extraction, and more.

This in-depth technical session delves into real-world use cases and design patterns for building generative AI applications using Google Cloud Databases, Vertex AI, and popular open-source orchestration frameworks such as LangChain. We’ll showcase a sample application that leverages the versatility of Google Cloud Databases to implement dynamic grounding using various RAG techniques, giving you valuable insights into implementing enterprise-grade gen AI solutions.

Application development needs are growing to keep pace with evolving businesses. Developers juggle multiple languages, front-end frameworks, platform technologies, and more. In this workshop, learn how Gemini is reimagining the developer experience to help you stay in flow, context-switch less, and get code to production faster. If you're not talking about AI assistance, your competitor is. Gemini is one of the biggest door openers of 2024. Don't forget to bring your laptop! We recommend having Visual Studio Code and Flutter installed before the workshop.

Building an assistant capable of answering complex, company-specific questions and executing workflows requires first building a powerful Retrieval Augmented Generation (RAG) system. Founding engineer Eddie Zhou explains how Glean built its RAG system on Google Cloud, combining a domain-adapted search engine with dynamic prompts to harness the full capabilities of Gemini's reasoning engine. By attending this session, your contact information may be shared with the sponsor for relevant follow-up for this event only.

Gain insight into building domain-specific GenAI applications for the enterprise through the lens of the legal use case. Discover essential components of a comprehensive LLM ecosystem and operational model, and strategies for delivering relevant, accurate outputs with safe and ethical guardrails. Hear from PwC's Vince DiMascio on real-world insights from a venture capital firm's successful legal GenAI implementation. Don't miss this opportunity to discover the power of GenAI. By attending this session, your contact information may be shared with the sponsor for relevant follow-up for this event only.

In this mini course, you will explore Vertex AI Conversation, a set of generative conversational features built on Dialogflow and Vertex AI. With these features, you can now use large language models (LLMs) to parse and comprehend content, generate agent responses, and control conversation flow. You will explore these features and create your own data store agent in a hands-on lab experience.

This session demonstrates how to use large language models (LLMs) to translate ideas directly into cloud architecture blueprints. You’ll learn how to generate designs from these blueprints with natural language processing. We’ll also use an existing LLM specialized in code generation that understands our language dialect and generates cloud architecture diagrams. Finally, we'll show you a web app on Google Cloud that allows users to interact with the model and use the generated artifacts in practice.
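The pipeline (natural language in, architecture artifact out) might look like this in outline. `call_llm` is a hypothetical stand-in for a real code-generation model endpoint, stubbed here with a canned reply so the flow is runnable end to end.

```python
# Hedged outline of a natural-language-to-blueprint flow. `call_llm` is a
# hypothetical stand-in for a code-generation model endpoint; it is stubbed
# with a canned Graphviz DOT reply so the pipeline can actually run.

def call_llm(prompt: str) -> str:
    # In practice, this would call a code-generation model over an API.
    return "digraph arch { user -> lb; lb -> app; app -> db; }"

def blueprint_to_dot(description: str) -> str:
    """Ask the model for a Graphviz DOT diagram and sanity-check the reply."""
    prompt = (
        "Emit only Graphviz DOT describing this cloud architecture:\n"
        + description
    )
    dot = call_llm(prompt)
    if not dot.strip().startswith("digraph"):
        raise ValueError("model did not return a DOT graph")
    return dot

print(blueprint_to_dot("A load balancer in front of an app tier backed by a database"))
```

Validating the model's output before rendering (here, just checking for a `digraph` header) is the piece that makes generated artifacts safe to feed into downstream tooling.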

If you're a business leader looking to better understand how artificial intelligence (AI) is helping developers become more productive and focused, or a developer interested in using AI to achieve a “flow state,” then this talk is for you. Google Developer Advocates Debi Cabrera and Daryl Ducharme will share their tips and tricks for using Gemini to stay focused, along with ways business leaders can support their developers by providing tools and training that boost productivity.

Ready to supercharge your Java skills with cutting-edge generative AI? Dive into this immersive hands-on workshop and learn how to build and deploy powerful gen AI applications in Cloud Run using Vertex AI and Gemini models. We'll explore fast Java development, leverage the scalability of Cloud Run, and tackle real-world gen AI use cases. Get ready to unleash the power of AI in your next application!

Work with a complete end-to-end sample application, guided at all times by the power of Gemini.

Financial institutions around the globe are required to audit communication channels to be in compliance with regulations. Learn how Behavox is leveraging Google Cloud technology and LLMs to provide industry-leading regulatory compliance and front office solutions for financial institutions globally. Please note: seating is limited and on a first-come, first-served basis; standing areas are available.

Don’t miss the brilliance of Gemini in a real-world live demo that showcases two examples of the mind-blowing accuracy and power (and fun!) of AI. Witness a food and wine pairing plus a style session that is sure to be Next’s most stylish and delish live demo! By attending this session, your contact information may be shared with the sponsor for relevant follow-up for this event only.

In this presentation, we delve into the cutting-edge realm of large-scale AI training and inference, focusing on open models and their deployment on Google Cloud accelerators. Open models such as the Llama family of LLMs and Gemma are state-of-the-art language models that demand robust computational resources and efficient strategies for training and inference at scale. This session aims to provide a comprehensive guide to harnessing the power of PyTorch on Google Cloud accelerators, specifically designed to meet the high-performance requirements of such models.

I'm here to tell you about a game-changer: Gemini, and how it transformed my approach to Apps Script support. Recently, I was working with a client who needed to transition a custom Apps Script solution they had developed over to a newly formed support team. The support team did not have any background in Google Apps Script, and the team consisted of a mix of project managers, business analysts, and Google Workspace admins.

In the era of multimodal generative AI, a unified governance-focused data platform powered by Gemini becomes paramount. Join this session to learn how BigQuery fuels your data and AI lifecycle from training to inference, by unifying structured and unstructured data such as text, images and audio, while encompassing security and governance. Learn how Priceline is using BigQuery and Vertex AI to reinvent customer experiences and lead the industry in data and AI innovation.

Join us to learn how to activate the full potential of your data with AI in BigQuery. Take an in-depth look at how BigQuery's core integration with generative AI models like Gemini, coupled with its petabyte-scale analytics capabilities, enables new possibilities for gaining insights from your data. Learn how to derive insights from untapped, unstructured data such as images, documents, and audio files, and explore BigQuery vector search and multimodal embeddings, all powered by Google's industry-leading AI capabilities using simple SQL queries. You will also learn how Unilever is creating a data strategy that allows data teams to scale efficiently and rapidly experiment with AI models and gen AI use cases.
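To make the vector-search idea concrete, here is a tiny standalone sketch: items ranked by cosine similarity of their embeddings. Real BigQuery vector search runs over model-produced embeddings via SQL at far larger scale; the hand-made three-dimensional vectors below are illustrative only.

```python
# Toy stand-in for the vector-search idea: rank items by cosine similarity of
# their embeddings. Real BigQuery vector search operates on model-produced
# embeddings via SQL; these tiny hand-made vectors are illustrative only.

import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

embeddings = {
    "invoice.pdf":  [0.9, 0.1, 0.0],
    "beach.jpg":    [0.0, 0.2, 0.9],
    "contract.doc": [0.8, 0.3, 0.1],
}

def nearest(query_vec, k=2):
    """Return the k item names most similar to the query vector."""
    ranked = sorted(embeddings, key=lambda name: cosine(query_vec, embeddings[name]), reverse=True)
    return ranked[:k]

print(nearest([1.0, 0.0, 0.0]))
```

The same nearest-neighbor shape underlies multimodal search: images, documents, and audio all land in one embedding space, so one similarity query serves them all.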

Learn how Thales, a leader in global data security, has partnered with Google Workspace to help organizations across the world maintain granular data control and localized encryption to meet stringent regional and industry requirements around data privacy and sovereignty. Thales and Google will share how organizations can leverage Gemini and Google Workspace apps while keeping their sensitive data private.

Come to this fireside chat with Seth Vargo to learn more about the ultimate hybrid cloud use case. We'll explore use cases where Alphabet products run on some of your favorite Google Cloud offerings such as Google Kubernetes Engine (GKE). Why don't we run everything at Alphabet on Google Cloud? Why do some products run partially on cloud? How do Alphabet engineers take advantage of products like GKE, Cloud Run, Vertex AI, and Gemini exposed over hybrid channels? We'll shed light on how our internal innovation influences the products available to our customers and vice-versa.

The reality is that no single LLM will always be completely safe. The good news is that hallucinations can tell you where your AI lifecycle needs more attention. Learn how you can modernize your Google AI infrastructure to preemptively avoid some common weak spots and build an AI stack that gives you the agility to fix problems as they occur. By attending this session, your contact information may be shared with the sponsor for relevant follow-up for this event only.
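One cheap signal for spotting weak spots is a groundedness check: flag an answer as a potential hallucination when too few of its content words appear in the retrieved context. Production safety stacks use much stronger detectors; the stop-word list and numbers in this sketch are arbitrary illustrations.

```python
# A deliberately simple groundedness check: score an answer by the fraction of
# its content words that appear in the retrieved context, and treat low scores
# as potential hallucinations. The stop-word list is an arbitrary illustration.

def grounded_fraction(answer: str, context: str) -> float:
    stop = {"the", "a", "an", "is", "was", "of", "in", "to", "and", "on"}
    words = [w for w in answer.lower().split() if w not in stop]
    ctx = set(context.lower().split())
    if not words:
        return 1.0
    return sum(w in ctx for w in words) / len(words)

context = "the service runs on cloud run and scales to zero"
print(grounded_fraction("the service scales to zero", context))  # fully grounded
print(grounded_fraction("the service requires kubernetes operators", context))
```

A metric like this is most useful as telemetry: tracking where scores dip tells you which retrieval paths or prompts in the lifecycle need attention first.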
