talk-data.com

Topic: Large Language Models (LLM)

Tags: nlp, ai, machine_learning

165 activities tagged

Activity Trend: peak of 158 activities per quarter, 2020-Q1 to 2026-Q1

Activities

Showing results filtered by: Google Cloud Next '24

In this spotlight lab, you will learn how to use Gemini, an AI-powered collaborator in Google Cloud, to navigate and understand different areas of security in your environment with Security Command Center.

Click the blue “Learn more” button above to tap into special offers designed to help you implement what you are learning at Google Cloud Next 25.

A day in the life of a Google Cloud developer typically involves the use of multiple Google Cloud products and services. These products enable the developer to develop, test, deploy, and manage applications in the cloud. With assistance from Gemini, a developer can become more productive when using Google Cloud's products by using Gemini's interactive chat, code assistance, and embedded integrations. In this spotlight lab, you will explore Gemini in a hands-on lab environment to see the different ways in which Gemini can be used in your development workflows.

Enabling generative AI-powered applications on-premises can be challenging due to the complexity of managing infrastructure, the massive scale of data, and security and compliance requirements. Customers with stringent data residency requirements can now access large language models (LLMs) that integrate easily into their applications, automating natural language tasks such as transcription and text classification to make their workforce more productive, and supporting other application areas such as network mapping.

This session will highlight new capabilities of Gemini for Google Cloud that allow it to directly integrate with your Google Cloud infrastructure and data to provide highly personalized answers to questions about your cloud environment in real time. We’ll show how Gemini for Google Cloud can help you optimize workloads, observe your infrastructure and applications, and troubleshoot issues by using AI to query your cloud environment and provide tailored insights.

The advent of generative AI has ushered in an unprecedented era of innovation, marked by the transformative potential of Large Language Models (LLMs). The immense capabilities of LLMs open up vast possibilities for revolutionizing business operations and customer interactions. However, integrating them into production environments presents unique orchestration challenges. Successful orchestration of LLMs for Retrieval Augmented Generation (RAG) depends on addressing statelessness and providing access to the most relevant, up-to-date information. This session will dive into how to leverage LangChain and Google Cloud Databases to build context-aware applications that harness the power of LLMs. Please note: seating is limited and on a first-come, first-served basis; standing areas are available.
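The RAG orchestration described above can be sketched in plain Python. The toy retriever and prompt builder below are illustrative assumptions for this listing, not LangChain or Google Cloud Database APIs: the core idea is that a stateless LLM gains up-to-date context when the most relevant document is retrieved and injected into its prompt.

```python
# Minimal sketch of Retrieval Augmented Generation (RAG):
# retrieve the most relevant document for a question, then
# inject it into the prompt so a stateless LLM gains context.

def retrieve(question: str, docs: list[str]) -> str:
    """Toy retriever: rank docs by keyword overlap with the question."""
    q_words = set(question.lower().split())
    return max(docs, key=lambda d: len(q_words & set(d.lower().split())))

def build_prompt(question: str, context: str) -> str:
    """Ground the model's answer in the retrieved context."""
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

docs = [
    "Cloud SQL supports PostgreSQL, MySQL, and SQL Server engines.",
    "BigQuery is a serverless data warehouse for analytics.",
]
question = "Which engines does Cloud SQL support?"
prompt = build_prompt(question, retrieve(question, docs))
```

In a production system, the keyword retriever would be replaced by a vector database query and the prompt handed to an LLM; the orchestration shape stays the same.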

Modern apps and organizations are highly complex, comprising numerous vendors, products, and ever-evolving solutions. This dynamic environment makes it challenging to operate solely from an infrastructure viewpoint. App Hub addresses this by organizing infrastructure resources into an app-centric view that mirrors your business. In this session, we'll demo how Gemini and App Hub can help app operators and security admins use an app-centric approach to quickly obtain holistic context on the state of the system and recent events affecting it.

Discover the latest innovations within Google's machine learning ecosystem to help you navigate technology to bring ML to any device, website, or app. Uncover the power of generative AI with Gemini. From ideas to accelerated workflows, learn how to leverage Gemini’s capabilities for creative text formats, code generation, and more.

Web3 offers the promise of decentralized and user-controlled applications, but security remains a persistent challenge. Mysten Labs, a company founded by five ex-Meta tech leads, has been leading the charge in AI-powered blockchain security. They created an internal red team that integrated Google's AI models, Codey and Gemini, and leveraged Vertex AI for continuous threat monitoring, exposing critical software bugs within open-source projects. Working with Google Cloud has helped Mysten Labs drive innovation with continuous security audits, bot detection, and stress testing to detect vulnerabilities before malicious hackers do. Join us to discover how generative AI can help you build the next generation of secure Web3 applications and see firsthand some of the most surprising vulnerability findings.

Learn how Gemini in Google Meet can make your meetings less tiring and more fulfilling. This session will go through a day's worth of meetings to see how Gemini can enhance your experience in Meet – from helping you look your best, to automatic translations, to taking notes for you – allowing you to work smarter, not harder. Learn from Trellix, who will share best practices for switching to Meet and how they improved productivity, saved money, and enhanced their security posture.

Discover how Vertex AI Search leverages generative AI, including Google Cloud's latest Gemini models, to power high-quality search experiences grounded in your data. We'll share how you can increase customer and employee satisfaction with personalized search and recommendation experiences. Learn how Vertex AI Search can help you ground gen AI apps in your data, functioning as an information retrieval and answer generation system. Discover the latest product enhancements and how customers are transforming search capabilities in the enterprise.

Gemini in Google Chat unlocks exciting new ways to communicate and collaborate. This session will introduce Gemini in Google Chat along with other new collaboration and messaging features that are making Chat more powerful for teams of any size.

SAP, GROW, and RISE users, transform your business: leverage Google Cloud's BigQuery, Cortex Framework, Vertex AI, and generative AI to unlock your data's potential. See how Pluto7's integrations enable a smarter supply chain and a better customer experience through improved demand sensing, forecasting, inventory planning, defect reduction, and more. Turn your SAP and non-SAP data into action, simplify processes, and drive efficiency with gen AI and Gemini. Join us to elevate your business with real-time insights and strategic decision-making.

NVIDIA AI on Google Cloud provides all the essential tools, frameworks, and models needed to develop and deploy custom models and generative AI applications. This Cloud Talk will share the latest NVIDIA technologies available on Google Cloud to rapidly enable enterprises to take their LLMs and generative AI applications from pilot into production. By attending this session, your contact information may be shared with the sponsor for relevant follow up for this event only.

Creating captivating generative AI analytics demos is easy. But building a product that consistently delivers value and handles real-life data complexity is challenging. In fact, only 3%-10% of companies effectively utilize LLMs in production. Learn how Cox 2M, the commercial IoT division of Cox Communications, is making smarter, faster business decisions using one of the few production-ready implementations of generative AI. Please note: seating is limited and on a first-come, first-served basis; standing areas are available.

Large Language Models (LLMs) have changed the way we interact with information. A base LLM is only aware of the information it was trained on. Retrieval augmented generation (RAG) can address this issue by providing context from additional data sources. In this session, we'll build a RAG-based LLM application that incorporates external data sources to augment an OSS LLM. We'll show how to scale the workload with distributed Kubernetes compute, and showcase a chatbot agent that gives factual answers.
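The retrieval step of such a RAG application can be illustrated with bag-of-words vectors standing in for real embeddings. The corpus and scoring below are invented for illustration; a real deployment would use an embedding model and a vector store, but the ranking-by-similarity logic is the same:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: bag-of-words counts (a real system uses an embedding model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

corpus = [
    "kubernetes schedules containers across a cluster of nodes",
    "a large language model generates text from a prompt",
]
query = "how does kubernetes schedule containers"
# Rank external documents by similarity; the best match becomes LLM context.
best = max(corpus, key=lambda d: cosine(embed(query), embed(d)))
```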

Harrison Chase is the CEO and co-founder of LangChain, a company formed around the popular open-source Python/TypeScript packages. After studying statistics and computer science at Harvard, Harrison went on to lead the machine learning team at Robust Intelligence (an MLOps company) and the entity linking team at Kensho (a fintech startup).

In this fireside chat, he will discuss how LangChain is making it easier to use large language models (LLMs) to develop context-aware reasoning applications. Leveraging the Google ecosystem, they are testing, evaluating, and observing common patterns for building more complex state machines and agents.
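The "state machines and agents" pattern mentioned here reduces, at its simplest, to a loop of deciding and acting over accumulated state. Below is a minimal sketch with a hard-coded rule standing in for an LLM's decision; all names are hypothetical, not LangChain APIs:

```python
# Minimal agent loop as a state machine: a policy picks an action,
# the action's result is appended to state, and the loop halts when
# the policy decides it can answer.

def search_tool(query: str) -> str:
    """Stand-in tool; a real agent would call a search API or database."""
    return f"results for '{query}'"

def decide(state: dict) -> str:
    """Stand-in policy: search once, then finish (an LLM would decide)."""
    return "finish" if state["observations"] else "search"

def run_agent(question: str, max_steps: int = 5) -> dict:
    state = {"question": question, "observations": []}
    for _ in range(max_steps):
        if decide(state) == "finish":
            state["answer"] = state["observations"][-1]
            break
        state["observations"].append(search_tool(state["question"]))
    return state

final = run_agent("latest LangChain release")
```

More complex agents add states (plan, reflect, use multiple tools), but the observe-decide-act cycle over explicit state is the common core.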

Supercharge your processes with Gemini for Google Workspace. Build no-code solutions easily using AppSheet and create custom solutions integrated with BigQuery and Vertex AI. Learn how generative AI is evolving to help users tackle common workflow scenarios with ease.

Learn how Gemini can accelerate time-to-innovation across your data teams by delivering a unified, intelligent experience that simplifies all aspects of the data journey.
