On any given day, millions of small businesses rely on Google Workspace to connect, create, and collaborate. Join our panel of SMB leaders as they share how their teams use Workspace to punch above their weight – from iterating faster with real-time collaboration in Docs, Sheets, and Slides to expanding their capabilities with Gemini.
Click the blue “Learn more” button above to tap into special offers designed to help you implement what you are learning at Google Cloud Next 25.
talk-data.com
Topic: Large Language Models (LLM) – 1405 tagged
Top Events
Join us to learn customer-proven strategies for AI-powered innovation and modernization, leveraging the best of public cloud, edge, sovereign, and cross-cloud infrastructure and services. See how leaders are achieving real business outcomes faster and more cost-effectively in industries like retail, healthcare, and even the most stringent regulated industries. Plus, see how Gemini is dramatically simplifying and reimagining cloud as we know it.
Join us in this fireside chat with Phillipp Schmid, Technical Lead at Hugging Face, a collaboration platform for the machine learning community where anyone can share, explore, discover, and experiment with open-source ML. Phillipp will talk about what it takes to put open-source generative AI applications into production – everything from versioning and evaluation to monitoring and data drift. He will highlight the challenges of evaluating large language models (LLMs), including what works today and where we need to improve. Join us for his thoughts on the role of cloud computing in accelerating AI and how Hugging Face is leveraging it to build an open, ethical, and collaborative AI future.
Gemini in Looker has made the most complex data actions simple, letting you chat with your business data. The latest advancements in Google's business intelligence (BI) suite bring insights to your users and customers, and form the basis of your own data-driven applications. This session will show you how we are building the future of BI with AI at the center, and keeping our focus on an open ecosystem that enables you to bring all your important data and share it with your teams – all driven by generative AI.
Curious to understand how Gemini in Gmail, Google Drive, and other Google Workspace apps can ground their generated responses based on your documents and emails while keeping your organization’s data private? Still have questions about what Gemini does and does not do with your data? Join this session to get answers to your questions, learn more about the built-in privacy and security controls in Gemini for Google Workspace, and understand how your organization can achieve digital sovereignty with Sovereign Controls.
Google stands at the forefront of innovation and AI across its many divisions. Hear from Google executives about the impact of the Gemini Era, how AI is shaping the future of business, and how enterprise executives can boldly approach innovation.
Generative AI can transform customer service, enhance employee productivity, automate business processes, and more. And by using Serverless, Google Cloud's pay-as-you-go compute platform, to provide these experiences, you can focus on what's core to your business and leave the autoscaling to Google. In this talk, you'll learn how to run Google Cloud generative AI tools on Serverless, including how to use the Vertex AI Gemini API, how to use function calling to supplement a gen AI model with Serverless endpoints, and more.
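The function-calling loop the session describes can be sketched in plain Python. This is a minimal illustration of the dispatch step only, not the Vertex AI SDK: `get_order_status` and the handler registry are hypothetical stand-ins for the serverless endpoints a real Gemini model would be declared against.

```python
import json

# Hypothetical serverless handler the model can "call".
def get_order_status(order_id: str) -> dict:
    # In production this would be a Cloud Run or Cloud Functions endpoint.
    return {"order_id": order_id, "status": "shipped"}

# Registry mapping function names (as declared to the model) to handlers.
HANDLERS = {"get_order_status": get_order_status}

def execute_function_call(call: dict) -> str:
    """Run the function the model requested and return a JSON result
    that would be sent back to the model as the next turn."""
    handler = HANDLERS[call["name"]]
    result = handler(**call["args"])
    return json.dumps(result)

# Simulated model output requesting a function call.
model_call = {"name": "get_order_status", "args": {"order_id": "A-123"}}
print(execute_function_call(model_call))
```

With the real API, the model's response contains the structured function call; your serverless code executes it and feeds the JSON result back for the model's final answer.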
It can be challenging to know where and how to start building with generative AI. Join this introductory workshop to learn about the tools and techniques needed to get started building with foundation models. We'll show you how to experiment with Gemini in the Cloud console and how to evaluate the performance of Gemini for your specific use case.
Gemini is a family of generative AI models developed by Google DeepMind that is designed for multimodal use cases. The Gemini API gives you access to the Gemini Pro Vision and Gemini Pro models. In this spotlight lab, you will learn how to use the Vertex AI Gemini API with the Vertex AI SDK for Python to interact with the Gemini Pro (gemini-pro) model and the Gemini Pro Vision (gemini-pro-vision) model.
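The interaction pattern the lab teaches can be sketched without credentials; here a stub class stands in for the SDK's `GenerativeModel` so the multimodal content-assembly shape is visible. The `gs://bucket/image.jpg` URI and the stub's canned reply are illustrative, not real SDK behavior.

```python
# Stub standing in for vertexai.generative_models.GenerativeModel;
# the real class is constructed the same way: GenerativeModel("gemini-pro-vision").
class GenerativeModel:
    def __init__(self, model_name: str):
        self.model_name = model_name

    def generate_content(self, contents: list) -> str:
        # The real SDK sends `contents` (text plus image parts) to the
        # hosted model and returns a response object with a .text field.
        text_parts = [c for c in contents if isinstance(c, str)]
        return f"[{self.model_name}] got {len(contents)} parts: {text_parts[0]}"

model = GenerativeModel("gemini-pro-vision")
# Multimodal request: an image reference followed by a text prompt.
reply = model.generate_content(
    [{"uri": "gs://bucket/image.jpg", "mime_type": "image/jpeg"},
     "Describe this image."]
)
print(reply)
```

In the real SDK you would first call `vertexai.init(project=..., location=...)` and pass image parts built from your own Cloud Storage URIs.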
At Google I/O last year, we announced several new AI Firebase Extensions using the PaLM API. This year, we’ve added support for Google's latest Gemini models. Easily add a chatbot, text summarizer, content generator, vector database pipeline, and more to your app without learning new APIs. In this session, get an end-to-end view of how you can use Firebase and Gemini to create an enterprise-ready customer support app. Build many apps with the powerful combo of Gemini's multimodal features and Firebase's convenient suite of developer tools.
As machine learning (ML) systems continue to evolve, the ability to scale complex ML workloads becomes crucial. Scalability can be considered along two dimensions: expansive training of large language models (LLMs) and intricate distribution of reinforcement learning (RL) systems. Each has its own set of challenges, from computational demands of LLMs to complex synchronization in distributed RL.
This session explores the integration of Ray, Google Kubernetes Engine (GKE), and ML accelerators like tensor processing units (TPUs) as a powerful combination for developing advanced ML systems at scale. We discuss Ray and its scalable APIs and its mature integration with GKE and ML accelerators, and we demonstrate how it has been used for LLMs and for re-implementing the powerful RL algorithm MuZero.
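Ray's core abstraction – declaring a function remote so independent shards run in parallel across a cluster – can be approximated locally with the standard library. The sketch below uses `concurrent.futures` as a stand-in for `@ray.remote`; with Ray on GKE, each `shard_work` call would instead be a task scheduled across cluster nodes.

```python
from concurrent.futures import ThreadPoolExecutor

def shard_work(shard: list) -> int:
    # Stand-in for per-shard work (e.g. a tokenization or RL rollout step).
    return sum(x * x for x in shard)

def run_parallel(data: list, num_shards: int = 4) -> int:
    """Split `data` into shards, process them in parallel, then reduce.
    With Ray this would be roughly:
        futures = [shard_work.remote(s) for s in shards]
        return sum(ray.get(futures))
    """
    shards = [data[i::num_shards] for i in range(num_shards)]
    with ThreadPoolExecutor(max_workers=num_shards) as pool:
        return sum(pool.map(shard_work, shards))

print(run_parallel(list(range(10))))  # sum of squares 0..9 = 285
```

The map-then-reduce shape is the same whether the workers are local threads or TPU-backed pods; only the scheduler changes.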
A day in the life of a Google Cloud developer typically involves the use of multiple Google Cloud products and services. These products enable the developer to develop, test, deploy, and manage applications in the cloud. With assistance from Gemini, a developer can become more productive when using Google Cloud's products through Gemini's interactive chat, code assistance, and embedded integrations. In this spotlight lab you will explore Gemini in a hands-on lab environment to see the different ways in which Gemini can be used in your development workflows.
Coinbase is partnering with Google Cloud to propel their company to the forefront of AI innovation. Coinbase is poised to implement dozens of gen AI use cases this year to augment the journeys of its employees and customers. As gen AI technology continues to evolve, Google's Model Garden offers Coinbase the flexibility to seamlessly integrate foundational models like Gemini alongside third-party ones, empowering both their business and users. Join us to find out how Vertex AI is helping Coinbase revolutionize the crypto industry, unlocking new possibilities and redefining the future of finance.
Gemini is a contextual, real-time assistant baked right into Workspace that lets you become a better writer, visual designer, data analyst, and project manager. In this session, discover how Gemini for Google Workspace empowers your teams to build more creative campaigns, maximize follow-through with prospects, and accelerate business opportunities. See real-world use cases showcasing how Gemini automates tasks, boosts efficiency, and frees time to focus on higher-value work – driving greater impact, revenue, and ROI.
Please note: seating is limited and on a first-come, first-served basis; standing areas are available.
Simplify and automate Application Programming Interface (API) development and integration with Gemini. Design and build APIs faster, connect any application, and reduce errors with AI-powered insights.
Explore the orchestration of Google Cloud technologies behind the AI Penalty Challenge. Discover how Gemini's language capabilities on Vertex AI, Firestore's data management, Android's device integration, and the power of Google Cloud work in unison. Learn problem-solving strategies for building scalable, AI-powered experiences.
Working with the agility and flexibility of a digital native or a big tech innovator can feel like an impossible mountain to climb for traditional enterprises. But empowering your teams with the right tools, like generative AI, can make this vision a reality. Join this session to learn how your organization can adopt a more collaborative mindset and begin to identify your road map to embracing generative AI. Hear from two customers who have embarked on their journey of a new way to work and glean insights from their lessons learned in adopting Gemini AI in Google Workspace.
Explore your Looker data with natural language. This session dives into our open-source generative AI Looker extension, powered by Vertex AI large language models (LLMs). Learn how to:
- Ask questions using natural language: explore data and gain insights intuitively
- Deploy and manage: understand the extension's architecture and set it up for your needs
- Customize the extension: change prompts if needed, add more examples, or fine-tune the LLM for tailored results
Unleash the power of generative AI for data-driven decision making.
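The "change prompts, add more examples" customization above boils down to few-shot prompt assembly. A hedged sketch of that pattern follows; the question/query pairs and the Looker-style field names are invented for illustration, not the extension's actual shipped prompt.

```python
# Hypothetical few-shot examples mapping natural-language questions
# to Looker-style query parameters.
EXAMPLES = [
    ("total sales last month",
     "fields=orders.total_sale_price&filters=orders.created_month: last month"),
    ("top 10 customers by revenue",
     "fields=users.name,orders.total_sale_price"
     "&sorts=orders.total_sale_price desc&limit=10"),
]

def build_prompt(question: str) -> str:
    """Assemble a few-shot prompt for the LLM: worked examples first,
    then the new question, leaving the query line for the model to complete."""
    lines = [f"Question: {q}\nQuery: {query}" for q, query in EXAMPLES]
    lines.append(f"Question: {question}\nQuery:")
    return "\n\n".join(lines)

prompt = build_prompt("average order value by state")
print(prompt)
```

Adding domain-specific examples to `EXAMPLES` is the cheapest customization lever; fine-tuning the underlying LLM is the heavier alternative the session also covers.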
Traditional infrastructure is no longer adequate for the exponentially growing demands of generative AI and LLMs. Join this session to learn how infrastructure design is meeting those demands, how organizations are adapting to capitalize on the new infrastructure landscape, and how this may evolve in the future.
In this workshop, you will learn how to easily create a Retrieval Augmented Generation (RAG) application and how to use it. We will be highlighting AlloyDB Omni (our deploy-anywhere version of AlloyDB) together with pgvector's vector search capabilities. You will learn to run an LLM and an embedding model locally so that you can run this application anywhere. Creating an app that keeps your data secure while LLMs work with it is harder than it sounds. Come and build with me!
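The retrieval step of the RAG pattern described above can be sketched in plain Python. The toy 4-dimensional embeddings below stand in for output from a real local embedding model, and the cosine-similarity ranking mirrors what pgvector's distance operator does inside AlloyDB Omni.

```python
import math

# Toy document store: (text, embedding) pairs. In AlloyDB Omni these
# would be rows with a pgvector column.
DOCS = [
    ("AlloyDB Omni runs anywhere", [0.9, 0.1, 0.0, 0.1]),
    ("pgvector adds vector search", [0.1, 0.9, 0.1, 0.0]),
    ("LLMs generate text",          [0.0, 0.1, 0.9, 0.2]),
]

def cosine_sim(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_emb, k=2):
    """Return the k documents nearest to the query embedding, mirroring
    `ORDER BY embedding <=> $1 LIMIT k` in pgvector."""
    ranked = sorted(DOCS, key=lambda d: cosine_sim(query_emb, d[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# A query embedding close to the pgvector document.
context = retrieve([0.2, 0.8, 0.1, 0.0])
# Retrieved text becomes grounding context for the local LLM.
prompt = "Answer using only this context:\n" + "\n".join(context)
```

The generation step then sends `prompt` to the locally running LLM; only the embedding and ranking pieces are shown here.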