talk-data.com

Topic: MLOps
Tags: machine_learning, devops, ai
6 tagged activities

Activity Trend: 26 peak/qtr (2020-Q1 to 2026-Q1)

Activities

Showing filtered results
Filtering by: Google Cloud Next '24

Continuous Deployment can be a roadblock in the MLOps lifecycle, often requiring custom pipelines and complex configurations. The solution? The new integration of Google Cloud Deploy and Vertex AI streamlines machine learning (ML) deployment by automating the entire process and making rollbacks easy through idempotent releases. This integration lets you test, validate, and deploy your ML models in minutes, without writing a single line of code.
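The "rollback through idempotent releases" idea can be sketched without any Cloud Deploy or Vertex AI API. The `ReleaseRegistry` below is a hypothetical illustration (not Google's implementation): each release is immutable, deploying an already-live release is a no-op, and rollback simply re-promotes the previous release.

```python
class ReleaseRegistry:
    """Toy model of idempotent ML releases with rollback.

    Hypothetical sketch -- not the Cloud Deploy / Vertex AI API.
    Each release is an immutable pointer to a model artifact;
    "deploy" just moves the live alias, so repeating a deploy is
    a no-op and rollback is a re-deploy of an earlier release.
    """

    def __init__(self):
        self._releases = {}   # release_id -> model artifact URI
        self._history = []    # ordered list of deployed release_ids
        self.live = None      # currently served release_id

    def register(self, release_id: str, model_uri: str) -> None:
        # Re-registering the same release with the same URI is a no-op.
        if self._releases.get(release_id, model_uri) != model_uri:
            raise ValueError(f"{release_id} exists with a different artifact")
        self._releases[release_id] = model_uri

    def deploy(self, release_id: str) -> None:
        if release_id not in self._releases:
            raise KeyError(release_id)
        if self.live == release_id:   # idempotent: already live
            return
        self.live = release_id
        self._history.append(release_id)

    def rollback(self) -> str:
        # Drop the current release and re-promote the previous one.
        if len(self._history) < 2:
            raise RuntimeError("nothing to roll back to")
        self._history.pop()
        self.live = self._history[-1]
        return self.live


registry = ReleaseRegistry()
registry.register("rel-001", "gs://models/churn/v1")
registry.register("rel-002", "gs://models/churn/v2")
registry.deploy("rel-001")
registry.deploy("rel-002")
registry.deploy("rel-002")   # no-op: rel-002 is already live
print(registry.rollback())   # rel-001
```

Because deploys are idempotent, a retried pipeline step cannot double-promote a release, which is what makes automated rollback safe.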


Application development is continuously breaking down silos. From Dev, DevTest, DevOps, DevSecOps, MLOps, Analytics, to… DevAI? Developers are now being thrust into the dynamic arena of real-time analytics and generative AI (GenAI): two forces shaping the next iteration of technology. This session dives deep into this intersection, demonstrating how developers can leverage these tools not only to build applications, but to craft game-changing business strategies.


The emergence of foundation models and generative AI has introduced a new era for building AI systems. Selecting the right model from a range of architectures and sizes, curating data, engineering optimal prompts, tuning models for specific tasks, grounding model outputs in real-world data, optimizing hardware – these are just a few of the novel challenges that large models introduce. Delve into the fundamental tenets of MLOps, the necessary adaptations required for generative AI, and capabilities within Vertex AI to support this new workflow.


If you’re a data engineer, MLOps engineer, or procurement officer planning to purchase third-party AI models, you won’t want to miss this. Learn how you can speed assessment, facilitate procurement, and simplify governance of AI models (including generative AI) on Google Cloud Marketplace. Explore how to easily procure and deploy third-party AI models and frameworks to both Vertex AI and Google Kubernetes Engine. Finally, you’ll hear from Anthropic on how their solution deploys via Marketplace to Vertex AI.


Harrison Chase is the CEO and Co-founder of LangChain, a company formed around the popular open-source Python/TypeScript packages. After studying statistics and computer science at Harvard, Harrison went on to lead the machine learning team at Robust Intelligence (an MLOps company) and the entity-linking team at Kensho (a fintech startup).

In this fireside chat, he will discuss how LangChain is making it easier to use large language models (LLMs) to develop context-aware reasoning applications. Leveraging the Google ecosystem, they are testing, evaluating, and observing common patterns for building more complex state machines and agents.
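Stripped of any particular framework API (LangChain's actual classes are not shown here), the "state machine" pattern for agents reduces to a loop: a node inspects the state, chooses the next node, and updates the state until a terminal node is reached. A minimal, framework-free sketch, with `plan`, `act`, and `run` as hypothetical names:

```python
from typing import Callable, Dict, Tuple

# Hypothetical agent state machine: each node returns
# (next_node_name, updated_state). "END" is the terminal node.
State = Dict[str, object]
Node = Callable[[State], Tuple[str, State]]

def plan(state: State) -> Tuple[str, State]:
    # A real agent would ask an LLM to pick the next step;
    # here we simply route on whether an answer exists yet.
    return ("END", state) if "answer" in state else ("act", state)

def act(state: State) -> Tuple[str, State]:
    # Stand-in for a tool call (search, calculator, API, ...).
    state["answer"] = f"looked up: {state['question']}"
    return "plan", state

def run(graph: Dict[str, Node], start: str, state: State,
        max_steps: int = 10) -> State:
    node = start
    for _ in range(max_steps):
        if node == "END":
            return state
        node, state = graph[node](state)
    raise RuntimeError("agent did not terminate")

result = run({"plan": plan, "act": act}, "plan",
             {"question": "what is MLOps?"})
print(result["answer"])   # looked up: what is MLOps?
```

The `max_steps` bound is the practical safeguard this pattern needs: because the model chooses the transitions, the loop is not guaranteed to terminate on its own.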


Explore the 130+ models in Model Garden and learn how Vertex AI supports building new generative AI apps. Learn how to optimize MLOps practices, assess different prompts and model responses, and compare model variants.
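Assessing prompts and comparing model variants is, at its core, an evaluation loop over (prompt, model) pairs scored against references. The harness below is a generic illustration with stubbed model functions, not the Vertex AI evaluation API; all names in it are hypothetical:

```python
# Hypothetical prompt/model comparison harness (not a Vertex AI API).
# Scores each (prompt variant, model) pair by exact match against a
# reference answer.

def stub_model_a(prompt: str) -> str:
    # Pretends to be a model that only handles the terse form well.
    return "4" if prompt.strip().endswith("2+2=") else "four"

def stub_model_b(prompt: str) -> str:
    return "4"

PROMPTS = {
    "terse": "2+2=",
    "verbose": "Please explain your reasoning: what is 2+2?",
}
MODELS = {"model-a": stub_model_a, "model-b": stub_model_b}
REFERENCE = "4"

def evaluate() -> dict:
    """Return {(prompt_name, model_name): accuracy} over the grid."""
    return {
        (p_name, m_name): float(model(prompt) == REFERENCE)
        for p_name, prompt in PROMPTS.items()
        for m_name, model in MODELS.items()
    }

scores = evaluate()
for pair, score in sorted(scores.items()):
    print(pair, score)
```

A production harness would swap the stubs for real model endpoints and the exact-match check for task-appropriate metrics, but the grid-over-pairs shape stays the same.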
