As AI reshapes how organisations build, manage, and use APIs, governance has become a critical foundation. Without clear standards, consistency, and alignment across teams, API sprawl slows delivery, weakens quality, and limits the value APIs can bring to AI initiatives.

This masterclass helps you design a cohesive, scalable API governance framework that works for your team, your line of business, or your entire organisation. You will learn how to turn governance into an enabler, not a constraint, and how to ensure your API ecosystem is ready for AI-driven use cases.

If you want to improve API quality, design consistency, reuse, discoverability, and lifecycle practices across your organisation, this session is for you.
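To make the idea of governance as an enabler concrete, the sketch below shows what an automated governance check might look like: a small script that lints OpenAPI specifications for two illustrative rules (kebab-case paths and mandatory operationIds). The rules, file layout, and spec location are assumptions for illustration only, not material from the masterclass.

"""Minimal sketch of an automated API governance check (illustrative only).

Assumes OpenAPI 3.x specs live under ./specs/*.yaml and that the team's
style guide requires kebab-case paths and an operationId on every operation.
"""
from pathlib import Path
import re
import yaml  # pip install pyyaml

KEBAB_SEGMENT = re.compile(r"^(\{[a-zA-Z0-9_]+\}|[a-z0-9]+(-[a-z0-9]+)*)$")
HTTP_METHODS = {"get", "put", "post", "delete", "patch", "head", "options"}

def lint_spec(spec_path: Path) -> list[str]:
    """Return a list of governance violations found in one OpenAPI spec."""
    spec = yaml.safe_load(spec_path.read_text())
    violations = []
    for path, item in (spec.get("paths") or {}).items():
        # Rule 1: path segments should be kebab-case (or {parameters}).
        for segment in filter(None, path.split("/")):
            if not KEBAB_SEGMENT.match(segment):
                violations.append(f"{spec_path.name}: path '{path}' segment '{segment}' is not kebab-case")
        # Rule 2: every operation needs an operationId for discoverability and reuse.
        for method, op in item.items():
            if method in HTTP_METHODS and not op.get("operationId"):
                violations.append(f"{spec_path.name}: {method.upper()} {path} is missing operationId")
    return violations

if __name__ == "__main__":
    all_violations = [v for p in Path("specs").glob("*.yaml") for v in lint_spec(p)]
    print("\n".join(all_violations) or "All specs pass the governance checks.")

Run against a directory of specs in CI, a check like this turns the style guide into fast, repeatable feedback rather than a manual review gate.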
Topic: apis (57 tagged)
Top Events
Live demonstration – from discovering to publishing APIs with AI
An immersive three-hour training session covering the basics of web development, APIs, artificial intelligence, and deep learning, using accessible, fun tools. Challenge after challenge, participants gain autonomy and come to understand how software works.
At Printables.com, we handle billions of requests every month using a fairly simple, Python-based API stack that scales reliably without unnecessary complexity. In this talk, I’ll share how embracing pragmatism over hype helped us avoid overengineering—proving that microservices and complex architectures aren’t always the answer for every challenge. We’ll explore key design choices, real-world bottlenecks, and practical lessons from our journey to build a maintainable, cost-effective system that delivers at scale. Whether you’re growing a startup or managing a mature platform, you’ll gain actionable insights for scaling Python APIs with confidence.
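The abstract above does not include code, but as a flavour of the "pragmatism over hype" argument, here is a minimal sketch of a plain Python endpoint with an in-process cache in front of an expensive lookup; the framework (FastAPI), the data, and the cache policy are assumptions, not the actual Printables.com stack.

"""Illustrative sketch only: a plain Python API endpoint with a small
in-process cache in front of an expensive lookup. The framework (FastAPI)
and cache policy are assumptions, not the stack described in the talk.
"""
from functools import lru_cache
from fastapi import FastAPI, HTTPException

app = FastAPI()

# Pretend this hits a database or another service; cache the hot keys
# in-process instead of introducing extra infrastructure up front.
@lru_cache(maxsize=10_000)
def load_model_summary(model_id: int) -> dict:
    if model_id < 0:
        raise KeyError(model_id)
    return {"id": model_id, "name": f"model-{model_id}", "downloads": 0}

@app.get("/models/{model_id}")
def get_model(model_id: int) -> dict:
    try:
        return load_model_summary(model_id)
    except KeyError:
        raise HTTPException(status_code=404, detail="model not found")

# Run with: uvicorn app:app --workers 4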
APIs dominate the web, accounting for the majority of all internet traffic. And more AI means more APIs, because they act as an important mechanism to move data into and out of AI applications, AI agents, and large language models (LLMs). So how can you make sure all of these APIs are secure? In this session, we’ll take you through OWASP’s top 10 API and LLM security risks, and show you how to mitigate these risks using Google Cloud’s security portfolio, including Apigee, Model Armor, Cloud Armor, Google Security Operations, and Security Command Center.
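As one concrete example of the kind of risk covered by the OWASP API Top 10, the sketch below illustrates a mitigation for broken object level authorization (API1): the handler verifies that the authenticated caller owns the object before returning it. The Flask framework, the fake data store, and the authentication stub are assumptions for illustration.

"""Illustrative sketch: mitigating OWASP API1 (Broken Object Level
Authorization) by checking ownership on every object access. The Flask
framework and the in-memory data store are assumptions for the example.
"""
from flask import Flask, abort, g, jsonify

app = Flask(__name__)

# Stand-in for a real datastore and a real authentication layer.
ORDERS = {1: {"id": 1, "owner": "alice", "total": 42.0}}

@app.before_request
def authenticate():
    # In a real API the identity would come from a verified token (e.g. a JWT).
    g.user = "alice"

@app.get("/orders/<int:order_id>")
def get_order(order_id: int):
    order = ORDERS.get(order_id)
    if order is None:
        abort(404)
    # The BOLA mitigation: never trust the ID alone; verify the caller owns it.
    if order["owner"] != g.user:
        abort(403)
    return jsonify(order)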
Learn how to manage security controls and licenses for thousands of users, and tie it all together with APIs. We’ll show you ways to manage developer access more efficiently, build custom management integrations, and keep your CISO happy at the same time. We’ll also demo the new Gemini Code Assist integration with Apigee, which lets developers use Gemini Code Assist chat to generate context-aware OpenAPI specifications that reuse components from other APIs in their organization for efficiency and that reference organizational security standards.
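For a sense of what "tying it together with APIs" can look like, here is a hedged sketch that lists developers in an Apigee organization via the Apigee management API so their access can be audited; the endpoint path, query parameter, and response shape reflect the public Apigee API as best understood and should be verified against current documentation.

"""Illustrative sketch: auditing developer access with the Apigee
management API. The endpoint path, parameters, and response shape are
assumptions based on the public Apigee API; verify against current docs.
"""
import subprocess
import requests

ORG = "my-apigee-org"  # hypothetical organization name

def access_token() -> str:
    # Reuse local gcloud credentials for the example.
    return subprocess.check_output(
        ["gcloud", "auth", "print-access-token"], text=True
    ).strip()

def list_developers(org: str) -> list[dict]:
    resp = requests.get(
        f"https://apigee.googleapis.com/v1/organizations/{org}/developers",
        headers={"Authorization": f"Bearer {access_token()}"},
        params={"expand": "true"},
        timeout=30,
    )
    resp.raise_for_status()
    # Assumed response shape: {"developer": [{"email": ..., "status": ...}, ...]}
    return resp.json().get("developer", [])

if __name__ == "__main__":
    for dev in list_developers(ORG):
        print(dev.get("email"), dev.get("status"))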
In today’s complex landscape, APIs can reside anywhere – on premises, in the cloud, or across multiple cloud providers. Join us to discover how API Hub, powered by its innovative on-ramp framework and app integration, delivers a truly unified view of your entire API ecosystem. This session is for practitioners looking to learn how to better discover, manage, and secure all their APIs, regardless of location, with comprehensive analytics and consistent governance policies.
Organizations are racing to deploy generative AI solutions built on large language models (LLMs), but struggle with management, security, and scalability. Apigee is here to help. Join us to discover how the latest Apigee updates enable you to manage and scale gen AI at the enterprise level. Learn from Google’s own experience and our work with leading customers to address the challenges of productionizing gen AI.
Gemini 2.0, the latest foundation model released by Google DeepMind, offers improved performance, support for real-time interactions, text-to-image and text-to-audio generation, grounding with Google Search, and reasoning – all under a unified SDK that lets you move seamlessly between the Gemini API and Vertex AI. In this talk, you’ll learn about the newest Gemini 2.0 capabilities, how to accelerate your prototyping, and guidelines for deploying your solutions, from a single API call to more complex pipelines.
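As a minimal sketch of the unified SDK idea, the snippet below uses the google-genai Python package: the same client class can target the Gemini Developer API (API key) or Vertex AI (project and location). The model name and project values are placeholders.

"""Sketch of the unified SDK described above: the same client code can
target the Gemini Developer API or Vertex AI. Model name and project values
are placeholders; the package is the google-genai SDK (pip install google-genai).
"""
import os
from google import genai

# Developer API: authenticate with an API key.
client = genai.Client(api_key=os.environ["GOOGLE_API_KEY"])

# Vertex AI: same client class, different construction (shown for contrast).
# client = genai.Client(vertexai=True, project="my-project", location="us-central1")

response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="Summarize what an API gateway does in two sentences.",
)
print(response.text)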
In today's rapidly evolving digital landscape, modernizing mainframe systems is crucial for maintaining competitive advantage. This joint session explores the transformative potential of migrating data warehouse extracts to Google Cloud Platform (GCP) and BigQuery. Using the framework we have built, organizations can achieve a flexible, integrated solution that accelerates time to market, ensures accurate and timely real-time data for reporting and advanced analytics, and provides self-service access. This digital transformation not only empowers stakeholders with enhanced capabilities but also significantly improves the overall customer experience.
The payments landscape is being redefined. Stripe’s $1 billion acquisition of Bridge signals a seismic shift toward stablecoins for business-to-business transactions. Join industry leaders to unpack the explosive growth of stablecoins, the challenges to adoption, and the future of blockchain-based payments. Will a single stablecoin dominate, or will a multicoin ecosystem prevail? Don’t miss this insightful discussion shaping the future of money.
Organizations excel at building software, but they often struggle with the complexity of delivering it to production. This session introduces Software Logistics, a crucial discipline within Platform Engineering, focusing on the journey from package to runtime. Learn how to build delivery pipelines that handle modern cloud-native applications through practical patterns, real-world examples, and essential frameworks. Discover how customers implement Software Logistics practices to achieve reliable, secure, and efficient delivery of applications.
As the number of distributed and generative AI applications grows in your organization, so will the number of APIs – presenting management and governance challenges. This session introduces API Hub, a solution for unified API management. Learn how to gain centralized observability over your APIs, use the API Hub on-ramp framework to connect to any API gateway, help developers access shared components and standards to build compliant APIs faster, and discover and manage shadow APIs.
This session explores the business drivers for ESG data in the current business climate and delves into the technical architecture of a robust ESG data management solution. Discover how to integrate AI and data agents with core enterprise systems, like SAP and Cortex, and incorporate crucial third-party ESG data from sources like ESG Book to enhance business decision-making. We will also showcase real-world examples of how to integrate Google Cloud with platforms like Watershed to build a complete ESG data ecosystem.
Jeff, Head of Product at Hugging Face, will walk you through the latest and greatest from Hugging Face for building your own AI models, agents, and applications using open models, open source, and Google Cloud. Learn how to easily deploy the latest LLMs on Vertex AI and GKE, and how to accelerate and scale your models on TPUs.
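As a rough sketch of what deploying an open model from the Hugging Face Hub to Vertex AI can look like with the google-cloud-aiplatform SDK (not an official Hugging Face or Google snippet), see below; the serving container image URI, model ID, project, and machine shape are placeholders to fill in from current documentation.

"""Rough sketch (not an official snippet) of serving an open model from the
Hugging Face Hub on Vertex AI with the google-cloud-aiplatform SDK. The
container image URI, model ID, project, and machine shape are placeholders.
"""
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")  # placeholders

model = aiplatform.Model.upload(
    display_name="open-llm-tgi",
    # Placeholder: a Text Generation Inference serving container image URI.
    serving_container_image_uri="<huggingface-tgi-serving-container-uri>",
    serving_container_environment_variables={"MODEL_ID": "google/gemma-2-2b-it"},
)

endpoint = model.deploy(
    machine_type="g2-standard-12",   # example GPU machine shape
    accelerator_type="NVIDIA_L4",
    accelerator_count=1,
)
print("Deployed to endpoint:", endpoint.resource_name)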
This hands-on lab equips you with the practical skills to build and deploy a real-world AI-powered chat application leveraging the Gemini LLM APIs. You'll learn to containerize your application using Cloud Build, deploy it seamlessly to Cloud Run, and explore how to interact with the Gemini LLM to generate insightful responses. This hands-on experience will provide you with a solid foundation for developing engaging and interactive conversational applications.
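A minimal sketch of the lab's shape might look like the following: a tiny chat endpoint that calls the Gemini API and listens on the PORT variable that Cloud Run injects. The model name and environment variable names are assumptions; the lab's actual code may differ.

"""Illustrative sketch of a chat endpoint that calls the Gemini API and is
packaged to run on Cloud Run (which injects the PORT variable). Model name
and env-var names are assumptions; the lab's actual code may differ.
"""
import os
from flask import Flask, request, jsonify
from google import genai

app = Flask(__name__)
client = genai.Client(api_key=os.environ["GOOGLE_API_KEY"])

@app.post("/chat")
def chat():
    user_message = request.get_json(force=True).get("message", "")
    reply = client.models.generate_content(
        model="gemini-2.0-flash",
        contents=user_message,
    )
    return jsonify({"reply": reply.text})

if __name__ == "__main__":
    # Cloud Run sets PORT; default to 8080 for local runs.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", "8080")))

From there, a Dockerfile plus gcloud builds submit and gcloud run deploy (or a Cloud Build config) would containerize and ship the service, as the lab walks through.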
If you register for a Learning Center lab, please ensure that you sign up for a Google Cloud Skills Boost account for both your work domain and personal email address. You will need to authenticate your account as well (be sure to check your spam folder!). This will ensure you can arrive and access your labs quickly onsite.
Learn how to use Earth Observation data from Earth Engine to enrich your data in BigQuery. We will take a look at how to do this with Python, user-defined functions, and the new ST_RegionStats function in BigQuery.
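A hedged sketch of that pattern is shown below: a query that calls ST_RegionStats from the BigQuery Python client to attach raster statistics to each region. The table, Earth Engine image asset ID, band name, returned field names, and even the exact function signature are assumptions here; check the current BigQuery documentation before relying on them.

"""Sketch of enriching rows in BigQuery with Earth Engine raster statistics
via ST_RegionStats. The table, image asset ID, band name, struct field names,
and exact function signature are assumptions; verify against current docs.
"""
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

QUERY = """
SELECT
  region_name,
  ST_REGIONSTATS(
    region_geom,                         -- GEOGRAPHY column
    '<earth-engine-image-asset-id>',     -- Earth Engine image asset (placeholder)
    'B4'                                 -- band name (placeholder)
  ) AS band_stats
FROM `my-project.my_dataset.regions`
"""

for row in client.query(QUERY).result():
    # band_stats is assumed to be a STRUCT that includes a 'mean' field.
    print(row.region_name, row.band_stats.get("mean"))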
Explore how AI agents and assistants are transforming Software Logistics, bringing intelligence to cloud application delivery. This session demonstrates practical applications of AI in deployment automation, configuration management, and runtime optimization. Discover how Platform Engineers leverage AI to create more efficient delivery systems. Learn design strategies from automated security scanning to intelligent deployment decisions, and how to integrate these capabilities into your existing platform engineering practices.
Discover the service approach, which means exposing capabilities through APIs for a company that uses generative AI for content creation.
Orchestration, scalability, and a microservices approach will be covered in this session, showing how you can industrialize these services at scale to serve millions of users.
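As a minimal illustration of exposing a generative capability as a service (not the architecture presented in the session), the sketch below defines one stateless endpoint that could sit behind a gateway and be replicated horizontally; the request and response shapes are invented for the example and the model call is stubbed.

"""Minimal sketch, not the production architecture from the session: a small
service that exposes a content-generation capability behind an API and can be
scaled horizontally behind a gateway. The backing model call is stubbed.
"""
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="content-generation-service")

class GenerationRequest(BaseModel):
    brief: str
    tone: str = "neutral"

class GenerationResponse(BaseModel):
    content: str

def generate_content(brief: str, tone: str) -> str:
    # Placeholder for the real generative AI backend (LLM call, templates, etc.).
    return f"[{tone}] draft for: {brief}"

@app.post("/v1/content", response_model=GenerationResponse)
def create_content(req: GenerationRequest) -> GenerationResponse:
    # Stateless handler: instances can be replicated to serve millions of users.
    return GenerationResponse(content=generate_content(req.brief, req.tone))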