talk-data.com

Topic: postgresql (50 tagged activities)

Activity trend: peak of 6 per quarter, 2020-Q1 to 2026-Q1

Activities (50 · newest first)

Extending SQL Databases with Python

What if your database could run Python code inside SQL? In this talk, we’ll explore how to extend popular databases using Python, without needing to write a line of C.

We’ll cover three systems, SQLite, DuckDB, and PostgreSQL, and show how Python can be used in each to build custom SQL functions, accelerate data workflows, and prototype analytical logic. Each database offers a unique integration path:

- SQLite and DuckDB let you register Python functions directly into SQL (via sqlite3.create_function and DuckDB's own create_function API, respectively), making it easy to inject business logic or custom transformations.
- PostgreSQL offers PL/Python, a full-featured procedural language for writing SQL functions in Python. We’ll also touch on advanced use cases, including embedding the Python interpreter directly into a PostgreSQL extension for deeper integration.
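As a concrete illustration of the SQLite path, here is a minimal sketch of registering a Python function as a custom SQL function; the `slugify` function and the `talks` table are invented for the example:

```python
import sqlite3

def slugify(title):
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

conn = sqlite3.connect(":memory:")
# Register the Python callable as a 1-argument SQL function named slugify().
conn.create_function("slugify", 1, slugify)

conn.execute("CREATE TABLE talks (title TEXT)")
conn.execute("INSERT INTO talks VALUES ('Extending SQL Databases with Python')")

# The custom function is now callable from plain SQL.
row = conn.execute("SELECT slugify(title) FROM talks").fetchone()
print(row[0])  # extending-sql-databases-with-python
```

DuckDB's Python API follows the same idea with its own `create_function` method on the connection object.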

By the end of this talk, you’ll understand the capabilities, limitations, and gotchas of Python-powered extensions in each system—and how to choose the right tool depending on your use case, whether you’re analyzing data, building pipelines, or hacking on your own database.

AWS re:Invent 2025 - A tale of two transactions (DAT455)

If you are curious how transactions are processed differently in Amazon Aurora DSQL versus Amazon Aurora PostgreSQL, join Marc Brooker, AWS VP and Distinguished Engineer, as he dives into how transactions are processed in these two databases – right down to the code level. You’ll understand how each database handles multiple transactions accessing or changing the data at the same time, why they use different approaches, and what that means for your architectures. You’ll leave with a deep intuition for what makes database workloads scale and the ability to think about how to apply those lessons to your architectures and code.

Learn more: More AWS events: https://go.aws/3kss9CP

Subscribe: More AWS videos: http://bit.ly/2O3zS75 More AWS events videos: http://bit.ly/316g9t4

ABOUT AWS: Amazon Web Services (AWS) hosts events, both online and in-person, bringing the cloud computing community together to connect, collaborate, and learn from AWS experts. AWS is the world's most comprehensive and broadly adopted cloud platform, offering over 200 fully featured services from data centers globally. Millions of customers—including the fastest-growing startups, largest enterprises, and leading government agencies—are using AWS to lower costs, become more agile, and innovate faster.

#AWSreInvent #AWSreInvent2025 #AWS

AWS re:Invent 2025 - Build serverless chatbots using Amazon ElastiCache & Aurora PostgreSQL (DAT314)

In this session, learn how to build a serverless-first chatbot application by leveraging Amazon Aurora with pgvector as a vector store and Amazon ElastiCache Serverless for Valkey to manage chat history, semantic caching, and agent memory. Discover how to harness these services to create contextually aware, scalable chatbots with agentic capabilities that retain conversational memory across interactions. Explore architectural patterns, implementation strategies, and real-world use cases. Gain insights into optimizing query latency, performance, and cost-effectiveness. Whether you’re a developer, architect, or decision-maker, this session equips you with practical knowledge to build modern, serverless-powered chatbot applications using generative AI.

AWS re:Invent 2025 - Amazon Aurora HA and DR design patterns for global resilience (DAT442)

Amazon Aurora is a serverless relational database with unparalleled high performance and availability at global scale for PostgreSQL, MySQL, and DSQL. Aurora provides managed high availability (HA) and disaster recovery (DR) capabilities in and across AWS Regions. In this session, explore the Aurora HA and DR capabilities and discover design patterns that enable the development of resilient applications. Learn how to establish in-Region and cross-Region HA and DR using Aurora features including Multi-AZ deployments and Aurora Global Database, and how Aurora DSQL multi-Region clusters provide the highest level of availability and application resilience.

AWS re:Invent 2025 - Deep dive into Amazon Aurora and its innovations (DAT441)

With an innovative architecture that decouples compute from storage and advanced features like Amazon Aurora Global Database and low-latency read replicas, Aurora reimagines what it means to be a relational database. Aurora is a built-for-the-cloud, serverless relational database service that delivers unparalleled performance and availability at global scale for MySQL, PostgreSQL, and DSQL. In this session, dive deep into new features – and Aurora’s most popular offerings including serverless, I/O-Optimized, zero-ETL integrations, MCP integration, and generative AI support for vector search and storage. Also learn about the groundbreaking Aurora DSQL engine.

AWS re:Invent 2025 - PostgreSQL performance: Real-world workload tuning (DAT410)

PostgreSQL performance isn’t magic—it’s engineering. In this code talk, we dive deep into practical tuning techniques by improving an underperforming PostgreSQL database workload, showing how to avoid common pitfalls that silently degrade performance. Learn how excessive indexes hurt write throughput, why HOT updates fail, and how vacuum behavior can stall your system. We’ll demonstrate how to use Query Plan Management (QPM) and pg_hint_plan for plan stability and decode wait events to uncover hidden bottlenecks. With real SQL examples and system insights, this session equips you to tune PostgreSQL for predictable, high-performance workloads.
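One pitfall named here, excessive indexes hurting write throughput, can be investigated with PostgreSQL's cumulative statistics views. Below is a standard sketch for spotting indexes that have never been scanned; the query is illustrative, not from the talk, though the column names follow the stock pg_stat_user_indexes view:

```python
# Candidate unused indexes: zero index scans recorded since statistics were
# last reset. Run against a live PostgreSQL server (psql, or a driver such
# as psycopg); composed as a string here so the query can be shown standalone.
UNUSED_INDEXES_SQL = """
SELECT schemaname,
       relname       AS table_name,
       indexrelname  AS index_name,
       pg_size_pretty(pg_relation_size(indexrelid)) AS index_size
FROM pg_stat_user_indexes
WHERE idx_scan = 0
ORDER BY pg_relation_size(indexrelid) DESC;
"""
```

Each index returned is a candidate for removal: it adds write amplification on every INSERT/UPDATE without ever serving a read.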

AWS re:Invent 2025 - Amazon Aurora DSQL: A developer's perspective (DAT401)

In this live coding session, we'll show you how to work with Amazon Aurora DSQL from a developer's perspective. We'll develop a sample application to highlight some of the ways developing for Aurora DSQL is different than PostgreSQL. We'll cover authentication and connection management, optimistic concurrency transaction patterns, primary key selection, analyzing query performance, and best practices.
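Optimistic concurrency means a transaction can fail at commit time and must be retried by the application. The sketch below shows the generic retry-loop shape; `ConflictError` and the demo transaction are invented for illustration, and a real Aurora DSQL client would catch the conflict error its PostgreSQL driver actually raises:

```python
class ConflictError(Exception):
    """Stand-in for the driver error raised when an optimistic commit loses a race."""

def with_retries(txn_fn, max_attempts=5):
    """Run a transaction function, retrying when an optimistic commit conflicts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return txn_fn()
        except ConflictError:
            if attempt == max_attempts:
                raise
            # In real code: sleep briefly with jittered backoff before retrying,
            # so competing writers do not immediately collide again.

# Demo: a transaction that conflicts twice, then commits on the third attempt.
attempts = {"count": 0}

def flaky_txn():
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise ConflictError("write-write conflict")
    return "committed"

result = with_retries(flaky_txn)
print(result, attempts["count"])  # committed 3
```

The key design point is that the transaction body must be a re-runnable function: any state it reads may have changed between attempts.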

Optimizing performance, deployments, and security for Linux on Azure

Whether you use Ubuntu, RHEL, SLES, or Rocky, deploying Linux on Azure is more than provisioning VMs. Learn to build secure, performant Linux environments using Azure-native tools and partner integrations. See how to streamline image creation, harden workloads, monitor performance, and stay compliant with Azure Monitor, Defender for Linux, PostgreSQL on Azure, and the secure Linux baseline. Whether you’re an architect or an OSS advocate, this session helps you confidently “train your penguin” for the cloud.

AI-assisted migration: The path to powerful performance on PostgreSQL

Apollo Hospitals unlocked faster performance and sped up innovation by migrating to Azure Database for PostgreSQL. Ready to move beyond your legacy databases? Join us to learn how to make the move seamless with AI-assisted migration for Oracle workloads. Plus, hear how a move to Azure enables innovation and explore the latest features that deliver the resilience and extensibility needed to bolster your databases to support more intelligent applications and demanding workloads.

Nasdaq Boardvantage: AI-driven governance on PostgreSQL and Microsoft Foundry

Trusted by nearly half of Fortune 100 companies, Nasdaq Boardvantage powers secure, intelligent board operations. In this deep dive session, explore how Azure Database for PostgreSQL and MySQL, Microsoft Foundry, Azure Kubernetes Service (AKS), and API Management create a resilient architecture that safeguards confidential data while unlocking new agentic AI capabilities.

The blueprint for intelligent AI agents backed by PostgreSQL

AI agents are reshaping business operations across sectors. If you’re looking to bolster your business with agents, this session offers a strategic approach for building intelligent agents that truly understand your data. Learn how to uncover next-level AI reasoning for your agents using knowledge graphs, vector search and the latest AI integrations directly in PostgreSQL on Azure. Leave with a blueprint and best practices from Azure customers you can use to augment your own apps and copilots.

Azure HorizonDB: Deep Dive into a New Enterprise-Scale PostgreSQL

Azure HorizonDB is Azure’s new fully managed PostgreSQL database for mission-critical workloads. Join engineering and product leaders to explore the architecture that enables enhanced scalability without sacrificing performance, ultra-fast failovers, and read replicas for constant availability. Learn how HorizonDB combines open-source Postgres with state-of-the-art Azure infrastructure, cutting-edge database innovation and advanced AI features to power next-generation applications.

Use Azure Migrate for AI assisted insights and cloud transformation

Discover how you can make the most of your IT estate migrations and modernizations with the newest AI capabilities. This session guides IT teams through assessing current environments, setting goals, and creating a business case with Azure Migrate for all of your workload types like Windows Server, SQL Server, .NET, Linux, PostgreSQL, Java, and more. We’ll explore tools to inventory workloads, map dependencies, and create actionable migration roadmaps.

Innovation Session: Microsoft Fabric and Azure Databases - the data estate for AI

In today’s AI-driven economy, data isn’t just an asset, it’s your differentiator. Join us for major announcements across Microsoft Fabric, Power BI, Azure SQL, and Azure PostgreSQL, and see how these innovations come together to deliver a unified, intelligent data foundation for AI. Experience customer success stories, live demos, and an inside look at the roadmap shaping the future of analytics, transactional workloads, and real-time insights all in one integrated experience.

Model Context Protocol: Principles and Practice

Large-language-model agents are only as useful as the context and tools they can reach.

Anthropic’s Model Context Protocol (MCP) proposes a universal, bidirectional interface that turns every external system—SQL databases, Slack, Git, web browsers, even your local file-system—into first-class “context providers.”

In just 30 minutes we’ll step from high-level buzzwords to hands-on engineering details:

  • How MCP’s JSON-RPC message format, streaming channels, and version negotiation work under the hood.
  • Why per-tool sandboxing via isolated client processes hardens security (and what happens when an LLM tries rm -rf /).
  • Techniques for hierarchical context retrieval that stretch a model’s effective window beyond token limits.
  • Real-world patterns for accessing multiple tools—Postgres, Slack, GitHub—and plugging MCP into GenAI applications.

Expect code snippets and lessons from early adoption.

You’ll leave ready to wire your own services into any MCP-aware model and level up your GenAI applications—without the N×M integration nightmare.
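MCP's wire format is plain JSON-RPC 2.0, so every request is just a JSON object with jsonrpc, id, method, and params fields. A minimal sketch of building a tool-invocation request; the method and parameter shape follow the spec's tools/call request as I understand it, and the `query` tool with its `sql` argument is an invented example:

```python
import json

def make_request(req_id, method, params):
    """Build a JSON-RPC 2.0 request envelope, as MCP messages are framed."""
    return {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}

# Ask an MCP server to invoke one of its exposed tools, e.g. a SQL query tool.
req = make_request(1, "tools/call", {
    "name": "query",                    # tool name advertised by the server
    "arguments": {"sql": "SELECT 1"},   # tool-specific arguments
})

wire = json.dumps(req)
print(wire)
```

A response reuses the same envelope with a `result` (or `error`) field keyed to the request's `id`, which is what lets clients multiplex many in-flight calls over one channel.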

When Postgres is enough: solving document storage, pub/sub and distributed queues without more tools

When a new requirement appears, whether it's document storage, pub/sub messaging, distributed queues, or even full-text search, Postgres can often handle it without introducing more infrastructure.

This talk explores how to leverage Postgres' native features like JSONB, LISTEN/NOTIFY, queueing patterns and vector extensions to build robust, scalable systems without increasing infrastructure complexity.
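The queueing pattern mentioned here usually rests on one statement: claim a pending row with FOR UPDATE SKIP LOCKED so competing workers skip rows another worker has already locked instead of blocking on them. A sketch of that dequeue statement, with invented table and column names, composed as a Python string so it can be shown without a live server:

```python
# Claim one pending job atomically. SKIP LOCKED makes concurrent workers
# pass over rows already claimed by someone else rather than waiting.
DEQUEUE_SQL = """
WITH next_job AS (
    SELECT id
    FROM jobs
    WHERE status = 'pending'
    ORDER BY id
    FOR UPDATE SKIP LOCKED
    LIMIT 1
)
UPDATE jobs
SET status = 'running'
FROM next_job
WHERE jobs.id = next_job.id
RETURNING jobs.id, jobs.payload;
"""
```

Run inside a transaction per worker, this gives at-most-one delivery per claim with no separate broker to operate.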

You'll learn practical patterns that extend Postgres just far enough, keeping systems simpler, more maintainable, and easier to operate, especially in small to medium projects or freelancing setups, where Postgres often already forms a critical part of the stack.

Postgres might not replace everything forever - but it can often get you much further than you think.

Flying Beyond Keywords: Our Aviation Semantic Search Journey

In aviation, search isn’t simple—people use abbreviations, slang, and technical terms that make exact matching tricky. We started with just Postgres, aiming for something that worked. Over time, we upgraded: semantic embeddings, reranking. We tackled filter complexity, slow index builds, and embedding updates and much more. Along the way, we learned a lot about making AI search fast, accurate, and actually usable for our users. It’s been a journey—full of turbulence, but worth the landing.
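The reranking step described above boils down to scoring candidate documents against the query embedding and reordering by similarity. A minimal pure-Python cosine-similarity sketch; the toy 3-d vectors and document titles are invented, and a real system would use model-generated embeddings with pgvector on the Postgres side:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rerank(query_vec, candidates):
    """Order (title, vector) candidates by similarity to the query, best first."""
    return sorted(candidates, key=lambda c: cosine(query_vec, c[1]), reverse=True)

query = [1.0, 0.0, 0.0]
docs = [
    ("runway lighting manual", [0.9, 0.1, 0.0]),
    ("crew scheduling memo",   [0.0, 1.0, 0.2]),
    ("ILS approach notes",     [0.7, 0.0, 0.7]),
]
ranked = rerank(query, docs)
print([title for title, _ in ranked])
```

In practice the database's vector index does the coarse retrieval and only a short candidate list is rescored like this, which keeps reranking cheap even at scale.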

Sponsored by: Anomalo | Reconciling IoT, Policy, and Insurer Data to Deliver Better Customer Discounts

As insurers increasingly leverage IoT data to personalize policy pricing, reconciling disparate datasets across devices, policies, and insurers becomes mission-critical. In this session, learn how Nationwide transitioned from prototype workflows in Dataiku to a hardened data stack on Databricks, enabling scalable data governance and high-impact analytics. Discover how the team orchestrates data reconciliation across Postgres, Oracle, and Databricks to align customer driving behavior with insurer and policy data—ensuring more accurate, fair discounts for policyholders. With Anomalo’s automated monitoring layered on top, Nationwide ensures data quality at scale while empowering business units to define custom logic for proactive stewardship. We’ll also look ahead to how these foundations are preparing the enterprise for unstructured data and GenAI initiatives.

Race to Real-Time: Low-Latency Streaming ETL Meets Next-Gen Databricks OLTP-DB

In today’s digital economy, real-time insights and rapid responsiveness are paramount to delivering exceptional user experiences and lowering TCO. In this session, discover a pioneering approach that leverages a low-latency streaming ETL pipeline built with Spark Structured Streaming and Databricks’ new OLTP-DB—a serverless, managed Postgres offering designed for transactional workloads. Validated in a live customer scenario, this architecture achieves sub-2 second end-to-end latency by seamlessly ingesting streaming data from Kinesis and merging it into OLTP-DB. This breakthrough not only enhances performance and scalability but also provides a replicable blueprint for transforming data pipelines across various verticals. Join us as we delve into the advanced optimization techniques and best practices that underpin this innovation, demonstrating how Databricks’ next-generation solutions can revolutionize real-time data processing and unlock a myriad of new use cases in the data landscape.

Master Schema Translations in the Era of Open Data Lake

Unity Catalog puts a variety of schemas into a centralized repository, and the developer community now wants more productivity and automation for schema inference, translation, evolution, and optimization, especially in ingestion and reverse-ETL scenarios with more code generation. Coinbase Data Platform attempts to pave a path with "Schemaster", interacting with the data catalog through a (proposed) metadata model to make schema translation and evolution more manageable across popular systems such as Delta, Iceberg, Snowflake, Kafka, MongoDB, DynamoDB, and Postgres. This lightning talk covers four areas:

- The complexity and caveats of schema differences among these systems
- The proposed field-level metadata model, and two translation patterns: point-to-point vs. hub-and-spoke
- Why data profiling should be augmented to enhance schema understanding and translation
- Integrating it with ingestion and reverse-ETL in a Databricks-oriented ecosystem

Takeaway: standardize schema lineage and translation.