talk-data.com

Topic: Cyber Security

Tags: cybersecurity, information_security, data_security, privacy

85 activities tagged

Activity Trend: 297 peak/qtr (2020-Q1 to 2026-Q1)

Activities

Showing results filtered by: Data + AI Summit 2025
Low-Emission Oil & Gas: Engineering the Balance Between Clean and Reliable

Join two energy industry leaders as they showcase groundbreaking applications of AI and data solutions in modern oil and gas operations. NOV demonstrates how their Generative AI pipeline revolutionized drilling mud report processing, automating the analysis of 300 reports daily with near-perfect accuracy and real-time analytics capabilities. BP shares how Unity Catalog has transformed their enterprise-wide data strategy, breaking down silos while maintaining robust governance and security. Together, these case studies illustrate how AI and advanced analytics are enabling cleaner, more efficient energy operations while maintaining the reliability demanded by today's market.

Securely Deploying AI/BI to All Users in Your Enterprise

Bringing AI/BI to every business user starts with getting security, access and governance right. In this session, we’ll walk through the latest best practices for configuring Databricks accounts, setting up workspaces, and managing authentication protocols to enable secure and scalable onboarding. Whether you're supporting a small team or an entire enterprise, you'll gain practical insights to protect your data while ensuring seamless and governed access to AI/BI tools.

Supercharging Sales Intelligence: Processing Billions of Events via Structured Streaming

DigiCert is a digital security company that provides digital certificates, encryption and authentication services; it serves 88% of the Fortune 500 and secures over 28 billion web connections daily. Our project aggregates and analyzes certificate transparency logs via public APIs to provide comprehensive market and competitive intelligence. Instead of relying on third-party providers with limited data, our project gives us full control, deeper insights and automation. Databricks helps us reliably poll public APIs at scale, fetching millions of events daily, then deduplicate and store them in our Delta tables. Specifically, we use Spark for parallel processing, Structured Streaming for real-time ingestion and deduplication, Delta tables for data reliability, and pools and jobs to keep our costs optimized. These technologies keep our data fresh, accurate and cost-effective, giving our sales team the real-time intelligence that ensures DigiCert's success.
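The deduplication step can be illustrated in miniature with a pure-Python sketch of keyed first-occurrence dedup (the session itself uses Spark Structured Streaming over Delta tables; the `cert_fingerprint` field and the sample records here are hypothetical):

```python
def dedup_events(events, seen=None):
    """Keep the first occurrence of each event, keyed by a stable ID.

    `cert_fingerprint` is a hypothetical key; in practice the key would be
    whatever uniquely identifies a certificate-transparency log entry.
    Passing `seen` across calls lets dedup span multiple polling batches.
    """
    seen = set() if seen is None else seen
    unique = []
    for event in events:
        key = event["cert_fingerprint"]
        if key not in seen:
            seen.add(key)
            unique.append(event)
    return unique

batch = [
    {"cert_fingerprint": "a1", "domain": "example.com"},
    {"cert_fingerprint": "a1", "domain": "example.com"},  # duplicate poll result
    {"cert_fingerprint": "b2", "domain": "example.org"},
]
print(len(dedup_events(batch)))  # 2
```

In Spark Structured Streaming the same idea is expressed declaratively, e.g. `dropDuplicates` on the key column combined with a watermark to bound state.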

Sponsored by: Dagster Labs | The Age of AI is Changing Data Engineering for Good

The last major shift in data engineering came during the rise of the cloud, transforming how we store, manage, and analyze data. Today, we stand at the cusp of the next revolution: AI-driven data engineering. This shift promises not just faster pipelines, but a fundamental change in the way data systems are designed and maintained. AI will redefine who builds data infrastructure, automating routine tasks, enabling more teams to contribute to data platforms, and (if done right) freeing up engineers to focus on higher-value work. However, this transformation also brings heightened pressure around governance, risk, and data security, requiring new approaches to control and oversight. For those prepared, this is a moment of immense opportunity – a chance to embrace a future of smarter, faster, and more responsive data systems.

IQVIA’s Serverless Journey: Enabling Data and AI in a Regulated World

Your data and AI use cases are multiplying. At the same time, there is increased focus and scrutiny on meeting sophisticated security and regulatory requirements. IQVIA runs serverless use cases across data engineering, data analytics, and ML and AI to empower its customers to make informed decisions, support their R&D processes and improve patient outcomes. By leveraging native controls on the platform, serverless lets them streamline these use cases while maintaining a strong security posture, top performance and optimized costs. This session will cover IQVIA’s journey to serverless, how they met their security and regulatory requirements, and the latest and upcoming enhancements to the Databricks Platform.

Sponsored by: Immuta | Protecting People Data: How Shell Empowers HR to Drive a Brighter Future

HR departments increasingly rely on data to improve workforce planning and experiences. However, managing and getting value from this data can be challenging, especially given the complex technology landscape and the need to ensure data security and compliance. Shell has placed a high priority on safeguarding its people data while empowering its HR department with the tools and access they need to make informed decisions. This session will explore the transformation of Shell's Central Data Platform, starting with their HR use case. You’ll hear about:

- The role of automation and data governance, quality, and literacy in Shell’s strategy.
- Why they chose Databricks and Immuta for enhanced policy-based access control.
- The future for Shell and their vision for a data marketplace to truly embrace a culture of global data sharing.

The result? A robust, scalable HR Data Platform that is securely driving a brighter future for Shell and its employees.

Building Responsible AI Agents on Databricks

This presentation explores how Databricks' Data Intelligence Platform supports the development and deployment of responsible AI in credit decisioning, ensuring fairness, transparency and regulatory compliance. Key areas include bias and fairness monitoring using Lakehouse Monitoring to track demographic metrics and automated alerts for fairness thresholds. Transparency and explainability are enhanced through the Mosaic AI Agent Framework, SHAP values and LIME for feature importance auditing. Regulatory alignment is achieved via Unity Catalog for data lineage and AI/BI dashboards for compliance monitoring. Additionally, LLM reliability and security are ensured through AI guardrails and synthetic datasets to validate model outputs and prevent discriminatory patterns. The platform integrates real-time SME and user feedback via Databricks Apps and AI/BI Genie Space.
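One of the demographic metrics such monitoring tracks can be sketched as a plain demographic-parity check (a minimal illustration, not Lakehouse Monitoring's implementation; the group labels, sample decisions, and any alert threshold are hypothetical):

```python
def demographic_parity_gap(decisions):
    """Absolute difference in approval rate between two groups.

    `decisions` is a list of (group, approved) pairs; a gap above a
    chosen threshold would trigger a fairness alert.
    """
    totals, approvals = {}, {}
    for group, approved in decisions:
        totals[group] = totals.get(group, 0) + 1
        approvals[group] = approvals.get(group, 0) + int(approved)
    rates = {g: approvals[g] / totals[g] for g in totals}
    groups = sorted(rates)
    return abs(rates[groups[0]] - rates[groups[1]])

# Hypothetical credit decisions tagged with a protected-group label.
sample = [("A", True), ("A", True), ("A", False),
          ("B", True), ("B", False), ("B", False)]
gap = demographic_parity_gap(sample)
print(round(gap, 2))  # approval rates: A = 2/3, B = 1/3, so the gap is 0.33
```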

Databricks in Action: Azure’s Blueprint for Secure and Cost-Effective Operations

Erste Group's transition to Azure Databricks marked a significant upgrade from a legacy system to a secure, scalable and cost-effective cloud platform. The initial architecture, characterized by a complex hub-spoke design and stringent compliance regulations, was replaced with a more efficient solution. The phased migration addressed high network costs and operational inefficiencies, resulting in a 60% reduction in networking costs and a 30% reduction in compute costs for the central team. This transformation, completed over a year, now supports real-time analytics, advanced machine learning and GenAI while ensuring compliance with European regulations. The new platform features Unity Catalog, separate data catalogs and dedicated workspaces, demonstrating a successful shift to a cloud-based machine learning environment with significant improvements in cost, performance and security.

Optimizing Smart Meter IIoT Data in Databricks for At-Scale Interactive Electrical Load Analytics

Octave is a Plotly Dash application used daily by about 1,000 Hydro-Québec technicians and engineers to analyze smart meter load and voltage data from 4.5M meters across the province. As adoption grew, Octave’s back end was migrated to Databricks to address increasingly massive scale (>1T data points), governance and security requirements. This talk will summarize how Databricks was optimized to support performant at-scale interactive Dash application experiences while in parallel managing complex back-end ETL processes. The talk will outline further optimizations targeting query latency and user concurrency, along with plans to increase data update frequency. Non-technical success factors to be reviewed will include the value of subject matter expertise, operational autonomy, code quality for long-term maintainability and proactive vendor technical support.

Real-Time Botnet Defense at CVS: AI-Driven Detection and Mitigation on Databricks

Botnet attacks mobilize digital armies of compromised devices that continuously evolve, challenging traditional security frameworks with their high-speed, high-volume nature. In this session, we will reveal our advanced system — developed on the Databricks platform — that leverages cutting-edge AI/ML capabilities to detect and mitigate bot attacks in near-real time. We will dive into the system’s robust architecture, including scalable data ingestion, feature engineering, MLOps strategies & production deployment of the system. We will address the unique challenges of processing bulk HTTP traffic data, time-series anomaly detection and attack signature identification. We will demonstrate key business values through downtime minimization and threat response automation. With sectors like healthcare facing heightened risks, ensuring data integrity and service continuity is vital. Join us to uncover lessons learned while building an enterprise-grade solution that stays ahead of adversaries.
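The time-series anomaly-detection idea behind such a system can be illustrated with a simple z-score check on per-minute request counts (a toy sketch only; the production system described here uses ML models on Databricks, and the traffic numbers are invented):

```python
import statistics

def zscore_anomalies(counts, threshold=3.0):
    """Flag time buckets whose request count deviates from the mean
    by more than `threshold` population standard deviations."""
    mean = statistics.mean(counts)
    stdev = statistics.pstdev(counts)
    if stdev == 0:
        return []
    return [i for i, c in enumerate(counts) if abs(c - mean) / stdev > threshold]

# Per-minute HTTP request counts; the spike at index 5 mimics a bot burst.
traffic = [110, 95, 102, 98, 105, 900, 101, 97]
print(zscore_anomalies(traffic, threshold=2.0))  # [5]
```

Real deployments would replace the fixed threshold with learned, per-endpoint baselines and combine this with attack-signature matching.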

Sponsored by: Dataiku | Agility Meets Governance: How Morgan Stanley Scales ML in a Regulated World

In regulated industries like finance, agility can't come at the cost of compliance. Morgan Stanley found the answer in combining Dataiku and Databricks to create a governed, collaborative ecosystem for machine learning and predictive analytics. This session explores how the firm accelerated model development and decision-making, reducing time-to-insight by 50% while maintaining full audit readiness. Learn how no-code workflows empowered business users, while scalable infrastructure powered terabyte-scale ML. Discover best practices for unified data governance, risk automation, and cross-functional collaboration that unlock innovation without compromising security. Ideal for data leaders and ML practitioners in regulated industries looking to harmonize speed, control, and value.

Sponsored by: Skyflow | How to Govern a Billion Sensitive Records in Your CDP

Customer Data Platforms (CDPs) promise better engagement, higher operational efficiency, and revenue growth by centralizing and streamlining access to customer data. However, consolidating sensitive information from a variety of sources creates complex challenges around data governance, security, and privacy. We’ve studied, built, and managed data protection strategies at some of the world’s biggest retailers. We’ll showcase business requirements, common architectural components, and best practices to deploy data protection solutions at scale, protecting billions of sensitive records across regions and countries. Learn how a data vault pattern with granular, policy-based access control and monitoring can improve organizational privacy posture and help meet regulatory requirements (e.g., GDPR, CCPA, e-Privacy). Walk away with a clear framework to deploy such architecture and knowledge of real-world issues, performance optimizations, and design trade-offs.
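The data vault pattern can be sketched in a few lines: sensitive values live only in the vault, downstream systems see opaque tokens, and detokenization is gated by policy (a toy illustration under assumed names; the role check is a bare-bones stand-in for real granular, policy-based access control):

```python
import secrets

class DataVault:
    """Toy vault: tokenize sensitive values, detokenize only per policy."""

    def __init__(self):
        self._store = {}  # token -> sensitive value, never exposed directly

    def tokenize(self, value):
        token = "tok_" + secrets.token_hex(8)
        self._store[token] = value
        return token

    def detokenize(self, token, role):
        # Hypothetical policy rule: only one role may see plaintext.
        if role != "privacy_officer":
            raise PermissionError("role not allowed to detokenize")
        return self._store[token]

vault = DataVault()
token = vault.tokenize("jane.doe@example.com")
# The CDP stores and joins on `token`; the plaintext never leaves the vault.
assert token != "jane.doe@example.com"
assert vault.detokenize(token, role="privacy_officer") == "jane.doe@example.com"
```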

Building Tool-Calling Agents With Databricks Agent Framework and MCP

Want to create AI agents that can do more than just generate text? Join us to explore how combining Databricks' Mosaic AI Agent Framework with the Model Context Protocol (MCP) unlocks powerful tool-calling capabilities. We'll show you how MCP provides a standardized way for AI agents to interact with external tools, data and APIs, solving the headache of fragmented integration approaches. Learn to build agents that can retrieve both structured and unstructured data, execute custom code and tackle real enterprise challenges. Key takeaways:

- Implementing MCP-enabled tool-calling in your AI agents
- Prototyping in AI Playground and exporting for deployment
- Integrating Unity Catalog functions as agent tools
- Ensuring governance and security for enterprise deployments

Whether you're building customer service bots or data analysis assistants, you'll leave with practical know-how to create powerful, governed AI agents.
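The core tool-calling loop can be sketched without any framework: the model emits a tool name plus arguments, and a registry dispatches the call (a minimal sketch with a hypothetical tool and a faked model decision; the Mosaic AI Agent Framework and MCP SDKs handle this protocol, schemas, and governance for you):

```python
TOOLS = {}

def tool(fn):
    """Register a function so the agent can call it by name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def lookup_order(order_id: str) -> str:
    # Hypothetical enterprise tool; in practice this could be a
    # Unity Catalog function or an MCP server endpoint.
    return f"Order {order_id}: shipped"

def run_agent(model_output: dict) -> str:
    """Dispatch one tool call requested by the model."""
    fn = TOOLS[model_output["tool"]]
    return fn(**model_output["arguments"])

# Pretend the LLM decided to call a tool:
reply = run_agent({"tool": "lookup_order", "arguments": {"order_id": "42"}})
print(reply)  # Order 42: shipped
```

MCP's contribution is standardizing how the tool registry, schemas, and transport work, so agents and tools built by different teams interoperate.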

Driving Secure AI Innovation with Obsidian Security, Databricks, and PointGuard AI

As enterprises adopt AI and Large Language Models (LLMs), securing and governing these models - and the data used to train them - is essential. In this session, learn how Databricks Partner PointGuard AI helps organizations implement the Databricks AI Security Framework to manage AI-specific risks, ensuring security, compliance, and governance across the entire AI lifecycle. Then, discover how Obsidian Security provides a robust approach to AI security, enabling organizations to confidently scale AI applications.

Extending the Lakehouse: Power Interoperable Compute With Unity Catalog Open APIs

The lakehouse is built for storage flexibility, but what about compute? In this session, we’ll explore how Unity Catalog enables you to connect and govern multiple compute engines across your data ecosystem. With open APIs and support for the Iceberg REST Catalog, UC lets you extend access to engines like Trino, DuckDB, and Flink while maintaining centralized security, lineage, and interoperability. We will show how you can get started today working with engines like Apache Spark and Starburst to read and write to UC managed tables with some exciting demos. Learn how to bring flexibility to your compute layer—without compromising control.
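Attaching an external Spark-based engine through the Iceberg REST protocol comes down to a handful of catalog properties. This is a hedged sketch: the property names follow Apache Iceberg's REST catalog options, while the catalog name, workspace URL, token, and exact endpoint path are placeholders to verify against current Databricks documentation.

```
# Spark settings for attaching an external engine to a UC catalog
# via the Iceberg REST protocol (illustrative values).
spark.sql.catalog.uc=org.apache.iceberg.spark.SparkCatalog
spark.sql.catalog.uc.type=rest
spark.sql.catalog.uc.uri=https://<workspace-url>/api/2.1/unity-catalog/iceberg
spark.sql.catalog.uc.token=<personal-access-token>
spark.sql.catalog.uc.warehouse=<catalog-name>
```

Engines like Trino, DuckDB, and Flink take equivalent settings in their own configuration formats, all speaking the same REST catalog API.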

Harnessing Databricks Asset Bundles: Transforming Pipeline Management at Scale at Stack Overflow

Discover how Stack Overflow optimized its data engineering workflows using Databricks Asset Bundles (DABs) for scalable and efficient pipeline deployments. This session explores the structured pipeline architecture, emphasizing code reusability, modular design and bundle variables to ensure clarity and data isolation across projects. Learn how the data team leverages enterprise infrastructure to streamline deployment across multiple environments. Key topics include DRY-principled modular design, essential DAB features for automation and data security strategies using Unity Catalog. Designed for data engineers and teams managing multi-project workflows, this talk offers actionable insights on optimizing pipelines with Databricks' evolving toolset.
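The bundle-variable pattern for per-environment isolation can be sketched in a minimal databricks.yml (the bundle name, targets, catalog names, and notebook path are illustrative; the real DAB schema supports far more):

```
# Minimal Databricks Asset Bundle using a variable to isolate
# dev and prod deployments of the same pipeline definition.
bundle:
  name: content-pipelines

variables:
  catalog:
    description: Unity Catalog catalog to write to
    default: dev_catalog

targets:
  dev:
    default: true
  prod:
    variables:
      catalog: prod_catalog

resources:
  jobs:
    nightly_ingest:
      name: nightly-ingest
      tasks:
        - task_key: ingest
          notebook_task:
            notebook_path: ./notebooks/ingest.py
```

Deploying with `databricks bundle deploy -t prod` swaps in the prod catalog without touching the pipeline code, which is the data-isolation property the talk highlights.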

How HMS Federation Powered Nationwide’s Seamless and Efficient Unity Catalog Migration

This talk takes you through the Nationwide Security and Infrastructure data team's journey of migrating from HMS to UC. Discover how HMS federation simplified our transition to UC, allowing for an incremental migration that minimized disruption to data consumers while optimizing our data layout. We’ll share the key technical decisions, challenges faced and lessons learned along the way. The migration process wasn’t without its hurdles, so we’ll walk you through our detailed, step-by-step approach covering planning, execution and validation. We will also showcase the benefits realized, such as improved data governance, more efficient data access and enhanced operational performance. Join us to gain practical insights into executing complex data migrations with a focus on security, flexibility and long-term scalability.

Schiphol Group’s Transformation to Unity Catalog

Discover how Europe’s third-busiest airport, Schiphol Group, is elevating its data operations by transitioning from a standard Databricks setup to the advanced capabilities of Unity Catalog. In this session, we will share the motivations, obstacles and strategic decisions behind executing a seamless migration in a large-scale environment — one that spans hundreds of workspaces and demands continuous availability. Gain insights into planning and governance, learn how to safeguard data integrity and maintain operational flow, and understand the process of integrating Unity Catalog’s enhanced security and governance features. Attendees will leave with practical lessons from our hands-on experience, proven methods for similar migrations, and a clear perspective on the benefits this transition offers for complex, rapidly evolving organizations.

Securing Capital Markets: AI-Powered Risk Management for Resilience

In capital markets, mitigating risk is critical to protecting the firm’s reputation, assets, and clients. This session highlights how firms use technology to enhance risk management, ensure compliance and safeguard operations from emerging threats. Learn how advanced analytics and machine learning models are helping firms detect anomalies, prevent fraud, and manage regulatory complexities with greater precision. Hear from industry leaders who have successfully implemented proactive risk strategies that balance security with operational efficiency. Key Takeaways:

- Techniques for identifying risks early using AI-powered anomaly detection.
- Best practices for achieving compliance across complex regulatory environments.
- Insights into building resilient operations that protect assets without compromising growth potential.

Don’t miss this session to discover how data intelligence is transforming risk management in capital markets—helping firms secure their future while driving success!