talk-data.com

Topic: Cyber Security
Tags: cybersecurity, information_security, data_security, privacy
Tagged activities: 2078
Activity Trend: 297 peak/qtr (2020-Q1 to 2026-Q1)

Activities

2078 activities · Newest first

Enabling Sleep Science Research With Databricks and Delta Sharing

Leveraging Databricks as a platform, we facilitate the sharing of anonymized datasets across various Databricks workspaces and accounts, spanning multiple cloud environments such as AWS, Azure, and Google Cloud. This capability, powered by Delta Sharing, extends both within and outside Sleep Number, enabling accelerated insights while ensuring compliance with data security and privacy standards. In this session, we will showcase our architecture and implementation strategy for data sharing, highlighting the use of Databricks’ Unity Catalog and Delta Sharing, along with integration with platforms like Jira, Jenkins, and Terraform to streamline project management and system orchestration.
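
As a rough illustration of the provider-side setup such an architecture relies on, the sketch below creates a Delta Share, adds an anonymized table, and grants a recipient read access. The share, table, and recipient names are hypothetical, not Sleep Number's actual objects; it assumes a Unity Catalog-enabled workspace with the CREATE SHARE and CREATE RECIPIENT privileges.

# Minimal provider-side Delta Sharing sketch (object names are hypothetical).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Create a share and add an anonymized dataset to it.
spark.sql("CREATE SHARE IF NOT EXISTS sleep_research_share")
spark.sql("""
    ALTER SHARE sleep_research_share
    ADD TABLE research_catalog.curated.anonymized_sleep_sessions
""")

# Register a recipient (for example, another Databricks account on a different
# cloud) and grant it read access to the share.
spark.sql("CREATE RECIPIENT IF NOT EXISTS external_research_partner")
spark.sql("GRANT SELECT ON SHARE sleep_research_share TO RECIPIENT external_research_partner")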

Mastering Data Security and Compliance: CoorsTek's Journey With Databricks Unity Catalog

Ensuring data security and meeting compliance requirements are critical priorities for businesses operating in regulated industries, where the stakes are high and the standards are stringent. We will showcase how CoorsTek, a global leader in technical ceramics manufacturing, partnered with Databricks to leverage the power of Unity Catalog for addressing regulatory challenges while achieving significant operational efficiency gains. We'll dive into the migration journey, highlighting the adoption of key features such as role-based access control (RBAC), comprehensive data lineage tracking and robust auditing capabilities. Attendees will gain practical insights into the strategies and tools used to manage sensitive data, ensure compliance with industry standards and optimize cloud data architectures. Additionally, we'll share real-world lessons learned, best practices for integrating compliance into a modern data ecosystem and actionable takeaways for leveraging Databricks as a catalyst for secure and compliant data innovation.
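
The sketch below shows the general shape of role-based grants and audit review in Unity Catalog; the catalog, schema, and group names are invented for illustration and do not reflect CoorsTek's environment.

# Illustrative Unity Catalog RBAC grants plus an audit-log check (names are
# hypothetical); assumes the caller can grant on these objects.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Grant role-aligned privileges at the catalog and schema level.
spark.sql("GRANT USE CATALOG ON CATALOG manufacturing TO `quality_engineers`")
spark.sql("GRANT USE SCHEMA, SELECT ON SCHEMA manufacturing.process_data TO `quality_engineers`")
spark.sql("GRANT ALL PRIVILEGES ON SCHEMA manufacturing.raw TO `data_platform_admins`")

# Review recent governance activity via the built-in audit system table.
audit = spark.sql("""
    SELECT event_time, user_identity.email, action_name, request_params
    FROM system.access.audit
    WHERE service_name = 'unityCatalog'
    ORDER BY event_time DESC
    LIMIT 100
""")
audit.show(truncate=False)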

Smart Vehicles, Secure Data: Recreating Vehicle Environments for Privacy-Preserving Machine Learning

As connected vehicles generate vast amounts of personal and sensitive data, ensuring privacy and security in machine learning (ML) processes is essential. This session explores how Trusted Execution Environments (TEEs) and Azure Confidential Computing can enable privacy-preserving ML in cloud environments. We’ll present a method to recreate a vehicle environment in the cloud, where sensitive data remains private throughout model training, inference and deployment. Attendees will learn how Mercedes-Benz R&D North America builds secure, privacy-respecting personalized systems for the next generation of connected vehicles.

Swimming at Our Own Lakehouse: How Databricks Uses Databricks

This session is repeated. Peek behind the curtain to learn how Databricks processes hundreds of petabytes of data across every region and cloud where we operate. Learn how Databricks leverages Data and AI to scale and optimize every aspect of the company, from facilities and legal to sales and marketing and, of course, product research and development. This session is a high-level tour inside Databricks to see how Data and AI enable us to be a better company. We will walk through the architecture behind internal use cases such as business analytics and SIEM, as well as customer-facing features like system tables and the Assistant. We will also cover how our data flows through production and how we maintain security and privacy while operating a large multi-cloud, multi-region environment.
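
For readers unfamiliar with system tables, here is a minimal, illustrative SIEM-style query over the audit system table; the specific action name, status filter, threshold, and window are assumptions made for the example, not Databricks' internal detection logic.

# Illustrative SIEM-style query over the Databricks audit system table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

suspicious_logins = (
    spark.table("system.access.audit")
    .where(F.col("action_name") == "login")
    .where(F.col("response.status_code") != 200)
    .groupBy("user_identity.email", F.window("event_time", "15 minutes"))
    .count()
    .where(F.col("count") >= 5)   # several failed logins in a short window
)
suspicious_logins.show(truncate=False)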

Sponsored by: Immuta | Agentic Impact to Secure Data Provisioning

As AI, internal data marketplaces, and self-service access become more popular, data teams must rethink how they securely govern and provision data at scale. Success depends on provisioning data in a way that balances security, compliance, and innovation, and promotes data-driven decision making even when the decision makers are AI agents. In this session, we'll discuss how you can:
- Launch and manage effective and secure data provisioning
- Secure your AI initiatives
- Scale your Data Governors through Agentic AI
Join us to learn how to navigate the complexities of modern data environments and start putting your data to work faster.

How Databricks Powers Real-Time Threat Detection at Barracuda XDR

As cybersecurity threats grow in volume and complexity, organizations must efficiently process security telemetry for best-in-class detection and mitigation. Barracuda’s XDR platform is redefining security operations by layering advanced detection methodologies over a broad range of supported technologies. Our vision is to deliver unparalleled protection through automation, machine learning and scalable detection frameworks, ensuring threats are identified and mitigated quickly. To achieve this, we have adopted Databricks as the foundation of our security analytics platform, providing greater control and flexibility while decoupling from traditional SIEM tools. By leveraging Lakeflow Declarative Pipelines, Spark Structured Streaming and detection-as-code CI/CD pipelines, we have built a real-time detection engine that enhances scalability, accuracy and cost efficiency. This session explores how Databricks is shaping the future of XDR through real-time analytics and cloud-native security.
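
A minimal detection-as-code sketch in Spark Structured Streaming is shown below. The table names, schema, and the brute-force rule itself are hypothetical; they only illustrate the streaming pattern the abstract describes, not Barracuda's actual detections.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Normalized security telemetry, written upstream by ingestion pipelines.
events = spark.readStream.table("security.bronze.auth_events")

# Example rule: many failed authentications from one host in a 5-minute window.
detections = (
    events
    .withWatermark("event_time", "10 minutes")
    .where(F.col("outcome") == "failure")
    .groupBy(F.window("event_time", "5 minutes"), "host_id")
    .count()
    .where(F.col("count") > 20)
    .withColumn("rule_id", F.lit("AUTH-BRUTE-FORCE-001"))
)

# Append finalized windows to a detections table for triage and alerting.
(
    detections.writeStream
    .outputMode("append")
    .option("checkpointLocation", "/Volumes/security/detections/_checkpoints/auth_bf")
    .toTable("security.gold.detections")
)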

How Nubank Improves Governance, Security and User Experience With Unity Catalog

At Nubank, we successfully migrated to Unity Catalog, addressing the needs of our large-scale data environment with 3,000 active users, over 4,000 notebooks and jobs and 1.1 million tables, including sensitive PII data. Our primary objectives were to enhance data governance, security and user experience. Key points:
- Comprehensive data access monitoring and control implementation
- Enhanced security measures for handling PII and sensitive data (see the column-mask sketch below)
- Efficient migration of 4,000+ notebooks and jobs to the new system
- Improved cataloging and governance for 1.1 million tables
- Implementation of a robust access controls and permissions model
- Optimized user experience and productivity through centralized data management
This migration significantly improved our data governance capabilities, enhanced security measures and provided a more user-friendly experience for our large user base, ultimately leading to better control and utilization of our vast data resources.
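
As context for the PII controls mentioned above, here is a minimal column-mask sketch using Unity Catalog; the function, table, and group names are hypothetical rather than Nubank's actual objects.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# A SQL UDF that reveals the raw value only to members of a privileged group.
spark.sql("""
    CREATE OR REPLACE FUNCTION governance.masks.mask_email(email STRING)
    RETURNS STRING
    RETURN CASE
        WHEN is_account_group_member('pii_readers') THEN email
        ELSE '***redacted***'
    END
""")

# Attach the mask to a sensitive column; other readers see the redacted value.
spark.sql("""
    ALTER TABLE customers.core.profiles
    ALTER COLUMN email SET MASK governance.masks.mask_email
""")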

Implementing GreenOps in Databricks: A Practical Guide for Regulated Environments

Join us on a technical journey into GreenOps at ABN AMRO Bank using Databricks system tables. We'll explore security, implementation challenges and best-practice verification, with practical examples and actionable reports. Discover how to optimize resource usage, ensure compliance and maintain agility. We'll discuss best practices, potential pitfalls and the nuanced 'it depends' scenarios, offering a comprehensive guide for intermediate to advanced practitioners.
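
A simple example of the kind of report system tables enable is sketched below; the aggregation is illustrative only and is not ABN AMRO's implementation.

# Illustrative usage report over the billing system table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

monthly_dbus = (
    spark.table("system.billing.usage")
    .groupBy(
        F.date_trunc("month", F.col("usage_start_time")).alias("month"),
        "workspace_id",
        "sku_name",
    )
    .agg(F.sum("usage_quantity").alias("dbus"))
    .orderBy(F.desc("dbus"))
)
monthly_dbus.show(truncate=False)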

Reimagining Data Governance and Access at Atlassian

Atlassian is rebuilding its central lakehouse from the ground up to deliver a more secure, flexible and scalable data environment. In this session, we’ll share how we leverage Unity Catalog for fine-grained governance and supplement it with Immuta for dynamic policy management, enabling row and column level security at scale. By shifting away from broad, monolithic access controls toward a modern, agile solution, we’re empowering teams to securely collaborate on sensitive data without sacrificing performance or usability. Join us for an inside look at our end-to-end policy architecture, from how data owners declare metadata and author policies to the seamless application of access rules across the platform. We’ll also discuss lessons learned on streamlining data governance, ensuring compliance, and improving user adoption. Whether you’re a data architect, engineer or leader, walk away with actionable strategies to simplify and strengthen your own governance and access practices.
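
To make the row-level security idea concrete, the sketch below defines a Unity Catalog row filter; the policy logic, function, table, and group names are hypothetical and are not Atlassian's policies.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Policy function: platform admins see every row, regional analysts only theirs.
spark.sql("""
    CREATE OR REPLACE FUNCTION governance.policies.region_filter(region STRING)
    RETURNS BOOLEAN
    RETURN is_account_group_member('data_admins')
        OR (is_account_group_member('apac_analysts') AND region = 'APAC')
""")

# Bind the filter to a table; it is evaluated for every query against it.
spark.sql("""
    ALTER TABLE analytics.core.issue_events
    SET ROW FILTER governance.policies.region_filter ON (region)
""")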

Scaling Data Intelligence at NAB: Balancing Innovation with Enterprise-Grade Governance

In this session, discover how National Australia Bank (NAB) is reshaping its data and AI strategy by positioning data as a strategic enabler. Driven by a vision to unlock data like electricity—continuous and reliable—NAB has established a scalable foundation for data intelligence that balances agility with enterprise-grade control. We'll delve into the key architectural, security, and governance capabilities underpinning this transformation, including Unity Catalog, Serverless, Lakeflow and GenAI. The session will highlight NAB's adoption of Databricks Serverless, platform security controls like private link, and persona-based data access patterns. Attendees will walk away with practical insights into building secure, scalable, and cost-efficient data platforms that fuel innovation while meeting the demands of compliance in highly regulated environments.

Sponsored by: Accenture & Avanade | Enterprise Data Journey for The Standard Insurance Leveraging Databricks on Azure and AI Innovation

Modern insurers require agile, integrated data systems to harness AI. This framework for a global insurer uses Azure Databricks to unify legacy systems into a governed lakehouse medallion architecture (bronze/silver/gold layers), eliminating silos and enabling real-time analytics. The solution employs:
- Medallion architecture for incremental data quality improvement (a minimal sketch follows this abstract)
- Unity Catalog for centralized governance, row/column security, and audit compliance
- Azure encryption and confidential computing for data mesh security
- Automated ingestion, semantic, and DevOps pipelines for scalability
By combining Databricks' distributed infrastructure with Azure's security, the insurer achieves regulatory compliance while enabling AI-driven innovation (e.g., underwriting, claims). The framework establishes a future-proof foundation for mergers and acquisitions (M&A) and cross-functional data products, balancing governance with agility.
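
A minimal sketch of the bronze-to-silver portion of such a medallion flow follows; the paths, table names, and cleansing rules are placeholders, not the insurer's actual pipelines.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw policy records incrementally with Auto Loader.
bronze = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/Volumes/insurance/bronze/_schemas/policies")
    .load("/Volumes/insurance/landing/policies/")
)
(
    bronze.writeStream
    .option("checkpointLocation", "/Volumes/insurance/bronze/_checkpoints/policies")
    .toTable("insurance.bronze.policies")
)

# Silver: basic cleansing and deduplication on top of the bronze table.
silver = (
    spark.readStream.table("insurance.bronze.policies")
    .where(F.col("policy_id").isNotNull())
    .dropDuplicates(["policy_id"])
)
(
    silver.writeStream
    .option("checkpointLocation", "/Volumes/insurance/silver/_checkpoints/policies")
    .toTable("insurance.silver.policies")
)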

Sponsored by: Astronomer | Unlocking the Future of Data Orchestration: Introducing Apache Airflow® 3

Airflow 3 is here, bringing a new era of flexibility, scalability, and security to data orchestration. This release makes building, running, and managing data pipelines easier than ever. In this session, we will cover the key benefits of Airflow 3, including:
(1) Ease of use: Airflow 3 rethinks the user experience, from an intuitive, upgraded UI to DAG versioning and scheduler-integrated backfills that let teams manage pipelines more effectively than ever before.
(2) Stronger security: By decoupling task execution from direct database connections, Airflow 3 enforces task isolation and minimal-privilege access. This meets stringent compliance standards while reducing the risk of unauthorized data exposure.
(3) Ultimate flexibility: Run tasks anywhere, anytime with remote execution and event-driven scheduling. Airflow 3 is designed for global, heterogeneous modern data environments, with an architecture that supports everything from edge and hybrid-cloud to GPU-based deployments.
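
For orientation, a minimal Airflow DAG using the TaskFlow API is sketched below; the pipeline itself is a hypothetical placeholder meant only to show the decorator-based authoring style these features build on.

from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def security_telemetry_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling telemetry from an upstream source.
        return [{"host": "edge-01", "events": 42}]

    @task
    def load(batch: list[dict]) -> None:
        # Placeholder for writing the batch to the lakehouse.
        print(f"loaded {len(batch)} records")

    load(extract())


security_telemetry_pipeline()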

Unleash the Power of Automated Data Governance: Classify, Tag and Protect Your Data — Effortlessly

Struggling to keep up with data governance at scale? Join us to explore how automated data classification, tag policies and ABAC streamline access control while enhancing security and compliance. Get an exclusive look at the new Governance Hub, built to give your teams deeper visibility into data usage, access patterns and metadata — all in one place. Whether you're managing thousands or millions of assets, discover how to classify, tag and protect your data estate effortlessly with the latest advancements in Unity Catalog.
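
As a small illustration of tag-driven governance, the sketch below tags a column with a classification and then queries the information schema for everything carrying that tag; the table, column, and tag names are hypothetical.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Tag a column as PII; governance tooling and tag-based policies can then
# target every asset carrying this tag rather than individual tables.
spark.sql("""
    ALTER TABLE customers.core.profiles
    ALTER COLUMN email
    SET TAGS ('classification' = 'pii', 'pii_type' = 'email')
""")

# Discover everything tagged as PII via the information schema.
spark.sql("""
    SELECT catalog_name, schema_name, table_name, column_name
    FROM system.information_schema.column_tags
    WHERE tag_name = 'classification' AND tag_value = 'pii'
""").show(truncate=False)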

Best Practices to Mitigate AI Security Risks

This session is repeated. AI is transforming industries, enhancing customer experiences and automating decisions. As organizations integrate AI into core operations, robust security is essential. The Databricks Security team collaborated with top cybersecurity researchers from OWASP, Gartner, NIST, HITRUST and Fortune 100 companies to evolve the Databricks AI Security Framework (DASF) to version 2.0. In this session, we'll cover an AI security architecture using Unity Catalog, MLflow, egress controls, and AI gateway. Learn how security teams, AI practitioners and data engineers can secure AI applications on Databricks. Walk away with:
- A reference architecture for securing AI applications
- A worksheet with AI risks and controls mapped to industry standards like MITRE, OWASP, NIST and HITRUST
- A DASF AI assistant tool to test your AI security
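
One concrete control in this direction is registering models in Unity Catalog through MLflow so they inherit catalog-level access control and lineage; the sketch below uses a trivial stand-in model, and the catalog, schema, and model names are hypothetical.

import mlflow
import numpy as np
from sklearn.linear_model import LogisticRegression

# Train a trivial stand-in model purely for illustration.
model = LogisticRegression().fit(np.array([[0.0], [1.0]]), [0, 1])

# Point the MLflow registry at Unity Catalog and register the model there, so
# access to it is governed like any other catalog object.
mlflow.set_registry_uri("databricks-uc")
with mlflow.start_run():
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        input_example=np.array([[0.5]]),
        registered_model_name="ml.security_demo.fraud_classifier",
    )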

Revolutionizing Data Insights and the Buyer Experience at GM Financial with Cloud Data Modernization

Deloitte and GM (General Motors) Financial have collaborated to design and implement a cutting-edge cloud analytics platform, leveraging Databricks. In this session, we will explore how we overcame challenges including dispersed and limited data capabilities, high-cost hardware and outdated software, with a strategic and comprehensive approach. With the help of Deloitte and Databricks, we were able to develop a unified Customer360 view, integrate advanced AI-driven analytics, and establish robust data governance and cyber security measures. Attendees will gain valuable insights into the benefits realized, such as cost savings, enhanced customer experiences, and broad employee upskilling opportunities. Unlock the impact of cloud data modernization and advanced analytics in the automotive finance industry and beyond with Deloitte and Databricks.

Securing Data Collaboration: A Deep Dive Into Security, Frameworks, and Use Cases

This session will focus on the security aspects of Databricks Delta Sharing, Databricks Cleanrooms and Databricks Marketplace, providing an exploration of how these solutions enable secure and scalable data collaboration while prioritizing privacy. Highlights:
- Use cases: Understand how Delta Sharing facilitates governed, real-time data exchange across platforms and how Cleanrooms support multi-party analytics without exposing sensitive information
- Security internals: Dive into Delta Sharing's security frameworks
- Dynamic views: Learn about fine-grained security controls (see the example below)
- Privacy-first Cleanrooms: Explore how Cleanrooms enable secure analytics while maintaining strict data privacy standards
- Private exchanges: Explore the role of private exchanges using Databricks Marketplace in securely sharing custom datasets and AI models with specific partners or subsidiaries
- Network security & compliance: Review best practices for network configurations and compliance measures
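
The dynamic-view item above refers to the pattern sketched here: a view that reveals raw values only to a privileged group and filters out restricted rows. The view, table, column, and group names are hypothetical.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE OR REPLACE VIEW sharing.curated.orders_safe AS
    SELECT
        order_id,
        CASE
            WHEN is_account_group_member('pii_readers') THEN customer_email
            ELSE sha2(customer_email, 256)
        END AS customer_email,
        order_total
    FROM sales.core.orders
    WHERE region <> 'restricted'
""")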

Toyota, the world's largest automaker, sought to accelerate time-to-data and empower business users with secure data collaboration for faster insights. Partnering with Cognizant, they established a Unified Data Lake, integrating SOX principles and Databricks Unity Catalog to ensure compliance and security. Additionally, they developed a Data Scanner solution to automatically detect non-sensitive data and accelerate data ingestion. Join this dynamic session to discover how they achieved it.

Best Practices for Building User-Facing AI Systems on Databricks

This session is repeated. Integrating AI agents into business systems requires tailored approaches for different maturity levels (crawl-walk-run) that balance scalability, accuracy and usability. This session addresses the critical challenge of making AI agents accessible to business users. We will explore four key integration methods:
- Databricks Apps: The fastest way to build and run applications that leverage your data, with the full security and governance of Databricks
- Genie: A tool enabling non-technical users to gain insights on structured data through natural language queries
- Chatbots: Combine real-time data retrieval with generative AI for contextual responses and process automation
- Batch inference: Scalable, asynchronous processing for large-scale AI tasks, optimizing efficiency and cost (sketched below)
We'll compare these approaches, discussing their strengths, challenges and ideal use cases to help businesses select the most suitable integration strategy for their specific needs.
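
For the batch inference method, a common pattern is applying the Databricks SQL ai_query function over a whole table; in the sketch below the table, serving endpoint name, and prompt are hypothetical.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Score an entire table asynchronously of any user-facing app, then persist
# the results for downstream consumption.
scored = spark.sql("""
    SELECT
        ticket_id,
        ai_query(
            'databricks-meta-llama-3-3-70b-instruct',
            CONCAT('Summarize this support ticket in one sentence: ', description)
        ) AS summary
    FROM support.core.tickets
""")
scored.write.mode("overwrite").saveAsTable("support.gold.ticket_summaries")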

Breaking Silos: Enabling Databricks-Snowflake Interoperability With Iceberg and Unity Catalog

As data ecosystems grow more complex, organizations often struggle with siloed platforms and fragmented governance. In this session, we'll explore how our team made Databricks the central hub for cross-platform interoperability, enabling seamless Snowflake integration through Unity Catalog and the Iceberg REST API. We'll cover:
- Why interoperability matters and the business drivers behind our approach
- How Unity Catalog and UniForm simplify interoperability, allowing Databricks to expose an Iceberg REST API for external consumption (a minimal sketch follows this abstract)
- A technical deep dive into data sharing, query performance, and access control across Databricks and Snowflake
- Lessons learned and best practices for building a multi-engine architecture while maintaining governance and efficiency
By leveraging UniForm, Delta, and Iceberg, we created a flexible, vendor-agnostic architecture that bridges Databricks and Snowflake without compromising performance or security.
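
The UniForm step referenced above can be as simple as setting table properties on a Delta table so Iceberg clients (for example, Snowflake via Unity Catalog's Iceberg REST endpoint) can read it. The table name below is hypothetical; the properties follow the documented pattern for enabling UniForm on an existing table.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Enable Iceberg metadata generation (UniForm) on an existing Delta table so
# Iceberg-compatible engines can read it through the Iceberg REST catalog.
spark.sql("""
    ALTER TABLE analytics.shared.events
    SET TBLPROPERTIES (
        'delta.columnMapping.mode' = 'name',
        'delta.enableIcebergCompatV2' = 'true',
        'delta.universalFormat.enabledFormats' = 'iceberg'
    )
""")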

Building Responsible and Resilient AI: The Databricks AI Governance Framework

GenAI and machine learning are reshaping industries, driving innovation and redefining business strategies. As organizations embrace these technologies, they face significant challenges in managing AI initiatives effectively, such as balancing innovation with ethical integrity, operational resilience and regulatory compliance. This presentation introduces the Databricks AI Governance Framework (DAGF), a practical framework designed to empower organizations to navigate the complexities of AI. It provides strategies for building scalable, responsible AI programs that deliver measurable value, foster innovation and achieve long-term success. By examining the framework's five foundational pillars (AI organization; ethics, transparency and interpretability; legal and regulatory compliance; AI operations and infrastructure; and AI security), this session highlights how AI governance aligns programs with the organization's strategic goals, mitigates risks and builds trust across stakeholders.