talk-data.com

Topic: AI/ML (Artificial Intelligence/Machine Learning)

Related tags: data_science, algorithms, predictive_analytics

9014 activities tagged

Activity Trend: peak of 1,532 activities per quarter over 2020-Q1 to 2026-Q1

Activities

9014 activities · Newest first

Sponsored by: West Monroe | Disruptive Forces: LLMs and the New Age of Data Engineering

This session examines the seismic shift Large Language Models are unleashing on data engineering, challenging traditional workflows. LLMs eliminate inefficiencies and redefine productivity, automating complex tasks like documentation, code translation and data model development with unprecedented speed and precision. Integrating LLMs into tooling promises to reduce offshore dependency, fostering agile onshore innovation. Harnessing LLMs' full potential is not without challenges, requiring deep dives into domain-specific data and strategic business alignment. The session will address deploying LLMs effectively, overcoming data management hurdles, and fostering collaboration between engineers and stakeholders. Join us to explore a future where LLMs redefine possibilities, and position your organization as a leader in AI-driven data engineering.

AI in Motion: Build a Roadmap for Impact in Just 30 Minutes

This high-velocity workshop is designed for data and AI leaders seeking to rapidly develop a comprehensive AI strategy tailored to their organization's needs. In just 30 minutes, participants will engage in a focused, interactive session that delivers actionable insights and a strategic framework for AI implementation. Key components of the workshop include:

- Rapid assessment: Quickly evaluate your organization's AI readiness and potential impact areas
- Strategic alignment: Align AI initiatives with core business objectives and value creation opportunities
- Resource optimization: Identify the critical resources, skills and technologies required for successful AI adoption
- Risk mitigation: Address key challenges and ethical considerations in AI deployment
- Priority areas: Decide where to focus first to get the best start

By the end of this intensive session, you will have the foundation of a robust AI strategy and guidance on roadmap execution.

Building and Scaling Production AI Systems With Mosaic AI

Ready to go beyond the basics of Mosaic AI? This session will walk you through how to architect and scale production-grade AI systems on the Databricks Data Intelligence Platform. We’ll cover practical techniques for building end-to-end AI pipelines — from processing structured and unstructured data to applying Mosaic AI tools and functions for model development, deployment and monitoring. You’ll learn how to integrate experiment tracking with MLflow, apply performance tuning and use built-in frameworks to manage the full AI lifecycle. By the end, you’ll be equipped to design, deploy and maintain AI systems that deliver measurable outcomes at enterprise scale.
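The lifecycle the session describes — tracking experiments, tuning, and picking the best run — can be sketched in a few lines. The following is a minimal, hypothetical stand-in for an experiment tracker such as MLflow, using only the standard library; the class and metric names are illustrative, not Databricks APIs.

```python
import time

class RunTracker:
    """Minimal stand-in for an experiment tracker such as MLflow."""
    def __init__(self):
        self.runs = []

    def log_run(self, params, metrics):
        # Record one training run's parameters and resulting metrics.
        self.runs.append({"params": params, "metrics": metrics, "ts": time.time()})

    def best_run(self, metric, maximize=True):
        # Select the run that optimizes the chosen metric.
        key = lambda r: r["metrics"][metric]
        return max(self.runs, key=key) if maximize else min(self.runs, key=key)

tracker = RunTracker()
# Simulate a small hyperparameter sweep with a toy score peaking at lr=0.01.
for lr in (0.1, 0.01, 0.001):
    accuracy = 0.9 - abs(lr - 0.01) * 5
    tracker.log_run({"lr": lr}, {"accuracy": accuracy})

best = tracker.best_run("accuracy")
print(best["params"])  # {'lr': 0.01}
```

In a real pipeline the `log_run` calls would be replaced by `mlflow.log_param`/`mlflow.log_metric` inside an `mlflow.start_run()` context, with the tracking server handling storage and comparison.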

Building Tool-Calling Agents With Databricks Agent Framework and MCP

Want to create AI agents that can do more than just generate text? Join us to explore how combining Databricks' Mosaic AI Agent Framework with the Model Context Protocol (MCP) unlocks powerful tool-calling capabilities. We'll show you how MCP provides a standardized way for AI agents to interact with external tools, data and APIs, solving the headache of fragmented integration approaches. Learn to build agents that can retrieve both structured and unstructured data, execute custom code and tackle real enterprise challenges. Key takeaways:

- Implementing MCP-enabled tool-calling in your AI agents
- Prototyping in AI Playground and exporting for deployment
- Integrating Unity Catalog functions as agent tools
- Ensuring governance and security for enterprise deployments

Whether you're building customer service bots or data analysis assistants, you'll leave with practical know-how to create powerful, governed AI agents.
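At its core, tool calling means the model emits a structured request naming a registered function, and the runtime executes it. Below is a hedged, self-contained sketch of that dispatch loop — the registry and `lookup_order` tool are hypothetical, and this loosely mirrors (but is not) the MCP or Mosaic AI Agent Framework API.

```python
import json

# Hypothetical tool registry: each tool is a plain Python function, loosely
# mirroring how a protocol like MCP exposes callable tools to an agent.
TOOLS = {}

def tool(name):
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("lookup_order")
def lookup_order(order_id: str) -> dict:
    # Stand-in for a structured-data retrieval tool (e.g., a SQL function).
    return {"order_id": order_id, "status": "shipped"}

def dispatch(tool_call_json: str):
    """Execute a model-emitted call of the form {"name": ..., "arguments": {...}}."""
    call = json.loads(tool_call_json)
    return TOOLS[call["name"]](**call["arguments"])

# A model would emit this string at inference time; hard-coded here for illustration.
result = dispatch('{"name": "lookup_order", "arguments": {"order_id": "A-17"}}')
print(result["status"])  # shipped
```

In a governed deployment, the registry role here would be played by Unity Catalog functions, so access control applies to the tools themselves rather than to ad hoc code.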

Driving Secure AI Innovation with Obsidian Security, Databricks, and PointGuard AI

As enterprises adopt AI and Large Language Models (LLMs), securing and governing these models - and the data used to train them - is essential. In this session, learn how Databricks Partner PointGuard AI helps organizations implement the Databricks AI Security Framework to manage AI-specific risks, ensuring security, compliance, and governance across the entire AI lifecycle. Then, discover how Obsidian Security provides a robust approach to AI security, enabling organizations to confidently scale AI applications.

Evolving Data Insights With Privacy at Mastercard

Mastercard is a global technology company whose role is anchored in trust. It supports 3.4 billion cards and over 143 billion transactions annually. To address customers’ increasing data volume and complex privacy needs, Mastercard has developed a novel service atop Databricks’ Clean Rooms and broader Data Intelligence Platform. This service combines several Databricks components with Mastercard’s IP, providing an evolved method for data-driven insights and value-added services while ensuring a unique standalone turnkey service. The result is a secure environment where multiple parties can collaborate on sensitive data without directly accessing each other’s information. After this session, attendees will understand how Mastercard used its expertise in privacy-enhancing technologies to create collaboration tools powered by Databricks’ Clean Rooms, AI/BI, Apps, Unity Catalog, Workflows and DatabricksIQ — as well as how to take advantage of this new privacy-enhancing service directly.

Looking for a practical workshop on building an AI agent on Databricks? Well, we have just the thing for you. This hands-on workshop takes you through the process of creating intelligent agents that can reason their way to useful outcomes. You'll start by building your own toolkit of SQL and Python functions that give your agent practical capabilities. Then we'll explore how to select the right foundation model for your needs, connect your custom tools, and watch as your agent tackles complex challenges through visible reasoning paths. The workshop doesn't stop at building: you'll use evaluation datasets to identify where your agent shines and where it needs improvement. After implementing and measuring your changes, we'll explore deployment strategies, including a feedback collection interface that enables continuous improvement and governance mechanisms to ensure responsible AI usage in production environments.
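The evaluation step described above boils down to running the agent over a labeled dataset and scoring its answers. Here is a minimal sketch under stated assumptions: `toy_agent` is a hypothetical stand-in for a deployed agent, and exact-match accuracy stands in for richer judge-based metrics.

```python
# Minimal evaluation loop: run an agent over a labeled dataset and score it.
def toy_agent(question: str) -> str:
    # Hypothetical agent: a lookup table standing in for a real endpoint.
    answers = {"capital of France?": "Paris", "2 + 2?": "4"}
    return answers.get(question, "unknown")

eval_dataset = [
    {"question": "capital of France?", "expected": "Paris"},
    {"question": "2 + 2?", "expected": "4"},
    {"question": "largest ocean?", "expected": "Pacific"},  # the agent will miss this
]

results = [toy_agent(ex["question"]) == ex["expected"] for ex in eval_dataset]
accuracy = sum(results) / len(results)
print(f"accuracy = {accuracy:.2f}")  # accuracy = 0.67
```

The per-example results, not just the aggregate, are what tell you where the agent "needs improvement" — the failing third example points at a coverage gap in the agent's tools or knowledge.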

Most organizations run complex cloud data architectures that silo applications, users and data. Join this interactive hands-on workshop to learn how Databricks SQL allows you to operate a multi-cloud lakehouse architecture that delivers data warehouse performance at data lake economics — with up to 12x better price/performance than traditional cloud data warehouses. Here’s what we’ll cover:

- How Databricks SQL fits in the Data Intelligence Platform, enabling you to operate a multicloud lakehouse architecture that delivers data warehouse performance at data lake economics
- How to manage and monitor compute resources, data access and users across your lakehouse infrastructure
- How to query directly on your data lake using your tools of choice or the built-in SQL editor and visualizations
- How to use AI to increase productivity when querying, completing code or building dashboards

Ask your questions during this hands-on lab, and the Databricks experts will guide you.
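Querying the lakehouse from the SQL editor is ordinary ANSI-style SQL over governed tables. As a purely local illustration of the kind of aggregate query involved, the sketch below runs the same statement against an in-memory SQLite table; the `sales` table and its data are invented for the example, and SQLite is only a stand-in, not the Databricks SQL engine.

```python
import sqlite3

# Local stand-in: an aggregate query of the sort you might run in the
# Databricks SQL editor, executed here against in-memory SQLite.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("west", 100.0), ("west", 50.0), ("east", 75.0)],
)

rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('west', 150.0), ('east', 75.0)]
```

Against Databricks SQL the same statement would be issued through a SQL warehouse (e.g., via the SQL editor or a connector), with Unity Catalog governing which tables the query can see.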

How the Texas Rangers Use a Unified Data Platform to Drive World Class Baseball Analytics

Don't miss this session, where we demonstrate how the Texas Rangers baseball team is staying one step ahead of the competition by going back to the basics. After the Rangers implemented a modern data strategy with Databricks and won the 2023 World Series, the rest of the league quickly followed suit. Now more than ever, data and AI are a central pillar of every baseball team's strategy, driving profound insights into player performance and game dynamics. With a 'fundamentals win games' back-to-basics focus, join us as we explain our commitment to world-class data quality, engineering and MLOps, taking full advantage of the Databricks Data Intelligence Platform. From system tables to federated querying, find out how the Rangers use every tool at their disposal to stay one step ahead in the hyper-competitive world of baseball.

Intuit's Privacy-Safe Lending Marketplace: Leveraging Databricks Clean Rooms

Intuit leverages Databricks Clean Rooms to create a secure, privacy-safe lending marketplace, enabling small business lending partners to perform analytics and deploy ML/AI workflows on sensitive data assets. This session explores the technical foundations of building isolated clean rooms across multiple partners and cloud providers, differentiating Databricks Clean Rooms from market alternatives. We'll demonstrate our automated approach to clean room lifecycle management using APIs, covering creation, collaborator onboarding, data asset sharing, workflow orchestration and activity auditing. The integration with Unity Catalog for managing clean room inputs and outputs will also be discussed. Attendees will gain insights into harnessing collaborative ML/AI potential, supporting various languages and workloads, and enabling complex computations in Clean Rooms without compromising sensitive information.

MLOps That Ships: Accelerating AI Deployment at Vizient

Deploying AI models efficiently and consistently is a challenge many organizations face. This session will explore how Vizient built a standardized MLOps stack using Databricks and Azure DevOps to streamline model development, deployment and monitoring. Attendees will gain insights into how Databricks Asset Bundles were leveraged to create reproducible, scalable pipelines and how Infrastructure-as-Code principles accelerated onboarding for new AI projects. The talk will cover:

- End-to-end MLOps stack setup, ensuring efficiency and governance
- CI/CD pipeline architecture, automating model versioning and deployment
- Standardizing AI model repositories, reducing development and deployment time
- Lessons learned, including challenges and best practices

By the end of this session, participants will have a roadmap for implementing a scalable, reusable MLOps framework that enhances operational efficiency across AI initiatives.
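A recurring building block in CI/CD pipelines like the one described is a promotion gate: the pipeline deploys a candidate model only if it beats the production baseline by some margin. The sketch below is illustrative only — the metric name, threshold, and function are assumptions, not Vizient's implementation.

```python
# Sketch of a CI/CD promotion gate: promote a candidate model only if it
# beats the production baseline on the tracked metric by a minimum gain.
def should_promote(candidate_metrics: dict, production_metrics: dict,
                   metric: str = "auc", min_gain: float = 0.01) -> bool:
    return candidate_metrics[metric] >= production_metrics[metric] + min_gain

prod = {"auc": 0.82}
candidate = {"auc": 0.85}
print(should_promote(candidate, prod))  # True

marginal = {"auc": 0.825}  # improvement below the required gain
print(should_promote(marginal, prod))  # False
```

In an Azure DevOps pipeline, a check like this would sit between the training stage and the deployment stage, with the metrics pulled from the experiment tracker rather than hard-coded.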

Scaling Data Engineering Pipelines: Preparing Credit Card Transactions Data for Machine Learning

We discuss two real-world use cases in big data engineering, focusing on constructing stable pipelines and managing storage at petabyte scale. The first use case highlights the implementation of Delta Lake to optimize data pipelines, resulting in an 80% reduction in query time and a 70% reduction in storage space. The second use case demonstrates the effectiveness of the Workflows ‘ForEach’ operator in executing compute-intensive pipelines across multiple clusters, reducing processing time from months to days. This approach uses a reusable design pattern that isolates notebooks into units of work, enabling data scientists to develop and test each unit independently.
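The 'ForEach' fan-out pattern described above — isolated units of work executed in parallel — can be sketched locally with the standard library. In Workflows the units would be parameterized notebooks running on separate clusters; here, as an assumption-laden stand-in, each unit is a plain function and the workers are threads.

```python
from concurrent.futures import ThreadPoolExecutor

def process_partition(partition_id: int) -> int:
    # Stand-in for one compute-intensive notebook run over one partition.
    # Keeping this a pure function is what makes the unit independently testable.
    return partition_id * partition_id

partitions = range(8)
with ThreadPoolExecutor(max_workers=4) as pool:
    # Fan the units of work out across workers, preserving input order.
    results = list(pool.map(process_partition, partitions))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The design choice carries over directly: because each unit takes its partition as an explicit parameter and shares no state, scaling from one worker to many clusters changes throughput without changing correctness.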

Scaling Success: How Banks are Unlocking Growth With Data and AI

Growth in banking isn’t just about keeping pace — it’s about setting the pace. This session explores how leading banks leverage Databricks’ Data Intelligence Platform to uncover new revenue opportunities, deepen customer relationships and expand market reach. Hear from industry leaders who have transformed their growth strategies by harnessing the power of advanced analytics and machine learning. Learn how personalized customer experiences, predictive insights and unified data platforms are driving innovation and helping banks scale faster than ever. Key takeaways:

- Proven strategies for identifying untapped growth opportunities using data-driven approaches
- Real-world examples of banks creating personalized customer journeys that boost retention and loyalty
- Tools and techniques to accelerate innovation while maintaining operational efficiency

Join us in discovering how data intelligence is redefining growth in banking and helping banks thrive through uncertainty.

Sponsored by: Accenture & Avanade | How data strategy powers mission-critical work at the Gates Foundation

There’s never been a more critical time to ensure data and analytics foundations can deliver the value and efficiency needed to accelerate and scale AI. What are the most difficult challenges that organizations face with data transformation, and which technologies, processes and decisions overcome these barriers to success? Join this session featuring executives from the Gates Foundation, the nonprofit leading change in communities around the globe, and Avanade, the joint venture between Accenture and Microsoft, in a discussion about impactful data strategy. Learn about the Gates Foundation’s approach to its enterprise data platform, which ensures trusted insights at the speed of today’s business. We’ll also share lessons learned from Avanade’s work helping organizations around the globe build with Databricks and seize the AI opportunity.

Sponsored by: KPMG | Enhancing Regulatory Compliance through Data Quality and Traceability

In highly regulated industries like financial services, maintaining data quality is an ongoing challenge. Reactive measures often fail to prevent regulatory penalties, causing inaccuracies in reporting and inefficiencies due to poor data visibility. Regulators closely examine the origins and accuracy of reporting calculations to ensure compliance, so a robust system for data quality and lineage is crucial. Organizations are utilizing Databricks to proactively improve data quality through rules-based and AI/ML-driven methods. This fosters complete visibility across IT, data management and business operations, facilitating rapid issue resolution and continuous data quality enhancement. The outcome is quicker, more accurate, transparent financial reporting. We will detail a framework for data observability and offer practical examples of implementing quality checks throughout the data lifecycle, with a specific focus on building data pipelines for regulatory reporting.
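The rules-based half of the approach described above amounts to evaluating a set of named predicates over each record and collecting failures for an audit trail. The following is a minimal sketch under stated assumptions — the rule names, record schema, and report shape are all invented for illustration, not KPMG's framework.

```python
# Rules-based data quality checks: each rule is a named predicate over a
# record; failures are collected so they can feed lineage/audit reporting.
RULES = {
    "amount_non_negative": lambda r: r["amount"] >= 0,
    "currency_present": lambda r: bool(r.get("currency")),
}

def run_checks(records):
    failures = []
    for i, rec in enumerate(records):
        for name, rule in RULES.items():
            if not rule(rec):
                failures.append({"row": i, "rule": name})
    return failures

records = [
    {"amount": 120.0, "currency": "USD"},
    {"amount": -5.0, "currency": "USD"},   # violates amount_non_negative
    {"amount": 30.0, "currency": ""},      # violates currency_present
]
report = run_checks(records)
print(report)
# [{'row': 1, 'rule': 'amount_non_negative'}, {'row': 2, 'rule': 'currency_present'}]
```

At pipeline scale the same idea runs as column-level expectations inside the pipeline itself, so bad rows are quarantined before they reach a regulatory report rather than discovered afterward.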

Sponsored by: LTIMindtree | 4 Strategies to Maximize SAP Data Value with Databricks and AI

As enterprises strive to become more data-driven, SAP continues to be central to their operational backbone. However, traditional SAP ecosystems often limit the potential of AI and advanced analytics due to fragmented architectures and legacy tools. In this session, we explore four strategic options for unlocking greater value from SAP data by integrating with Databricks and cloud-native platforms. Whether you're on ECC or S/4HANA, or transitioning from BW, learn how to modernize your data landscape, enable real-time insights, and power AI/ML at scale. Discover how SAP Business Data Cloud and SAP Databricks can help you build a unified, future-ready data and analytics ecosystem — without compromising on scalability, flexibility, or cost-efficiency.

Sponsored by: Monte Carlo | Cleared for Takeoff: How American Airlines Builds Data Trust

American Airlines, one of the largest airlines in the world, processes a tremendous amount of data every single minute. With a data estate of this scale, accountability for the data goes beyond the data team; the business organization has to be equally invested in championing the quality, reliability, and governance of data. In this session, Andrew Machen, Senior Manager, Data Engineering at American Airlines will share how his team maximizes resources to deliver reliable data at scale. He'll also outline his strategy for aligning business leadership with an investment in data reliability, and how leveraging Monte Carlo's data + AI observability platform enabled them to reduce time spent resolving data reliability issues from 10 weeks to 2 days, saving millions of dollars and driving valuable trust in the data.

The AI Regulation Dilemma: Spur Innovation, or Guardrails? — Where Are We and the Impact of Trump 2

The Trump 2 AI agenda prioritizes US AI leadership by opposing AI regulation on bias and frontier AI risks, favoring innovation and AI expansion. With comprehensive federal AI regulation unlikely, states are advancing AI laws addressing bias, harmful content, transparency, frontier model risk and other risks. Meanwhile, the EU AI Act effectively imposes global obligations. The emerging patchwork of state rules will burden US companies more than would a unified federal approach, seemingly undermining White House deregulatory goals. So, ironically, the Trump team AI agenda may accelerate disparate state-level regulation and impede AI innovation. US companies therefore face a fragmented landscape similar to privacy regulation where the EU AI Act — in the role of GDPR — has set the stage, and the states are asserting themselves with various incremental requirements. Other recent developments covered will include the finalization of the EU GPAI Code of Practice, certain newly enacted state laws, and a quick overview of AI regulation outside the U.S. and EU.

The Full Stack of Innovation: Building Data and AI Products With Databricks Apps

In this deep-dive technical session, Ivan Trusov (Sr. SSA @ Databricks) and Giran Moodley (SA @ Databricks) will explore the full-stack development of Databricks Apps, covering everything from frameworks to deployment. We’ll walk through essential topics, including:

- Frameworks & tooling — Pythonic (Dash, Streamlit, Gradio) vs. JS + Python stack
- Development lifecycle — Debugging, issue resolution and best practices
- Testing — Unit, integration and load testing strategies
- CI/CD & deployment — Automating with Databricks Asset Bundles
- Monitoring & observability — OpenTelemetry, metrics collection and analysis

Expect a highly practical session with several live demos, showcasing the development loop, testing workflows and CI/CD automation. Whether you’re building internal tools or AI-powered products, this talk will equip you with the knowledge to ship robust, scalable Databricks Apps.

Use External Models in Databricks: Connecting to Azure, AWS, Google Cloud, Anthropic and More

In this session, you will learn how to leverage a wide set of GenAI models in Databricks, including external connections to cloud vendors and other model providers. We will cover establishing connections to externally served models via Mosaic AI Gateway, showcasing connections to Azure, AWS and Google Cloud models, as well as model vendors like Anthropic, Cohere, AI21 Labs and more. You will also discover best practices for model comparison, governance and cost control on those model deployments.
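One concrete piece of the cost-control practice mentioned above is estimating per-request spend across providers from their token prices. The sketch below uses entirely made-up provider names and rates — real prices vary by model and vendor — and simply shows the arithmetic a gateway-level cost comparison rests on.

```python
# Illustrative per-provider prices (USD per 1K tokens); placeholder values only.
PRICE_PER_1K_TOKENS = {"provider_a": 0.002, "provider_b": 0.010}

def estimate_cost(provider: str, prompt_tokens: int, completion_tokens: int) -> float:
    # Cost = total tokens, scaled to thousands, times the provider's rate.
    rate = PRICE_PER_1K_TOKENS[provider]
    return (prompt_tokens + completion_tokens) / 1000 * rate

# Compare the same 1,000-token request across providers.
costs = {p: estimate_cost(p, prompt_tokens=800, completion_tokens=200)
         for p in PRICE_PER_1K_TOKENS}
print(costs)  # {'provider_a': 0.002, 'provider_b': 0.01}
```

A gateway that routes all external model traffic through one place can log token counts per request, which is what makes this kind of comparison (and rate limiting or budget alerts on top of it) possible.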