How do you transform a data pipeline from sluggish 10-hour batch processing into a real-time powerhouse that delivers insights in just 10 minutes? This was the challenge we tackled at one of France's largest manufacturing companies, where data integration and analytics were mission-critical for supply chain optimization. Power BI dashboards needed to refresh every 15 minutes, but our team was struggling with legacy Azure Data Factory batch pipelines. These outdated processes couldn't keep up, delaying insights and generating up to three incident tickets per day. We identified Lakeflow Declarative Pipelines and Databricks SQL as the game-changing solution to modernize our workflow, implement quality checks, and cut processing times.

In this session, we'll dive into the key factors behind our success:
- Pipeline modernization with Lakeflow Declarative Pipelines: improving scalability
- Data quality enforcement: clean, reliable datasets
- Seamless BI integration: using Databricks SQL to power fast, efficient queries in Power BI
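To give a flavor of the data quality enforcement point, here is a minimal sketch of a declarative pipeline using the Delta Live Tables Python API that underpins Lakeflow Declarative Pipelines. The table names (raw_orders, cleaned_orders), the storage path, and the expectation rules are illustrative assumptions, not taken from the production pipeline discussed in the talk.

```python
# Minimal sketch of a Lakeflow Declarative Pipeline (Delta Live Tables Python API).
# Table names, path, and expectation rules are illustrative assumptions only.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw supply chain orders ingested incrementally from cloud storage.")
def raw_orders():
    # Auto Loader picks up new files incrementally; `spark` is provided by the pipeline runtime.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/orders/")  # placeholder path
    )

@dlt.table(comment="Validated orders ready for Databricks SQL / Power BI consumption.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
@dlt.expect_or_drop("valid_quantity", "quantity > 0")
def cleaned_orders():
    # Rows failing the expectations above are dropped and surfaced in pipeline quality metrics.
    return (
        dlt.read_stream("raw_orders")
        .withColumn("ingested_at", F.current_timestamp())
    )
```

A downstream table like cleaned_orders can then be served through a Databricks SQL warehouse and connected to Power BI, which is the pattern the abstract's third point refers to for meeting the 15-minute refresh window.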
Speaker
Sidney Cardoso
1 talk
As a Data & Solutions Architect, I excel in designing and implementing robust, scalable data platforms based on modern cloud-native principles. My expertise spans architecting end-to-end data ecosystems, from ingestion to analytics, leveraging cutting-edge technologies like Databricks, Azure, and GCP. I orchestrate complex data workloads with Kubernetes, ensure real-time data flow via Kafka, and lead application modernization. Passionate about turning challenges into opportunities, I deliver innovative, efficient data solutions that harness data for strategic advantage and digital transformation. My focus is on guiding organizations to build future-proof, resilient architectures aligned with evolving business needs.
Bio from: Data + AI Summit 2025
Talks & appearances
1 activity