talk-data.com

Topic: Analytics

Tags: data_analysis, insights, metrics

4552 tagged activities

Activity Trend: peak of 398 activities/quarter, 2020-Q1 to 2026-Q1

Activities

4552 activities · Newest first

Data governance often begins with Data Defense — centralized stewardship focused on compliance and regulatory needs, built on passive metadata, manual documentation, and heavy SME reliance. While effective for audits, this top-down approach offers limited business value. 

Data governance has since moved to a Data Offense model that drives monetization of critical data assets by focusing on analytics and data science outcomes: better decision-making and improved customer and associate experiences. This involves integrating data quality and observability with a shift-left approach, grounded in tangible impact on business outcomes, improved governance maturity, and accelerated resolution of business-impacting issues.

The next phase of data stewardship is AI-Augmented and Autonomous Stewardship: embedding SME knowledge into automated workflows, managing critical assets autonomously, and delivering actionable context through proactive, shift-left observability, producer–consumer contracts, and SLAs built into data product development.
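To make the idea of producer–consumer contracts and SLAs concrete, here is a minimal Python sketch of a contract check that a data product pipeline could run before publishing a batch; the dataset name, fields, and six-hour freshness SLA are hypothetical, and real deployments would typically lean on dedicated contract and observability tooling rather than hand-rolled checks.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical producer-consumer contract for a data product table.
# Field names and the freshness SLA are illustrative only.
CONTRACT = {
    "dataset": "orders_daily",
    "schema": {"order_id": str, "order_total": float, "updated_at": str},
    "freshness_sla": timedelta(hours=6),  # consumers expect data no older than 6h
}

def check_contract(rows: list[dict]) -> list[str]:
    """Return a list of contract violations for a batch of producer records."""
    violations = []
    now = datetime.now(timezone.utc)
    for i, row in enumerate(rows):
        # Schema check: every contracted field must be present with the right type.
        for field, expected_type in CONTRACT["schema"].items():
            if field not in row:
                violations.append(f"row {i}: missing field '{field}'")
            elif not isinstance(row[field], expected_type):
                violations.append(f"row {i}: '{field}' is not {expected_type.__name__}")
        # Freshness SLA check on the record's update timestamp.
        if isinstance(row.get("updated_at"), str):
            updated = datetime.fromisoformat(row["updated_at"])
            if now - updated > CONTRACT["freshness_sla"]:
                violations.append(f"row {i}: stale record, violates freshness SLA")
    return violations

if __name__ == "__main__":
    sample = [{"order_id": "A-1", "order_total": 42.0,
               "updated_at": datetime.now(timezone.utc).isoformat()}]
    print(check_contract(sample) or "contract satisfied")
```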

You’ve rolled out self-service analytics. Everyone’s answering their own questions. Job done, right? Not quite.

Together with ThoughtSpot, Huel shares what comes next when your organisation has fully embraced self-service analytics. From redefining the role of the data team to unlocking a culture of empowerment that takes business users from reactive insights to proactive decisions, discover how Huel is going beyond dashboards and into a new era of analytics. If you’ve nailed self-service, or even if you haven't, this is your roadmap to what’s next.

For years, data engineering was a story of predictable pipelines: move data from point A to point B. But AI just hit the reset button on our entire field. Now, we're all staring into the void, wondering what's next. The fundamentals haven't changed: data governance, data management, and data modeling still present the same challenges they always have. Everything else is up for grabs.

This talk will cut through the noise and explore the future of data engineering in an AI-driven world. We'll examine how team structures will evolve, why agentic workflows and real-time systems are becoming non-negotiable, and how our focus must shift from building dashboards and analytics to architecting for automated action. The reset button has been pushed. It's time for us to invent the future of our industry.

Businesses spend countless hours wrangling data: extracting information from messy PDFs, building dashboards that nobody uses, and attempting to extract insights that simply don’t exist. Surely there’s a better way?

In this session, Vishal Soni and Owen Coyle will show how AI and Alteryx can work together to completely transform how you handle data. Starting with one of the toughest challenges: extracting structured information from unstructured PDFs. Instead of complex regex, manual OCR, or hours of cleanup, you’ll see how LLMs inside Alteryx can instantly convert complex documents into clean, tabular data that’s ready for analysis.
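Inside Alteryx this step is configured visually rather than written as code, but the underlying pattern is easy to sketch. The snippet below is a generic, hedged illustration (not Alteryx's own API) using the pypdf and OpenAI Python libraries as stand-ins: pull the raw text out of a PDF and ask an LLM to return it as structured rows. The model name, prompt, and field names are assumptions for the example.

```python
import json
from pypdf import PdfReader   # PDF text extraction
from openai import OpenAI     # generic LLM client used as a stand-in here

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def pdf_to_rows(path: str) -> list[dict]:
    """Ask an LLM to turn a PDF (e.g. an invoice) into tabular line items."""
    text = "\n".join(page.extract_text() or "" for page in PdfReader(path).pages)
    prompt = (
        "Extract every line item from this document as a JSON array of objects "
        "with keys 'description', 'quantity', and 'amount'. Return only JSON.\n\n"
        + text
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    # Real pipelines would validate the model output before trusting it.
    return json.loads(response.choices[0].message.content)

rows = pdf_to_rows("invoice.pdf")  # placeholder file name
print(rows[:3])
```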

Once this data is processed, Alteryx Auto Insights can take over, producing AI-powered analysis of your data and jumping straight to the “why” behind the numbers. You’ll quickly see how Auto Insights surfaces the most important trends, patterns, anomalies, and actionable insights, all while generating personalized, presentation-ready reports to drive action.

Whether you’re new to Alteryx or already an experienced user, you’ll leave this session with a clear understanding of how AI is changing analytics – turning hours of manual work into instant, actionable insight – and how Alteryx is at the forefront of this change.

Ten years ago, I began advocating for **DataOps**, a framework designed to improve collaboration, efficiency, and agility in data management. The industry was still grappling with fragmented workflows, slow delivery cycles, and a disconnect between data teams and business needs. Fast forward to today, and the landscape has transformed, but have we truly embraced the future of leveraging data at scale? This session will reflect on the evolution of DataOps, examining what’s changed, what challenges persist, and where we're headed next.

**Key Takeaways:**

✅ The biggest wins and ongoing struggles in implementing DataOps over the last decade. 

✅ Practical strategies for improving automation, governance, and data quality in modern workflows. 

✅ How emerging trends like AI-driven automation and real-time analytics are reshaping the way we approach data management. 

✅ Actionable insights on how data teams can stay agile and align better with business objectives. 

**Why Attend?**

If you're a data professional, architect, or leader striving for operational excellence, this talk will equip you with the knowledge to future-proof your data strategies.

What happens when cutting-edge data meets disruption? In this half-hour session, leading Women in Data® voices from some of the biggest names in the insurance sector will explore how innovative data and analytics are reshaping finance for a more resilient future.  

From climate risk and regulatory shifts to customer behaviour and operational efficiency, this session will unlock how insurers are using their data to adapt and stay ahead of the curve in our rapidly changing world. 

Powered by: Women in Data®

Business challenges that were once sporadic are now persistent and widespread—impacting everyone across the organization, from business users and analysts to data engineers and scientists.

To keep pace, BI platforms have steadily evolved, embracing technologies that empower every user to tackle growing data complexity with confidence.

Now, with sophisticated Gen AI and Agentic AI capabilities built into these platforms, we’re stepping into a new era of analytics—one that redefines what data democratization means for modern businesses.

Join us for an exclusive session where we’ll explore how the latest innovations in Gen AI are reshaping the BI landscape and unlocking powerful, actionable insights for every user.

In this session, you’ll learn:

- What defines a truly Gen AI-powered BI platform

- How businesses can empower every user with cutting-edge Gen AI

- How Agentic AI is shaping BI

- Live demos showcasing Gen AI and Agentic AI capabilities in BI

- How a Gen BI platform can drive smarter decisions, boost productivity, and deliver transformative business outcomes

AI is changing the game across industries, but one of its most powerful and urgent applications is in combating climate change. From optimizing renewable energy grids to improving carbon capture and making supply chains more sustainable, AI is playing a crucial role in driving real-world impact.

In this talk, Russell Dalgleish—entrepreneur, investor, and a leading voice in Scotland’s sustainability and innovation scene—dives into how AI-driven solutions like predictive analytics and intelligent automation are reshaping the path to net zero. He’ll share lessons from Scotland’s thriving Greentech sector, offering practical insights on how businesses can harness AI to meet sustainability goals while staying ethical and responsible.

If you’re a business leader, policymaker, or innovator looking to turn AI into a force for good, this session is for you.

AI is changing every aspect of how we do our work, and analytics engineering is no different. However, using AI to achieve 10x productivity improvements requires a totally new approach to how we do analytics. In this session, we will demystify context engineering and prompting techniques, based on practical experience we've had rolling out Paradime to some of the most innovative and AI-forward startups, scale-ups, and enterprises.

In this joint session, GoodData and Purple will share a field guide to turning analytics into revenue. 

Together, we’ll show you how governed, context-rich intelligence becomes paid products that customers adopt. Using Purple’s guest Wi-Fi data product as an example, we’ll unpack what’s sold, how it’s priced, and how GoodData powers it in a step-by-step runbook you can use today. 

We’ll close with what’s next: operationalizing and monetizing AI Assistants and Agents so you can charge for outcomes, not charts.

Everyone talks about AI, but few are truly winning with it. In today’s fast-paced analytics landscape, dashboards and chatbot features aren’t enough. Winning in the age of AI takes more: a unified, governed platform that brings data, AI, and people together to drive real decisions at scale.

In this session, discover how Pyramid Analytics helps organizations cut through silos, hype, and complexity to deliver smarter, faster, and more trusted decisions across the enterprise.

AI agents are the next essential enterprise capability, but they bring new complexities in control, safety, and scale. In this interactive session, see how you can design, deploy, and govern agents alongside your existing analytics and models.

Analytics engineers are at a crossroads. Back in 2018, dbt paved the way for this new kind of data professional, people who had technical ability and could understand business context. But here's the thing: AI is automating traditional tasks like pipeline building and dashboard creation. So what happens to analytics engineers? They don't disappear - they evolve.

The same skills that made analytics engineers valuable also make them perfect for a new role I'm calling 'Analytics Intelligence Engineers.' Instead of writing SQL, they're writing the context that makes AI actually useful for business users.

In this talk, I'll show you what this evolution looks like day-to-day. We'll explore building semantic layers, crafting AI context, and measuring AI performance - all through real examples using Lightdash. You'll see how the work shifts from data plumbing to data intelligence, and walk away with practical tips for making AI tools more effective in your organization. Whether you're an analytics engineer wondering about your future or a leader planning your data strategy, this session will help you understand where the field is heading and how to get there.

The modern enterprise is increasingly defined by the need for open, governed, and intelligent data access. This session explores how Apache Iceberg, Dremio, and the Model Context Protocol (MCP) come together to enable the Agentic Lakehouse: a data platform that is interoperable, high-performing, and AI-ready.

We’ll begin with Apache Iceberg, which provides the foundation for data interoperability across teams and organisations, ensuring shared datasets can be reliably accessed and evolved. From there, we’ll highlight how Dremio extends Iceberg with turnkey governance, management, and performance acceleration, unifying your lakehouse with databases and warehouses under one platform. Finally, we’ll introduce MCP and showcase how innovations like the Dremio MCP server enable natural-language analytics on your data. 

With the power of Dremio’s built-in semantic layer, AI agents and humans alike can ask complex business questions in plain language and receive accurate, governed answers.

Join us to learn how to unlock the next generation of data interaction with the Agentic Lakehouse.
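As a small, concrete illustration of the Iceberg interoperability described above (independent of Dremio itself), the following PyIceberg sketch reads a shared Iceberg table directly from Python; the catalog URI, warehouse, and table identifier are hypothetical.

```python
from pyiceberg.catalog import load_catalog

# Connect to an Iceberg REST catalog; URI and warehouse are placeholders.
catalog = load_catalog(
    "lakehouse",
    **{"uri": "https://catalog.example.com/iceberg", "warehouse": "analytics"},
)

# Any Iceberg-aware engine (Spark, Trino, Dremio, or plain Python) reads the
# same table metadata, which is what makes shared datasets interoperable.
table = catalog.load_table("sales.orders")  # hypothetical namespace and table
df = table.scan(selected_fields=("order_id", "order_total")).to_pandas()
print(df.head())
```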

The growth of connected data has made graph databases essential, yet organisations often face a dilemma: choosing between an operational graph for real-time queries or an analytical engine for large-scale processing. This division leads to data silos and complex ETL pipelines, hindering the seamless integration of real-time insights with deep analytics and the ability to ground AI models in factual, enterprise-specific knowledge. Google Cloud aims to solve this with a unified "Graph Fabric," introducing Spanner Graph, which extends Spanner with native support for the ISO standard Graph Query Language (GQL). This session will cover how Google Cloud has developed a Unified Graph Solution with BigQuery and Spanner graphs to serve a full spectrum of graph needs from operational to analytical.
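As a rough illustration of what querying a property graph with GQL looks like from application code (a sketch under assumptions, not an official Google sample), the snippet below runs a GQL MATCH statement against Spanner through the standard Python client; the instance, database, graph, labels, and properties are all hypothetical.

```python
from google.cloud import spanner

# Hypothetical instance and database that already define a property graph.
client = spanner.Client()
database = client.instance("demo-instance").database("demo-db")

# ISO GQL query; Spanner Graph queries run through the regular query interface.
GQL = """
GRAPH FinGraph
MATCH (p:Person)-[:Owns]->(a:Account)
RETURN p.name AS owner, a.id AS account_id
"""

with database.snapshot() as snapshot:
    for owner, account_id in snapshot.execute_sql(GQL):
        print(owner, account_id)
```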

Data leaders today face a familiar challenge: complex pipelines, duplicated systems, and spiraling infrastructure costs. Standardizing around Kafka for real-time and Iceberg for large-scale analytics has gone some way towards addressing this but still requires separate stacks, leaving teams to stitch them together at high expense and risk.

This talk will explore how Kafka and Iceberg together form a new foundation for data infrastructure. One that unifies streaming and analytics into a single, cost-efficient layer. By standardizing on these open technologies, organizations can reduce data duplication, simplify governance, and unlock both instant insights and long-term value from the same platform.
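A minimal sketch of what that single layer can look like in Python, assuming the confluent-kafka and pyiceberg libraries and entirely hypothetical broker, topic, catalog, and table names: drain a small batch from a Kafka topic and append it to an Iceberg table that analytical engines can query directly.

```python
import json
import pyarrow as pa
from confluent_kafka import Consumer
from pyiceberg.catalog import load_catalog

# Hypothetical connection details throughout.
consumer = Consumer({
    "bootstrap.servers": "broker:9092",
    "group.id": "iceberg-writer",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])

catalog = load_catalog("lakehouse", **{"uri": "https://catalog.example.com/iceberg"})
table = catalog.load_table("analytics.events")

# Consume a small batch from the stream, then append it to the Iceberg table so
# the same records serve both real-time consumers and long-term analytics.
batch = []
while len(batch) < 100:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    batch.append(json.loads(msg.value()))

table.append(pa.Table.from_pylist(batch))  # batch must match the table schema
consumer.close()
```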

You will come away with a clear understanding of why this convergence is reshaping the industry, how it lowers operational risk, and the advantages it offers for building durable, future-proof data capabilities.

Is your analytics workflow stuck in fragmented chaos? AlphaSights, the global leader in expert knowledge on demand, used to juggle queries, scripts, spreadsheets, and dashboards across different tools just to get one analysis out the door. Manual updates slowed their teams, stakeholders waited too long for insights, and opportunities slipped through the cracks.

With Hex, AlphaSights built a fully integrated Research Hub that unifies data queries, API calls, ML-powered enrichment, and reporting — all in one place. They eliminated manual work, automated updates, and empowered business teams to act faster on opportunities.

The result: faster reaction times, broader coverage, and measurable commercial impact. Join this session to see how AlphaSights turned fragmented workflows into a seamless, automated pipeline — and learn how your team can build faster, smarter insights too.

Data platform migrations and modernisations take 18 months. 80% fail or are late. Teams waste 35% of their time rediscovering tribal knowledge. 'Vibe coding' causes 2000% compute spikes. Generic AI tools generate code without understanding your business logic, creating production disasters.

Fortune 500 companies are escaping this $280B crisis. One retail giant cut their SAP-to-Fabric migration from 18 to 6 months. Another achieved 85% error reduction in procurement analytics.

This session reveals how global enterprises capture organisational knowledge permanently, automate the entire data product lifecycle, and deliver production-ready analytics 3x faster. Learn the AI-first approach that transforms months of manual work into weeks of automated delivery.