talk-data.com

Topic: Data Quality

Tags: data_management, data_cleansing, data_validation

16 tagged activities

Activity Trend: peak of 82 activities per quarter, 2020-Q1 through 2026-Q1

Activities

Showing filtered results. Filtering by: Secrets of Data Analytics Leaders

LLMs are hugely popular with data engineers because they boost productivity. But companies must adapt their data governance programs to control risks related to data quality, privacy, intellectual property, fairness, and explainability. Published at: https://www.eckerson.com/articles/should-ai-bots-build-your-data-pipelines-part-ii-risks-and-governance-approaches-for-data-engineers-to-use-large-language-models

A robust data workflow testing strategy helps ensure the accuracy and reliability of data processed within a pipeline. Use this checklist to meet your organization’s data quality requirements according to the dimensions of accuracy, completeness, conformity, consistency, integrity, precision, timeliness, and uniqueness. Published at: https://www.eckerson.com/articles/developing-a-robust-data-quality-strategy-for-your-data-pipeline-workflows
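To make those dimensions concrete, here is a minimal pandas sketch of automated checks a pipeline might run for completeness, uniqueness, conformity, and timeliness. The column names (order_id, email, order_date) are hypothetical, not from the article:

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return a pass/fail flag for several data quality dimensions."""
    results = {}
    # Completeness: required fields contain no nulls
    results["completeness"] = bool(df[["order_id", "email"]].notna().all().all())
    # Uniqueness: the primary key has no duplicates
    results["uniqueness"] = df["order_id"].is_unique
    # Conformity: every email matches a simple address pattern
    results["conformity"] = bool(df["email"].str.match(r"[^@\s]+@[^@\s]+\.[^@\s]+").all())
    # Timeliness: no record is older than 30 days
    cutoff = pd.Timestamp.now() - pd.Timedelta(days=30)
    results["timeliness"] = bool((pd.to_datetime(df["order_date"]) >= cutoff).all())
    return results

df = pd.DataFrame({
    "order_id": [1, 2, 3],
    "email": ["a@example.com", "b@example.com", "c@example.com"],
    "order_date": ["2026-01-01", "2026-01-02", "2026-01-03"],
})
print(run_quality_checks(df))  # e.g. {'completeness': True, 'uniqueness': True, ...}
```

In practice each failed check would raise an alert or quarantine the offending batch rather than just print a flag.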

Data mesh is a hot topic in the data world, generating conversations about the benefits and drawbacks of its decentralized approach. Concerns about an explosion of data silos and inconsistent data quality are justified. But to those who feel a bit like Chicken Little, maybe the sky is not falling. Published at: https://www.eckerson.com/articles/data-mesh-the-sky-is-not-falling

Traditional techniques for managing data quality break at scale. Machine learning algorithms can automate aspects of the data quality workload, ensuring that the data the business users consume is reliable. This article profiles three tools and approaches that use ML to automate data quality. Published at: https://www.eckerson.com/articles/three-data-quality-automation-tools-you-should-consider
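For a flavor of what that automation can look like, here is a generic scikit-learn sketch that uses an isolation forest to flag anomalous records for human review. It is a toy illustration with synthetic data, not a depiction of how the profiled tools work internally:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Mostly well-behaved order amounts, plus a few injected bad values
amounts = np.concatenate([rng.normal(50, 10, 995), [900.0, -40.0, 1200.0, 0.01, 750.0]])
X = amounts.reshape(-1, 1)

# Train an anomaly detector; contamination is the expected outlier share
model = IsolationForest(contamination=0.005, random_state=42).fit(X)
flags = model.predict(X)  # -1 = anomaly, 1 = normal

print(f"Flagged {int((flags == -1).sum())} of {len(X)} records for review")
```

The appeal is that the model learns what "normal" looks like from the data itself, instead of requiring hand-written rules for every field.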

Active metadata is not a type of metadata; it's a way of using metadata to power systems. Active metadata is a critical feature of modern data architectures such as data fabric and data mesh. It drives capabilities such as data access management, data classification, and data quality management. Published at: https://www.eckerson.com/articles/active-metadata-the-critical-factor-for-mastering-modern-data-management
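As a toy illustration of the "active" idea (invented for this page, not drawn from the article), the sketch below attaches classification tags to columns and uses them to drive an action, masking PII on read, rather than leaving the metadata to sit passively in a catalog:

```python
# Hypothetical column-level metadata, as a catalog might store it
column_metadata = {
    "customer_name": {"classification": "pii"},
    "email":         {"classification": "pii"},
    "order_total":   {"classification": "public"},
}

def apply_policies(record: dict) -> dict:
    """Mask any field whose metadata classifies it as PII."""
    return {
        field: "***MASKED***"
        if column_metadata.get(field, {}).get("classification") == "pii"
        else value
        for field, value in record.items()
    }

print(apply_policies({"customer_name": "Ada", "email": "ada@example.com", "order_total": 42.0}))
# {'customer_name': '***MASKED***', 'email': '***MASKED***', 'order_total': 42.0}
```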

Data observability provides intelligence about data quality and data pipeline performance, contributing to the disciplines of DataOps and FinOps. Vendors such as DataKitchen, DataOps.live, Informatica, and Unravel offer solutions to help enterprises address these overlapping disciplines. Published at: https://www.eckerson.com/articles/the-blending-disciplines-of-data-observability-dataops-and-finops

It’s tempting to dismiss observability as another overused buzzword. But this emerging discipline offers substantive methods for enterprises to monitor and optimize business metrics, IT operations, data pipelines, machine learning models, and data quality. Published at: https://www.eckerson.com/articles/the-five-shades-of-observability-business-operations-pipelines-models-and-data-quality

COVID, inflation, broken supply chains, and not-so-distant war make this a turbulent time for the modern consumer. During times like these, families tend to their nests, which leads to lots of home-improvement projects…which means lots of painting.

Today we explore the case study of a Fortune 500 producer of the paints and stains that coat many households, consumer products, and even motor vehicles. As its business expands, this company must carefully align the records that track hundreds of suppliers, thousands of storefronts, and millions of customers.

Business expansion and complex supply chains make it particularly important—and challenging—for enterprises such as this paint producer, which we’ll call Bright Colors, to accurately describe the entities that make up their business. They need governed, validated data to describe entities such as their products, locations, and customers. Master data management, also known as MDM, streamlines operations and assists data governance by reconciling disparate data records into golden records and, ideally, a single source of truth.
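To show the reconciliation step in miniature, here is a sketch (invented records, naive matching and survivorship, nothing like a production MDM engine) that fuzzily matches two variants of a supplier record and merges them into one golden record:

```python
from difflib import SequenceMatcher

records = [
    {"id": "1", "name": "Bright Colors Inc.", "city": "Cleveland", "phone": ""},
    {"id": "2", "name": "Bright Colours Inc", "city": "Cleveland", "phone": "555-0100"},
]

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def merge(a: dict, b: dict) -> dict:
    """Naive survivorship: per field, prefer the longer non-empty value."""
    return {k: max((a[k], b[k]), key=lambda v: len(str(v))) for k in a}

# Match on name similarity, then consolidate into a single golden record
if similarity(records[0]["name"], records[1]["name"]) > 0.85:
    golden = merge(records[0], records[1])
    print(golden)  # one record with the best-populated value per field
```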

We’re excited to share our conversation with an industry expert who helps Bright Colors and other Fortune 2000 enterprises navigate turbulent times with effective strategies for MDM and data governance.

Dave Wilkinson is chief technology officer with D3Clarity, a global strategy and implementation services firm that seeks to ensure digital certainty, security, and trust. D3Clarity is a partner of Semarchy, whose Intelligent Data Hub software helps enterprises govern and manage master data, reference data, data quality, enrichment, and workflows. Semarchy sponsored this podcast.

Fast-casual restaurants offer a fascinating microcosm of the turbulent forces confronting enterprises today—and the pivotal role that data plays in helping them maintain competitive advantage. COVID prompted customers to order their Chipotle burritos, Shake Shack milkshakes, and Bruegger’s Bagels for home delivery, and this trend continues in 2022. Supply-chain disruptions, meanwhile, force fast-casual restaurants to make some fast pivots between suppliers in order to keep their shelves stocked. And the market continues to grow as these companies win customers, add locations, and expand delivery partnerships.

These three industry trends—home delivery, supply-chain disruptions, and market expansion—all depend on governed, accurate data to describe entities such as orders, ingredients, and locations. Data quality and master data management therefore play a more pivotal role than ever in the success of fast-casual restaurants. Master data management, also known as MDM, streamlines operations and assists data governance by reconciling disparate data records into a golden record and source of truth. If you’re looking for an ideal case study for how MDM drives enterprise reinvention, agility, and growth, this is it.

We’re excited to talk with an industry expert who helps fast-casual restaurants handle these turbulent forces with effective strategies for managing data and especially master data. Matt Zingariello is Vice President of Data Strategy Services with Keyrus, a global consultancy that helps enterprises use data assets to optimize their digital strategies and customer experience. Matt leads a team that provides industry-specific advisory and implementation services to help enterprises address challenges such as data governance and MDM.

Keyrus is a partner of Semarchy, whose Intelligent Data Hub software helps enterprises govern and manage master data, reference data, data quality, enrichment, and workflows. Semarchy sponsored this podcast.

In our podcast, we'll define data quality and MDM as part of data governance. We’ll explore why enterprises need data quality and MDM, and how they can craft effective data quality and MDM strategies, with a focus on fast-casual restaurants as a case study.

It’s hard to find a data discipline today that is under more pressure than data governance. On one side, the supply of data is exploding. As enterprises transform their businesses to compete in the 2020s, they digitize myriad events and interactions, which creates mountains of data that they need to control. On the other side, demand for data is exploding. Business owners at all levels of the enterprise need to inform their decisions and drive their operations with data.

Under these pressures, data governance teams must ensure business owners access and consume the right, high-quality data. This requires master data management—the reconciliation of disparate data records into a golden record and source of truth—which assists data governance at many modern enterprises.
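One common reconciliation technique is source-precedence survivorship: for each field, the value from the most trusted source system wins. The sketch below illustrates the idea with invented source names and trust rankings; real MDM platforms layer many more rules on top:

```python
SOURCE_RANK = {"crm": 1, "erp": 2, "ecommerce": 3}  # lower rank = more trusted

matched_records = [
    {"source": "ecommerce", "email": "pat@example.com", "phone": None},
    {"source": "crm",       "email": "pat@corp.com",    "phone": "555-0199"},
]

def golden_record(records: list) -> dict:
    """Per field, take the value from the most trusted source that has one."""
    ordered = sorted(records, key=lambda r: SOURCE_RANK[r["source"]])
    fields = {k for r in records for k in r if k != "source"}
    return {
        f: next((r[f] for r in ordered if r.get(f) is not None), None)
        for f in fields
    }

print(golden_record(matched_records))
# {'email': 'pat@corp.com', 'phone': '555-0199'} -- CRM wins where it has a value
```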

In this episode, our host Kevin Petrie, VP of Research at Eckerson Group, talks with our guests Felicia Perez, Managing Director of the Information as a Product Program at National Student Clearinghouse, and Patrick O'Halloran, enterprise data scientist, as they define what data quality and MDM are, why you need them, and how best to achieve effective data quality and MDM.

Why is data quality still an issue after all these years? To answer this perennial question, Wayne Eckerson and Jason Beard engage in a dynamic exchange that leads us to the root causes of data quality and data governance problems. Using examples from his past projects, Jason shows the value of business process mapping and how it exposes the hidden problems that go undetected under the standard IT lens.

In his most recent role as Vice President of Process & Data Management at Wiley, a book publisher, he was responsible for master data setup and governance, process optimization, business continuity planning, and change management for new and emerging business models. Jason has led business intelligence, data governance, master data management, process improvement, business transformation, and ERP projects in a variety of industries, including scientific and trade publishing, educational technology, consumer goods, banking, investments, and insurance.

podcast_episode
by Wayne Eckerson (Eckerson Group) and Carl Gerber (independent consultant and Eckerson Group partner, formerly a data leader at various financial services and manufacturing firms)

In this podcast, Carl Gerber and Wayne Eckerson discuss Gerber’s top five data governance best practices: Motivation, Assessment, Data Assets Catalog, CxO Alliance, and Data Quality.

Gerber is a long-time chief data officer and data leader at several large, diverse financial services and manufacturing firms, who is now an independent consultant and an Eckerson Group partner.

He helps large organizations develop data strategies, modernize analytics, and establish enterprise data governance programs that ensure data quality, operational efficiency, regulatory compliance, and business outcomes. He also mentors and coaches Chief Data Officers and fills that role on an interim basis.