This blog defines the governance requirements that streaming data pipelines must meet to make artificial intelligence/machine learning (AI/ML) initiatives successful. Published at: https://www.eckerson.com/articles/streaming-data-governance-three-must-have-requirements-to-support-ai-ml-innovation
Topic: Data Governance
With the increasing adoption of Generative AI, learn how data governance will both add value to and benefit from it. Published at: https://www.eckerson.com/articles/data-governance-in-the-era-of-generative-ai
Conventional data governance conflicts with today’s world of self-service analytics and agile projects. Published at: https://www.eckerson.com/articles/modern-data-governance-problems
Data leaders must prepare their teams to deliver the timely, accurate, and trustworthy data that GenAI initiatives need to produce results. They can do so by modernizing their environments, extending data governance programs, and fostering collaboration with data science teams. Published at: https://www.eckerson.com/articles/the-data-leader-s-guide-to-generative-ai-part-i-models-applications-and-pipelines
Our industry’s breathless hype about generative AI tends to overlook the stubborn challenge of data governance. Data catalogs address this challenge by evaluating and controlling the accuracy, explainability, privacy, IP friendliness, and fairness of GenAI inputs. Published at: https://www.eckerson.com/articles/generative-ai-needs-vigilant-data-cataloging-and-governance
LLMs are hugely popular with data engineers because they boost productivity. But companies must adapt their data governance programs to control risks related to data quality, privacy, intellectual property, fairness, and explainability. Published at: https://www.eckerson.com/articles/should-ai-bots-build-your-data-pipelines-part-ii-risks-and-governance-approaches-for-data-engineers-to-use-large-language-models
At the IAPP Global Privacy Summit, privacy and data governance leaders stressed the importance of a collaborative operating model. Published at: https://www.eckerson.com/articles/the-convergence-of-data-governance-and-privacy-takeaways-from-the-global-privacy-summit
We enter 2023 in a haze of uncertainty. Enterprises must rationalize analytics projects, shift to lower-risk use cases, and control cloud costs. They also must measure the ROI of analytics projects and use data governance to reduce business risk. Published at: https://www.eckerson.com/articles/analyzing-a-downturn-five-principles-for-data-analytics-in-2023
Data Stewardship Experience (DSX) strategies such as personal growth, community, societal contribution, and disruptive innovation can meet several cognitive, social, and psychological needs and motivate professionals to become productive data stewards. This approach also removes the stigma of data governance as a rigid and bureaucratic gatekeeping discipline. Published at: https://www.eckerson.com/articles/improving-the-data-stewardship-experience-dsx-productive-motivational-strategies-for-data-governance
Consider key trends and challenges as you design an effective organizational architecture for data governance while generating value with pervasive analytics. Published at: https://www.eckerson.com/articles/organizational-architecture-can-make-or-break-your-data-governance-program
As we share data, we create data webs. If we allow copies of our data to proliferate throughout these webs, we reduce the value of the data and create data governance challenges. The solution is new, ownership-centric approaches to data sharing that don’t rely on traditional copy-based integration. Published at: https://www.eckerson.com/articles/zero-copy-approaches-to-data-sharing
COVID, inflation, broken supply chains, and not-so-distant war make this a turbulent time for the modern consumer. During times like these, families tend to their nests, which leads to lots of home-improvement projects…which means lots of painting.
Today we explore the case study of a Fortune 500 producer of the paints and stains that coat many households, consumer products, and even motor vehicles. As its business expands, this company needs to carefully align the records that track hundreds of suppliers, thousands of storefronts, and millions of customers.
Business expansion and complex supply chains make it particularly important—and challenging—for enterprises such as this paint producer, which we’ll call Bright Colors, to accurately describe the entities that make up their business. They need governed, validated data to describe entities such as their products, locations, and customers. Master data management, also known as MDM, streamlines operations and assists data governance by reconciling disparate data records into golden records and, ideally, a single source of truth.
We’re excited to share our conversation with an industry expert who helps Bright Colors and other Fortune 2000 enterprises navigate turbulent times with effective strategies for MDM and data governance.
Dave Wilkinson is chief technology officer with D3Clarity, a global strategy and implementation services firm that seeks to ensure digital certainty, security, and trust. D3Clarity is a partner of Semarchy, whose Intelligent Data Hub software helps enterprises govern and manage master data, reference data, data quality, enrichment, and workflows. Semarchy sponsored this podcast.
Fast-casual restaurants offer a fascinating microcosm of the turbulent forces confronting enterprises today—and the pivotal role that data plays in helping them maintain competitive advantage. COVID prompted customers to order their Chipotle burritos, Shake Shack milkshakes, and Bruegger’s Bagels for home delivery, and this trend continues in 2022. Supply-chain disruptions, meanwhile, force fast-casual restaurants to make some fast pivots between suppliers in order to keep their shelves stocked. And the market continues to grow as these companies win customers, add locations, and expand delivery partnerships.
These three industry trends—home delivery, supply-chain disruptions, and market expansion—all depend on governed, accurate data to describe entities such as orders, ingredients, and locations. Data quality and master data management therefore play a more pivotal role than ever in the success of fast-casual restaurants. Master data management, also known as MDM, streamlines operations and assists data governance by reconciling disparate data records into a golden record and source of truth. If you’re looking for an ideal case study for how MDM drives enterprise reinvention, agility, and growth, this is it.
We’re excited to talk with an industry expert who helps fast-casual restaurants handle these turbulent forces with effective strategies for managing data and especially master data. Matt Zingariello is Vice President of Data Strategy Services with Keyrus, a global consultancy that helps enterprises use data assets to optimize their digital strategies and customer experience. Matt leads a team that provides industry-specific advisory and implementation services to help enterprises address challenges such as data governance and MDM.
Keyrus is a partner of Semarchy, whose Intelligent Data Hub software helps enterprises govern and manage master data, reference data, data quality, enrichment, and workflows. Semarchy sponsored this podcast.
In our podcast, we'll define data quality and MDM as part of data governance. We’ll explore why enterprises need data quality and MDM, and how they can craft effective data quality and MDM strategies, with a focus on fast-casual restaurants as a case study.
It’s hard to find a data discipline today that is under more pressure than data governance. On one side, the supply of data is exploding. As enterprises transform their businesses to compete in the 2020s, they digitize myriad events and interactions, which creates mountains of data that they need to control. On the other side, demand for data is exploding. Business owners at all levels of the enterprise need to inform their decisions and drive their operations with data.
Under these pressures, data governance teams must ensure business owners access and consume the right, high-quality data. This requires master data management—the reconciliation of disparate data records into a golden record and source of truth—which assists data governance at many modern enterprises.
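To make that reconciliation step concrete, here is a minimal, hypothetical Python sketch of merging duplicate customer records into a golden record. The match key (email), field names, and survivorship rule are illustrative assumptions, not the approach of any particular MDM product or guest.

    from collections import defaultdict

    # Hypothetical customer records from different source systems.
    records = [
        {"source": "crm", "email": "ana@example.com", "name": "Ana Diaz", "phone": None, "updated": "2022-03-01"},
        {"source": "erp", "email": "Ana@Example.com", "name": "A. Diaz", "phone": "555-0100", "updated": "2022-05-15"},
        {"source": "pos", "email": "bo@example.com", "name": "Bo Chen", "phone": "555-0199", "updated": "2022-04-20"},
    ]

    def golden_records(records):
        """Group records by a simple match key (email) and merge each group."""
        groups = defaultdict(list)
        for rec in records:
            groups[rec["email"].lower()].append(rec)

        merged = []
        for email, group in groups.items():
            # Illustrative survivorship rule: take the most recently updated
            # non-null value for each attribute.
            group.sort(key=lambda r: r["updated"], reverse=True)
            golden = {"email": email, "sources": [r["source"] for r in group]}
            for field in ("name", "phone"):
                golden[field] = next((r[field] for r in group if r[field]), None)
            merged.append(golden)
        return merged

    for rec in golden_records(records):
        print(rec)

Real MDM hubs add fuzzy matching, stewardship workflows, and audit trails on top of this basic match-and-merge pattern, but the sketch shows the core idea of collapsing conflicting source records into one surviving record.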
In this episode, our host Kevin Petrie, VP of Research at Eckerson Group, talks with our guests Felicia Perez, Managing Director of the Information as a Product Program at National Student Clearinghouse, and Patrick O'Halloran, enterprise data scientist, as they define what data quality and MDM are, why you need them, and how best to achieve them.
Chief data officers (CDOs) first appeared in enterprise organizations after the Sarbanes-Oxley Act became law in the United States in 2002 to improve corporate governance controls. CDOs started with a trickle but have since become a flood, now populating more than two-thirds of large enterprises, according to a recent survey by NewVantage Partners.
To explore this dynamic role in detail, we invited Joe Dossantos, newly minted CDO for the data and analytics software vendor Qlik. Joe is responsible for data governance, internal data delivery, and self-service enablement. He also evangelizes data and analytics best practices to Qlik customers.
Prior to joining Qlik, Joe led TD Bank’s data strategy, and built and ran the Big Data Consulting Practice for EMC Corporation's Professional Services Organization.
Master Data Management is no shiny object. But like many traditional IT practices, MDM is being severely tested – and rendered all the more strategic – by digitalization and rising data volumes.
Originally published at https://www.eckerson.com/articles/five-master-data-management-best-practices-for-enterprises
One of the hardest parts of running a data analytics program inside a large organization is governing data and reports. It’s simply too easy for definitions of core data elements and metrics to get out of sync and for reports to contain conflicting information.
Angie Davis has straddled both the business and IT worlds for more than 20 years. She served as a business analyst in several organizations before switching to the information technology side of the business where she ran analytics teams, first at JD Irving for six years and more recently at Brookfield Renewable where she is an IT director. Angie has a degree in mathematics and electrical engineering from Dalhousie University in Halifax, Nova Scotia.
Why is data quality still an issue after all these years? To answer this persistent question, Wayne Eckerson and Jason Beard engage in a dynamic exchange that leads us to the root causes of data quality and data governance problems. Using examples from his past projects, Jason shows the value of business process mapping and how it exposes hidden problems that go undetected under the standard IT lens.
In his most recent role as Vice President of Process & Data Management at Wiley, a book publisher, he was responsible for master data setup and governance, process optimization, business continuity planning, and change management for new and emerging business models. Jason has led business intelligence, data governance, master data management, process improvement, business transformation, and ERP projects in a variety of industries, including scientific and trade publishing, educational technology, consumer goods, banking, investments, and insurance.
In this episode, Wayne Eckerson asks Charles Reeves about his organization’s Internet of Things and Big Data strategy. Reeves is senior manager of BI and analytics at Graphic Packaging International, a leader in the packaging industry with hundreds of worldwide customers. He has 25 years of professional experience in IT management, including nine years in reporting, analytics, and data governance.
In this podcast, Carl Gerber and Wayne Eckerson discuss Gerber’s top five data governance best practices: Motivation, Assessment, Data Assets Catalog, CxO Alliance, and Data Quality.
Gerber is a long-time chief data officer and data leader at several large, diverse financial services and manufacturing firms, who is now an independent consultant and an Eckerson Group partner.
He helps large organizations develop data strategies, modernize analytics, and establish enterprise data governance programs that ensure data quality, operational efficiency, regulatory compliance, and business outcomes. He also mentors and coaches Chief Data Officers and fills that role on an interim basis.