Conventional data governance conflicts with today’s world of self-service analytics and agile projects. Published at: https://www.eckerson.com/articles/modern-data-governance-problems
A zone-based data refinery creates an agile, adaptable data environment: it turns a monolithic data warehouse into a flexible architecture that gracefully accommodates new and unanticipated business requirements while maximizing reuse and standards. Published at: https://www.eckerson.com/articles/how-zone-based-data-processing-turns-your-monolithic-data-warehouse-into-a-flexible-modern-data-architecture
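To make the zone idea concrete, here is a minimal sketch in Python of records flowing through named zones, each with its own contract. The zone names (landing, refined, curated) and the transforms are illustrative assumptions, not the article's actual design.

```python
# Minimal sketch of zone-based processing. Zone names and transforms
# are illustrative assumptions, not the article's specific design.
from dataclasses import dataclass, field

@dataclass
class Zone:
    name: str
    records: list = field(default_factory=list)

def land(zone: Zone, raw_rows: list) -> None:
    """Landing zone: keep source data as-is for traceability."""
    zone.records.extend(raw_rows)

def refine(src: Zone, dst: Zone) -> None:
    """Refined zone: standardize names and types, drop obvious junk."""
    for row in src.records:
        if row.get("amount") is None:
            continue  # reject incomplete rows instead of guessing
        dst.records.append({"customer": row["cust"].strip().title(),
                            "amount": float(row["amount"])})

def curate(src: Zone, dst: Zone) -> None:
    """Curated zone: apply business rules and aggregate for reuse."""
    totals = {}
    for row in src.records:
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + row["amount"]
    dst.records = [{"customer": c, "total": t} for c, t in totals.items()]

landing, refined, curated = Zone("landing"), Zone("refined"), Zone("curated")
land(landing, [{"cust": " acme ", "amount": "10.5"}, {"cust": "Bix", "amount": None}])
refine(landing, refined)
curate(refined, curated)
print(curated.records)  # [{'customer': 'Acme', 'total': 10.5}]
```

Because each zone has a distinct contract, a new requirement can often be met by adding or adjusting one transform between zones rather than reworking the entire warehouse.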
Just-in-time design is the practice of designing working software in small increments, each supporting a business-defined need or story. Along with just-in-time testing, it is an integral part of the agile software methodology; in fact, you can't really do agile without it.
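As a toy illustration of the "design only what the story needs" discipline, consider the hypothetical story below: it asks for one behavior, and the design goes no further than that behavior plus its test.

```python
# Story (hypothetical): "As an analyst, I can get a policy's total paid
# claims." Just-in-time design: implement and test exactly this story,
# and defer currency handling, caching, etc. until a story demands them.
def total_paid(claims: list, policy_id: str) -> float:
    return sum(c["paid"] for c in claims if c["policy_id"] == policy_id)

def test_total_paid():
    claims = [{"policy_id": "P1", "paid": 250.0},
              {"policy_id": "P2", "paid": 80.0},
              {"policy_id": "P1", "paid": 20.0}]
    assert total_paid(claims, "P1") == 270.0

test_total_paid()
```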
To help us understand the nuances of just-in-time design, we invited Aaron Fuller, a long-time data architect and member of Eckerson Group's consulting network. During an 11-year tenure as enterprise data architect for an insurance company, he modeled data, created technical designs for a broad range of systems, established governance and stewardship, and led the creation of the company's enterprise data warehousing, business intelligence, and enterprise architecture programs. As principal consultant and owner of Superior Data Strategies since 2010, he leads a team of highly skilled data professionals who are uniquely capable of planning and executing agile data projects.
Data virtualization has been around for decades and has always been controversial. In the 1990s it was called virtual data warehousing, or VDW (or, as some skeptics liked to say, "voodoo and witchcraft"). It has also been known as query federation and, more recently, data services. The idea is that business users don't need to know where data physically lives; they simply log into the data service, and all data appears as if it were local to their server, modeled in a fashion that makes sense to them.
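As a rough illustration of that idea, the sketch below builds a toy federation layer: two physically separate stores sit behind one business-facing query interface, so the caller never sees where the data lives. The source names and schemas are invented for the example.

```python
# Toy federation layer: a "virtual" table name is resolved to whichever
# physical store holds the data. Source names/schemas are invented.
import sqlite3

# Two separate in-memory databases stand in for remote systems.
claims_db = sqlite3.connect(":memory:")
claims_db.execute("CREATE TABLE claims (policy_id TEXT, paid REAL)")
claims_db.execute("INSERT INTO claims VALUES ('P1', 250.0), ('P2', 80.0)")

crm_db = sqlite3.connect(":memory:")
crm_db.execute("CREATE TABLE customers (policy_id TEXT, name TEXT)")
crm_db.execute("INSERT INTO customers VALUES ('P1', 'Acme'), ('P2', 'Bix')")

# The virtual catalog: business name -> (connection, physical table).
CATALOG = {"claims": (claims_db, "claims"), "customers": (crm_db, "customers")}

def query(virtual_table: str, where: str = "1=1") -> list:
    """Route a query on a business-facing name to its physical store."""
    conn, physical = CATALOG[virtual_table]
    return conn.execute(f"SELECT * FROM {physical} WHERE {where}").fetchall()

# The analyst asks for 'claims' without knowing where the data lives.
print(query("claims", "paid > 100"))   # [('P1', 250.0)]
print(query("customers"))              # [('P1', 'Acme'), ('P2', 'Bix')]
```

A real virtualization product adds distributed joins, query pushdown, caching, and security, but the core abstraction is the same: a logical name resolved to a physical location at query time.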
Andrew Sohn is the Global Head of Data and Analytics at Crawford & Company, a $1B+ service provider to the insurance and risk management industry, where he designed and leads its data and digital transformation strategy and program. With more than 25 years in the industry, Andrew has managed a broad range of infrastructure and application technologies. He’s a strong advocate of data virtualization technology and believes it is an integral part of a modern, agile data ecosystem.
In this episode, Wayne Eckerson and Shakeeb Akhter dive into DataOps. They discuss what DataOps is, the goals and principles of DataOps, and the reasons to adopt a DataOps strategy. Shakeeb also reveals the benefits his team has gained from DataOps and the tools he uses. He is the Director of Enterprise Data Warehouse at Northwestern Medicine, responsible for the direction and oversight of data management, data engineering, and analytics.
Data pipelines become chaotic under the pressures of agile development, democratization, self-service, and organizational "pockets" of analytics. From enterprise BI to self-service analysis, data pipeline management should ensure that analysis results are traceable, reproducible, and of production strength. Robust data pipelines rely on eight critical components.
Originally published at https://www.eckerson.com/articles/the-complexities-of-modern-data-pipelines
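The article enumerates those components; as a flavor of what "traceable and reproducible" means in practice, here is a minimal, hedged sketch of a pipeline step that records enough run metadata (input hash, code version, timestamp) to audit and reproduce its output. The field names are assumptions for illustration, not the article's component list.

```python
# Sketch of a traceable pipeline step: every run records what went in,
# what code ran, and when, so results can be reproduced and audited.
# Field names are illustrative, not taken from the article.
import hashlib, json, datetime

CODE_VERSION = "2024.1"  # would come from your VCS in a real pipeline

def run_step(name: str, rows: list, transform) -> tuple:
    payload = json.dumps(rows, sort_keys=True).encode()
    lineage = {
        "step": name,
        "input_sha256": hashlib.sha256(payload).hexdigest(),
        "code_version": CODE_VERSION,
        "ran_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "row_count_in": len(rows),
    }
    out = [transform(r) for r in rows]
    lineage["row_count_out"] = len(out)
    return out, lineage

rows = [{"amount": "12.0"}, {"amount": "3.5"}]
out, lineage = run_step("normalize_amounts", rows,
                        lambda r: {"amount": float(r["amount"])})
print(json.dumps(lineage, indent=2))
```

Storing the lineage record alongside the output is what lets a downstream analyst answer "which inputs and which code produced this number?" long after the run.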