In this technical deep dive, we will detail how customers implemented data mesh on Databricks and how standardizing on the Delta format enabled Delta-to-Delta sharing with non-Databricks consumers.
- Current state of the IT landscape
- Data silos (the problems that arise when an organization's data is not connected across its ecosystem)
- A look back at why we moved away from data warehouses and chose the cloud in the first place
- What caused the data chaos in the cloud (heavy instrumentation and too much stitching together of services, as the "periodic table" of cloud services illustrates)
- How to strike the balance between autonomy and centralization
- Why Databricks Unity Catalog puts you on the right path to implementing a data mesh strategy
- The processes and features that enable an end-to-end implementation of a data strategy
- How customers were able to successfully implement data mesh with out-of-the-box Unity Catalog and Delta Sharing without overwhelming their IT tool stack
- Use cases:
  - Delta-to-Delta data sharing
  - Delta-to-others data sharing
- How to navigate a landscape where data lives across regions, across clouds, on-prem, and in external systems
- Using Change Data Feed to share only the data that has changed
- Data stewardship
- Why ABAC (attribute-based access control) is important
- How file-based access policies and governance play an important role
- Future state and its pitfalls
- Egress costs
- Data compliance
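As a taste of the sharing use cases above, here is a minimal Databricks SQL sketch of publishing a Delta table through Delta Sharing; the share, table, and recipient names are hypothetical:

```sql
-- Provider side: create a share and add a Delta table to it.
CREATE SHARE IF NOT EXISTS sales_share
  COMMENT 'Curated sales data for external consumers';
ALTER SHARE sales_share ADD TABLE catalog_main.sales.orders;

-- Register a recipient and grant it access to the share.
-- For a non-Databricks (open protocol) recipient, you would instead
-- distribute the activation link that Databricks generates.
CREATE RECIPIENT IF NOT EXISTS partner_co;
GRANT SELECT ON SHARE sales_share TO RECIPIENT partner_co;
```

On the consumer side, a non-Databricks recipient can read the same share with the open-source delta-sharing connectors (for example, into pandas or Spark), which is what makes Delta-to-others sharing possible without a Databricks workspace.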
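The "share only data that has changed" point can be sketched with Delta's Change Data Feed; table name and starting version here are hypothetical:

```sql
-- Enable Change Data Feed on an existing Delta table.
ALTER TABLE catalog_main.sales.orders
  SET TBLPROPERTIES (delta.enableChangeDataFeed = true);

-- Read only the rows that changed since table version 5,
-- including the change type (insert/update/delete) metadata columns.
SELECT * FROM table_changes('catalog_main.sales.orders', 5);
```

A provider can expose this change feed through a share so that consumers pull incremental changes rather than full table copies.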
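For the governance points, one way attribute-driven policies surface in Unity Catalog today is through row filters; a minimal sketch, with hypothetical function, table, and group names:

```sql
-- A filter function: members of the 'auditors' group see every row,
-- everyone else sees only US rows.
CREATE OR REPLACE FUNCTION region_filter(region STRING)
RETURN IS_ACCOUNT_GROUP_MEMBER('auditors') OR region = 'US';

-- Attach the filter so it is evaluated on every query of the table.
ALTER TABLE catalog_main.sales.orders
  SET ROW FILTER region_filter ON (region);
```

Because the policy lives on the governed object rather than in each consuming tool, the same rule applies uniformly across workspaces and shared consumers.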
Talk by: Surya Turaga and Thomas Roach
Connect with us: Website: https://databricks.com Twitter: https://twitter.com/databricks LinkedIn: https://www.linkedin.com/company/databricks Instagram: https://www.instagram.com/databricksinc Facebook: https://www.facebook.com/databricksinc