talk-data.com

Topic: Looker
Tags: bi, data_exploration, analytics
143 tagged activities

Activity Trend: 14 peak/qtr (2020-Q1 to 2026-Q1)

Activities

143 activities · Newest first

The AI landscape is evolving at breakneck speed, with new capabilities emerging quarterly that redefine what's possible. For professionals across industries, this creates a constant need to reassess workflows and skills. How do you stay relevant when the technology keeps leapfrogging itself? What happens to traditional roles when AI can increasingly handle complex tasks that once required specialized expertise? With product-market fit becoming a moving target and new positions like forward-deployed engineers emerging, understanding how to navigate this shifting terrain is crucial. The winners won't just be those who adopt AI, but those who can continuously adapt as it evolves.

Tomasz Tunguz is a General Partner at Theory Ventures, a $235m early-stage venture capital firm. He blogs at tomtunguz.com & co-authored Winning with Data. He has worked or works with Looker, Kustomer, Monte Carlo, Dremio, Omni, Hex, Spot, Arbitrum, Sui & many others. He was previously the product manager for Google's social media monetization team, including the Google-MySpace partnership, and managed the launches of AdSense into six new markets in Europe and Asia. Before Google, Tunguz developed systems for the Department of Homeland Security at Appian Corporation.

In the episode, Richie and Tom explore the rapid investment in AI, the evolution of AI models like Gemini 3, the role of AI agents in productivity, the shifting job market, the impact of AI on customer success and product management, and much more.

Links Mentioned in the Show:
Theory Ventures
Connect with Tom
Tom’s Blog
Gavin Baker on Medium
AI-Native Course: Intro to AI for Work
Related Episode: Data & AI Trends in 2024, with Tom Tunguz, General Partner at Theory Ventures
Rewatch RADAR AI

New to DataCamp? Learn on the go using the DataCamp mobile app
Empower your business with world-class data and AI skills with DataCamp for business

On today's Promoted Episode of Experiencing Data, I’m talking with Lucas Thelosen, CEO of Gravity and creator of Orion, an AI analyst transforming how data teams work. Lucas was head of professional services (PS) for Looker, and eventually became Head of Product for Google’s Data and AI Cloud prior to starting his own data product company. We dig into how his team built Orion, the challenge of keeping AI accurate and trustworthy when doing analytical work, and how they’re thinking about the balance of human control with automation when their product acts as a force multiplier for human analysts.

In addition to talking about the product, we also discuss how Gravity arrived at use cases specific enough that a market would be willing to pay for this technology, and how they’re thinking about pricing in today’s more “outcomes-based” environment.

Incidentally, one thing I didn’t know when I first agreed to consider having Gravity and Lucas on my show was that Lucas has been a long-time proponent of data product management and operating with a product mindset. In this episode, he shares the “ah-hah” moment where things clicked for him around building data products in this manner, how pivotal that moment was, and how it helped accelerate his career from Looker to Google and now Gravity.

If you’re leading a data team, you’re a forward-thinking CDO, or you’re interested in commercializing your own analytics/AI product, my chat with Lucas should inspire you!  

Highlights / Skip to:

Lucas’s breakthrough came when he embraced a data product management mindset (02:43)
How Lucas thinks about Gravity as being the instrumentalists in an orchestra, conducted by the user (04:31)
Finding product-market fit by solving for a common analytics pain point (08:11)
Analytics product and dashboard adoption challenges: why dashboards die and thinking of analytics as changing the business gradually (22:25)
What outcome-based pricing means for AI and analytics (32:08)
The challenge of defining guardrails and ethics for AI-based analytics products [just in case somebody wants to “fudge the numbers”] (46:03)
Lucas’s closing thoughts about what AI is unlocking for analysts and how to position your career for the future (48:35)

Special Bonus for DPLC Community Members

Are you a member of the Data Product Leadership Community? After our chat, I invited Lucas to come give a talk about his journey of moving from “data” to “product” and adopting a producty mindset for analytics and AI work. He was more than happy to oblige. Watch for this in late 2025/early 2026 on our monthly webinar and group discussion calendar.

Note: today’s episode is one of my rare Promoted Episodes. Please help support the show by visiting Gravity’s links below:

Quotes from Today’s Episode

“The whole point of data and analytics is to help the business evolve. When your reports make people ask new questions, that’s a win. If the conversations today sound different than they did three months ago, it means you’ve done your job, you’ve helped move the business forward.” — Lucas

“Accuracy is everything. The moment you lose trust, the business, the use case, it's all over. Earning that trust back takes a long time, so we made accuracy our number one design pillar from day one.” — Lucas 

“Language models have changed the game in terms of scale. Suddenly, we’re facing all these new kinds of problems, not just in AI, but in the old-school software sense too. Things like privacy, scalability, and figuring out who’s responsible.” — Brian

“Most people building analytics products have never been analysts, and that’s a huge disadvantage. If data doesn’t drive action, you’ve missed the mark. That’s why so many dashboards die quickly.” — Lucas

“Re: collecting feedback so you know if your UX is good: I generally agree that qualitative feedback is the best place to start, not analytics [on your analytics!]. Especially in UX, analytics measure usage aspects of the product, not the subjective human experience. Experience is a collection of feelings and perceptions about how something went.” — Brian

Links

Gravity: https://www.bygravity.com
LinkedIn: https://www.linkedin.com/in/thelosen/
Email Lucas and team: [email protected]

Path to Stellar Business Performance Analysis: A Design and Implementation Handbook

Business performance analysis is central to any business, as it helps to make or mend products, services, and processes. This book provides several blueprints for setting up business performance analytics (BPA) shops, from process layout for performance measures to tracking the underlying metrics with website tools such as Google Analytics and Looker Studio. Delivering satisfying user experiences in the context of overarching business objectives is key to delivering elevated business performance. The book goes beyond generic tracking of user behavior on websites to specific, KPI-scenario-based tracking using Google Analytics and Google Tag Manager, and stands out by helping you create fit-for-purpose, coherent performance analysis blueprints that integrate performance measure creation with the website analytics side of BPA.

What You Will Learn
Design a Business Performance Analysis function
Analyze performance metrics with website analytics tools
Identify business performance metrics for common product scenarios

Who This Book Is For
Senior leaders, product managers, product owners, UX and web analytics professionals

Migrating your BI platform sounds daunting — especially when you’re staring down hundreds of dashboards, years of legacy content, and a hard deadline. At Game Lounge, we made the leap from Looker to Omni, migrating over 800 dashboards in under three months — without disrupting the business.

In this session, we’ll walk through the practical playbook behind our successful migration: how we scoped the project, prioritised what mattered most, and moved quickly without compromising quality. We’ll share how we phased the migration, reduced dashboard sprawl by over 80%, and leaned on Omni’s AI-assisted features to accelerate setup and streamline cleanup.

We’ll also touch on how we kept quality high post-migration — introducing initiatives like dashboard verification to ensure lasting data trust. And we’ll share what happened next, with over 140 employees now using data to inform decisions every day.

Whether you’re planning a migration or trying to make sense of legacy BI sprawl, this session offers honest lessons, practical frameworks, and time-saving tips to help your team move fast and build smarter.

Product managers for BI platforms have it easy. They "just" need to have the dev team build a tool that gives all types of users access to all of the data they should be allowed to see in a way that is quick, simple, and clear while preventing them from pulling data that can be misinterpreted. Of course, there are a lot of different types of users—from the C-level executive who wants ready access to high-level metrics all the way to the analyst or data scientist who wants to drop into a SQL flow state to everyone in between. And sometimes the tool needs to provide structured dashboards, while at other times it needs to be a mechanism for ad hoc analysis. Maybe the product manager's job is actually…impossible? Past Looker CAO and current Omni CEO Colin Zima joined this episode for a lively discussion on the subject! For complete show notes, including links to items mentioned in this episode and a transcript of the show, visit the show page.

In today’s dynamic data environments, tables and schemas are constantly evolving, and keeping semantic layers up to date has become a critical operational challenge. Manual updates don’t scale, and delays can quickly lead to broken dashboards, failed pipelines, and lost trust. We’ll show how to harness Apache Airflow 3 and its new event-driven scheduling capabilities to automate the entire lifecycle: detecting table and schema changes in real time, parsing and interpreting those changes, and shifting left the updates to semantic models across dbt, Looker, or custom metadata layers. AI agents will add intelligence and automation that rationalize schema diffs, assess the impact of changes, and propose targeted updates to semantic layers, reducing manual work and minimizing the risk of errors. We’ll dive into strategies for efficient change detection, safe incremental updates, and orchestrating workflows where humans collaborate with AI agents to validate and deploy changes. By the end of the session, you’ll understand how to build resilient, self-healing semantic layers that minimize downtime, reduce manual intervention, and scale effortlessly across fast-changing data environments.
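As a rough companion to the workflow this session describes, here is a minimal sketch of an asset-triggered Airflow DAG that reacts to a schema change and proposes a semantic-layer update for human review. It assumes Airflow 3's Asset-based scheduling (imported from airflow.sdk); the asset URI and the diff/update helpers are hypothetical placeholders, not the presenters' implementation.

```python
# Minimal sketch: react to a schema-change event and propose a semantic-layer
# update. Assumes Airflow 3's Asset-based scheduling via airflow.sdk
# (in Airflow 2.x the rough equivalent is Dataset-based scheduling).
from airflow.sdk import Asset, dag, task

# Hypothetical asset representing the schema of an upstream table.
orders_schema = Asset("bigquery://analytics/orders/schema")


@dag(schedule=[orders_schema], catchup=False)
def refresh_semantic_layer():
    @task
    def diff_schema() -> dict:
        # Hypothetical: compare the latest table schema with the version the
        # semantic model (dbt / LookML) was generated from.
        return {"added": ["discount_pct"], "removed": [], "changed": []}

    @task
    def propose_update(schema_diff: dict) -> None:
        # Hypothetical: open a pull request against the dbt/LookML repo so a
        # human (or an AI agent plus a human reviewer) can validate and merge.
        print(f"Proposing semantic-layer update for diff: {schema_diff}")

    propose_update(diff_schema())


refresh_semantic_layer()
```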

The modern data stack has transformed how organizations work with data, but are our BI tools keeping pace with these changes? As data schemas become increasingly fluid and analysis needs range from quick explorations to production-grade reporting, traditional approaches are being challenged. How can we create analytics experiences that accommodate both casual spreadsheet users and technical data modelers? With semantic layers becoming crucial for AI integration and data governance growing in importance, what skills do today's BI professionals need to master? Finding the balance between flexibility and governance is perhaps the greatest challenge facing data teams today.

Colin Zima is the Co-Founder and CEO of Omni, a business intelligence platform focused on making data more accessible and useful for teams of all sizes. Prior to Omni, he was Chief Analytics Officer and VP of Product at Looker, where he helped shape the product and data strategy leading up to its acquisition by Google for $2.6 billion. Colin’s background spans roles in data science, analytics, and product leadership, including positions at Google, HotelTonight, and as founder of the restaurant analytics startup PrimaTable. He holds a degree in Operations Research and Financial Engineering from Princeton University and began his career as a Structured Credit Analyst at UBS.

In the episode, Richie and Colin explore the evolution of BI tools, the challenges of integrating casual and rigorous data analysis, the role of semantic layers, and the impact of AI on business intelligence. They discuss the importance of understanding business needs, creating user-focused dashboards, and the future of data products, and much more.

Links Mentioned in the Show:
Omni
Connect with Colin
Skill Track: Design in Power BI
Related Episode: Self-Service Business Intelligence with Sameer Al-Sakran, CEO at Metabase
Register for RADAR AI - June 26

New to DataCamp? Learn on the go using the DataCamp mobile app
Empower your business with world-class data and AI skills with DataCamp for business

Scaling Trust in BI: How Bolt Manages Thousands of Metrics Across Databricks, dbt, and Looker

Managing metrics across teams can feel like everyone’s speaking a different language, which often leads to loss of trust in numbers. Based on a real-world use case, we’ll show you how to establish a governed source of truth for metrics that works at scale and builds a solid foundation for AI integration. You’ll explore how Bolt.eu’s data team governs consistent metrics for different data users and leverages Euno’s automations to navigate the overlap between Looker and dbt. We’ll cover best practices for deciding where your metrics belong and how to optimize engineering and maintenance workflows across Databricks, dbt and Looker. For curious analytics engineers, we’ll dive into thinking in dimensions & measures vs. tables & columns and determining when pre-aggregations make sense. The goal is to help you contribute to a self-serve experience with consistent metric definitions, so business teams and AI agents can access the right data at the right time without endless back-and-forth.

In this episode of Hub & Spoken, Jason Foster speaks with Colin Zima, CEO and Co-founder of Omni, a modern business intelligence platform that combines the best of governance and usability. With a background spanning roles at Looker and Google, and two decades as both a data user and builder, Colin brings a unique perspective on the evolution of BI and the real role of AI in shaping its future. They explore why business intelligence remains critical for aligning organisations, how AI is raising the bar for access and self-service, and why semantics and business logic are more important than ever. The conversation challenges the notion that AI will replace dashboards, and instead focuses on how it can enhance accessibility, support different user needs, and empower data teams to work more efficiently. This episode is essential listening for business and data leaders thinking about the future of BI, the practical use of AI, and the role data teams play in delivering real value at speed. Tune in to hear how modern BI is evolving, and what leaders need to know to stay ahead.

Cynozure is a leading data, analytics and AI company that helps organisations to reach their data potential. It works with clients on data and AI strategy, data management, data architecture and engineering, analytics and AI, data culture and literacy, and data leadership. The company was named one of The Sunday Times' fastest-growing private companies in both 2022 and 2023 and recognised as The Best Place to Work in Data by DataIQ in 2023 and 2024. Cynozure is a certified B Corporation.

Looker’s AI-first analytics experience, with a conversational interface, enables all users in your organization to leverage trusted data and make better decisions. Discover how you can lay the foundations to deliver best-in-class conversational AI experiences. Join us, along with a cohort of your peers, to participate in discussions around foundational strategies for conversational AI and share existing use cases and experiences.

This session explores how Looker and Google Workspace work together to bring data-driven decisions right into your everyday workflow. Learn how to leverage Looker's spreadsheet-based business intelligence (BI) capabilities with Connected Sheets, integrate Looker with other Workspace apps, automate slide generation, and use the advanced features of Looker Studio Pro. And view a live demo of Looker BI agents in Google Chat. Ideal for anyone looking to streamline their data workflow and enhance collaboration within their Workspace environment.

Looker is evolving! Join us for a deep dive into the reimagined Looker experience, where data exploration, analysis, and reporting are more intuitive, collaborative, and powerful than ever before. In this session, we’ll explore how Studio in Looker combines the flexibility of Looker Studio visualizations and reporting with the governance and trustworthiness of the Looker semantic layer.

Supercharge your applications and drive revenue growth with Looker Embedded. This session reveals how to seamlessly integrate powerful data experiences and AI-driven insights directly into your products. Discover how to increase customer engagement, unlock new revenue streams, and gain a competitive advantage. Learn how to reduce development time and costs by leveraging the robust Looker platform. Join us to learn how Looker can help you transform your applications and deliver actionable intelligence to your customers.
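For readers who want a concrete starting point, here is a hedged sketch of one common Looker Embedded building block: generating a signed SSO embed URL with the official Looker Python SDK. The dashboard path, model name, permissions, and user attributes are placeholders, and the method and parameter names reflect the Looker API 4.0 SDK as best understood here, not code from this session.

```python
# Hedged sketch: create a signed embed URL for a Looker dashboard using the
# Looker Python SDK (pip install looker-sdk). Credentials come from a
# looker.ini file or LOOKERSDK_* environment variables. All IDs, permissions,
# and attributes below are placeholders.
import looker_sdk
from looker_sdk import models40 as models

sdk = looker_sdk.init40()  # Looker API 4.0 client

response = sdk.create_sso_embed_url(
    body=models.EmbedSsoParams(
        target_url="https://yourcompany.looker.com/embed/dashboards/42",
        session_length=3600,
        force_logout_login=True,
        external_user_id="customer-123",
        permissions=["access_data", "see_looks", "see_user_dashboards"],
        models=["ecommerce"],
        user_attributes={"customer_id": "123"},
    )
)

# Load this URL in an <iframe> inside your application to render the dashboard.
print(response.url)
```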

Unlock the power of natural language with Looker Agents! This technical deep dive will walk you through an agentic architecture in Looker Conversational Analytics and showcase how the Chief Product Officer of Zeotap is helping Zeotap customers “chat with their data” within the Zeotap platform using the new Conversational Analytics API. Learn how to build custom data agents, answer questions in Workspace, and create analytics applications with the power of conversational AI.

Unlock the full potential of your data with the power of AI. This session explores how Google Cloud’s latest advancements in AI are transforming business intelligence, empowering you to gain deeper insights, make faster decisions, and drive innovation. We’ll dive into trusted insights with BigQuery and AI, the power of the Looker semantic layer, and how Google Cloud’s AI-powered business intelligence (BI) solutions can help you transform your data into actionable intelligence and drive business success.

This scalable, AI-powered data quality solution requires minimal coding and maintenance. It learns about your data products to improve data quality across multiple dimensions. The framework uses BigQuery, BQML, Dataform, and Looker to deliver a comprehensive and automated Data Quality solution with a unified user experience for both data platform owners and business users.

session
by Ani Jain (Google Cloud), Vijay Venugopal (Google Cloud), Anssi Rusi (SuperMetrics), Marc Wollnik (Google Cloud)
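The session's own framework isn't reproduced here, but as a hedged sketch of one ingredient such a solution might use, the snippet below trains a BQML ARIMA_PLUS model on a table's daily row counts and flags anomalous days with ML.DETECT_ANOMALIES, driven from the BigQuery Python client. The dataset, table, and model names are placeholders.

```python
# Hedged sketch: volume-based data quality check with BigQuery ML, run from
# the BigQuery Python client (pip install google-cloud-bigquery).
# Dataset, table, and model names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

# Train (or refresh) a time-series model on the table's daily row counts.
client.query("""
CREATE OR REPLACE MODEL dq.orders_rowcount_model
OPTIONS (model_type = 'ARIMA_PLUS',
         time_series_timestamp_col = 'day',
         time_series_data_col = 'row_count') AS
SELECT DATE(created_at) AS day, COUNT(*) AS row_count
FROM analytics.orders
GROUP BY day
""").result()

# Flag days whose volume deviates from the learned seasonal pattern.
anomalies = client.query("""
SELECT day, row_count, anomaly_probability
FROM ML.DETECT_ANOMALIES(MODEL dq.orders_rowcount_model,
                          STRUCT(0.99 AS anomaly_prob_threshold))
WHERE is_anomaly
""").result()

for row in anomalies:
    print(f"Volume anomaly on {row.day}: {row.row_count} rows "
          f"(p={row.anomaly_probability:.2f})")
```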

Imagine a world where interacting with your data is as simple as chatting with a friend. Gemini in Looker makes this a reality, bringing the power of Google's most advanced AI models and agents directly to your BI workflows. This session explores how Gemini's genAI capabilities and Looker Agents are being integrated into Looker, empowering users to analyze data, build dashboards, and generate insights using natural language. Discover how this powerful combination unlocks new levels of productivity for BI professionals and business users alike.

Small teams often struggle to unlock the full potential of their data due to limited resources and a lack of specialized expertise. In this session, we’ll show you how BigQuery and Looker make advanced analytics accessible and easy for teams of any size. Learn how to seamlessly integrate, analyze, and visualize your data to drive data-driven decisions and achieve your business goals.

Discover how Google’s interconnected ecosystem of Google Cloud platform and specialty solutions can address the needs and challenges of resource-constrained IT teams. We’ll delve into practical use cases and demonstrate how Google Cloud’s specialized business intelligence platform (Looker) and security solutions (Google Security Operations, Mandiant) can help your business improve efficiency and reduce costs while improving your security posture.

This talk will demonstrate how SAP users can leverage the Looker Explore Assistant Chatbot to gain insights into their SAP ERP data residing on Google Cloud's BigQuery, using natural language prompts. We will address common challenges in accessing and analyzing SAP data, such as ETL processes and complex data models. Additionally, we will provide an introduction to Generative AI and Large Language Models (LLMs), as well as an overview of Looker Explore Assistant and Chatbot's capabilities.