talk-data.com


Todd Olson joins me to talk about making analytics worth paying for and relevant in the age of AI. The CEO of Pendo, an analytics SaaS company, Todd shares how the company evolved to support a wider audience by simplifying dashboards, removing user roadblocks, and leveraging AI to both generate and explain insights. We also discuss the role of product management at Pendo. Todd views AI product management as a natural evolution for adaptable teams and explains how he thinks about hiring product roles in 2025. Todd also shares how he measures successful user adoption of his product, favoring “time to value” and “stickiness” over vanity metrics like time spent.

Highlights/ Skip to:

How Todd has addressed analytics apathy over the past decade at Pendo (1:17)
Getting back to basics and not barraging people with more data and power (4:02)
Pendo’s strategy for keeping the product experience simple without abandoning power users (6:44)
Whether Todd is considering using an LLM (prompt-based) answer-driven experience with Pendo's UI (8:51)
What Pendo looks for when hiring product managers right now, and why (14:58)
How Pendo evaluates AI product managers, specifically (19:14)
How Todd Olson views AI product management compared to traditional software product management (21:56)
Todd’s concerns about the probabilistic nature of AI-generated answers in the product UX (27:51)
What KPIs Todd uses to know whether Pendo is doing enough to reach its goals (32:49)
Why being able to tell what answers are best will become more important as choice increases (40:05)

Quotes from Today’s Episode

“Let’s go back to classic Geoffrey Moore, Crossing the Chasm: you’re selling to early adopters. And what you’re doing is you’re relying on the early adopters’ skill set and figuring out how to take this data and connect it to business problems. So, in the early days, we didn’t do anything because the market we were selling to was very, very savvy; they’re hungry people, they just like new things. They’re getting data, they’re feeling really, really smart, everything’s working great. As you get bigger and bigger and bigger, you start to try to sell to a bigger TAM, a bigger audience, and you start trying to talk to these early majorities, which are not early adopters; they’re more technology laggards to some degree, and they don’t understand how to use data to inform their job. They’ve never used data to inform their job. There, we’ve had to do a lot more work.” Todd (2:04 - 2:58)

“I think AI is amazing, and I don’t want to say AI is overhyped because AI in general is—yeah, it’s the revolution that we all have to pay attention to. Do I think that the skills necessary to be an AI product manager are so distinct that you need to hire differently? No, I don’t. That’s not what I’m seeing. If you have a really curious product manager who’s going all in, I think you’re going to be okay. Some of the most AI-forward work happening at Pendo is not just product management. Our design team is going crazy. And I think one of the things that we’re seeing is a blend between design and product, that they’re always adjacent and connected; there’s more sort of overlappiness now.” Todd (22:41 - 23:28)

“I think about things like stickiness, which may not be an aggregate time, but how often are people coming back and checking in? And if you had this companion or this agent that you just could not live without, and it caused you to come into the product almost every day just to check in, but it’s a fast check-in, like, a five-minute check-in, a ten-minute check-in, that’s pretty darn sticky. That’s a good metric. So, I like stickiness as a metric because it’s measuring [things like], “Are you thinking about this product a lot?” And if you’re thinking about it a lot, and you can’t live without it, you’re going to go to it a lot, even if it’s only a few minutes a day. Social media is like that. Thankfully I’m not addicted to TikTok or Instagram or anything like that, but I probably check it nearly every day. That’s a pretty good metric. Any product that’s part of your daily process, that you’re checking every day, is pretty darn good. So, I think we need to reframe the conversation around not just total time, but how we’re measuring outcomes and value. I think that’s what’s ultimately going to win here.” Todd (39:57)
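For readers who want to make the “stickiness” idea concrete: it is commonly operationalized as the ratio of daily to monthly active users (DAU/MAU), which rewards frequent short check-ins rather than long aggregate session time. The sketch below is a minimal illustration of that common industry definition, not Pendo’s actual methodology; the function name and sample data are hypothetical.

```python
from datetime import date

def stickiness(daily_active: dict, monthly_active_users: set) -> float:
    """DAU/MAU stickiness: the average share of the monthly audience
    that shows up on a given day. 1.0 means every monthly user returns daily."""
    if not monthly_active_users or not daily_active:
        return 0.0
    # Average daily active users across the observed days.
    avg_dau = sum(len(users) for users in daily_active.values()) / len(daily_active)
    return avg_dau / len(monthly_active_users)

# Hypothetical sample: three days of logins from a 4-user monthly base.
daily = {
    date(2025, 1, 1): {"ana", "ben"},
    date(2025, 1, 2): {"ana", "ben", "cruz"},
    date(2025, 1, 3): {"ana"},
}
monthly = {"ana", "ben", "cruz", "dee"}
print(round(stickiness(daily, monthly), 2))  # 2.0 avg DAU / 4 MAU = 0.5
```

Note that a five-minute daily check-in scores just as well here as an hour-long session, which is exactly the reframing away from “total time spent” described above.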

Links

LinkedIn: https://www.linkedin.com/in/toddaolson/
X: https://x.com/tolson
Email: [email protected]

Today I’m joined by Marnix van de Stolpe, Product Owner at Coolblue in the area of data science. Throughout our conversation, Marnix shares the story of how he joined a data science team that was developing a solution that was too focused on the delivery of a data-science metric that was not on track to solve a clear customer problem. We discuss how Marnix came to the difficult decision to throw out 18 months of data science work, what it was like to switch to a human-centered, product approach, and the challenges that came with it. Marnix shares the impact this decision had on his team and the stakeholders involved, as well as the impact on his personal career and the advice he would give to others who find themselves in the same position. Marnix is also a Founding Member of the Data Product Leadership Community and will be going much more into the details and his experience live on Zoom on November 16 @ 2pm ET for members.

Highlights/ Skip to:

I introduce Marnix, Product Owner at Coolblue and one of the original members of the Data Product Leadership Community (00:35)
Marnix describes what Coolblue does and his role there (01:20)
Why and how Marnix decided to throw away 18 months of machine learning work (02:51)
How Marnix determined that the KPI (metric) being created wasn’t enough to deliver a valuable product (07:56)
Marnix describes the conversation with his data science team on mapping the solution back to the desired outcome (11:57)
What the culture is like at Coolblue now when developing data products (17:17)
Marnix’s advice for data product managers who are coming into an environment where existing work is not tied to a desired outcome (18:43)
Marnix and I discuss why data literacy is not the solution to making more impactful data products (21:00)
The impact that Marnix’s human-centered approach to data product development has had on the stakeholders at Coolblue (24:54)
Marnix shares the ultimate outcome of the product his team was developing to measure product returns (31:05)
How you can get in touch with Marnix (33:45)

Links

Coolblue: https://www.coolblue.nl
LinkedIn: https://www.linkedin.com/in/marnixvdstolpe/

Today I’m chatting with Osian Jones, Head of Product for the Data Platform at Stuart. Osian describes how impact and ROI can be difficult metrics to measure in a data platform, and how the team at Stuart has sought to answer this challenge. He also reveals how user experience is intrinsically linked to adoption and the technical problems that data platforms seek to solve. Throughout our conversation, Osian shares a holistic overview of what it was like to design a data platform from scratch, the lessons he’s learned along the way, and the advice he’d give to other data product managers taking on similar projects. 

Highlights/ Skip to:

Osian describes his role at Stuart (01:36)
Brian and Osian explore the importance of creating an intentional user experience strategy (04:29)
Osian explains how having a clear mission enables him to create parameters to measure product success (11:44)
How Stuart developed the KPIs for their data platform (17:09)
Osian gives his take on the pros and cons of how data departments are handled in regards to company oversight (21:23)
Brian and Osian discuss how vital it is to listen to your end users rather than relying on analytics alone to measure adoption (26:50)
Osian reveals how he and his team went about designing their platform (31:33)
What Osian learned from building out the platform and what he would change if he had to tackle a data product like this all over again (36:34)

Quotes from Today’s Episode

“Analytics has been treated very much as a technical problem, and very much so on the data platform side, which is more on the infrastructure and the tooling to enable analytics to take place. And so, viewing that purely as a technical problem left us at odds in a way, compared to [teams that had] a product leader, where the user was the focus [and] the user experience was very much driving a lot of what was roadmap.” — Osian Jones (03:15)

“Whenever we get this question of what’s the impact? What’s the value? How does it impact our company top line? How does it impact our company OKRs? This is when we start to panic sometimes, as data platform leaders because that’s an answer that’s really challenging for us, simply because we are mostly enablers for analytics teams who are themselves enablers. It’s almost like there’s two different degrees away from the direct impact that your team can have.” — Osian Jones (12:45)

“We have to start with a very clear mission. And our mission is to empower everyone to make the best data-driven decisions as fast as possible. And so, hidden within there, that’s a function of reducing time to insight, it’s also about maximizing trust and obviously minimizing costs.” — Osian Jones (13:48)

“We can track [metrics like reliability, incidents, time to resolution, etc.], but also there is a perception aspect to that as well. We can’t underestimate the importance of listening to our users and qualitative data.” — Osian Jones (30:16)

“These were questions that I felt that I naturally had to ask myself as a product manager. … Understanding who our users are, what they are trying to do with data and what is the current state of our data platform—so those were the three main things that I really wanted to get to the heart of, and connecting those three things together.” – Osian Jones (35:29)

“The advice that I would give to anyone who is taking on the role of a leader of a data platform or a similar role is, you can easily get overwhelmed by just so many different use cases. And so, I would really encourage [leaders] to avoid that.” – Osian Jones (37:57)

“Really look at your data platform from an end-user perspective and almost think of it as if you were to put the data platform on a supermarket shelf, what would that look like? And so, for each of the different components, how would you market that in a single one-liner in terms of what can this do for me?” – Osian Jones (39:22)

Links

Stuart: https://stuart.com/
Article on IIA: https://iianalytics.com/community/blog/how-to-build-a-data-platform-as-a-product-a-retrospective
Experiencing Data Episode 80 with Doug Hubbard: https://designingforanalytics.com/resources/episodes/080-how-to-measure-the-impact-of-data-productsand-anything-else-with-forecasting-and-measurement-expert-doug-hubbard/
LinkedIn: https://www.linkedin.com/in/osianllwydjones/
Medium: https://medium.com/@osianllwyd

Today, I’m flying solo in order to introduce you to CED: my three-part UX framework for designing your ML / predictive / prescriptive analytics UI around trust, engagement, and indispensability. Why this, why now? I have had several people tell me that this has been incredibly helpful to them in designing useful, usable analytics tools and decision support applications. 

I have written about the CED framework before at the following link:

https://designingforanalytics.com/ced

There you will find an example of the framework put into a real-world context. In this episode, I wanted to add some extra color to what is discussed in the article. If you’re an individual contributor, the best part is that you don’t have to be a professional designer to begin applying this to your own data products. And for leaders of teams, you can use the ideas in CED as a “checklist” when trying to audit your team’s solutions in the design phase—before it’s too late or expensive to make meaningful changes to the solutions. 

CED is definitely easier to implement if you understand the basics of human-centered design, including research, problem finding and definition, journey mapping, consulting, and facilitation. If you need a step-by-step method to develop these foundational skills, my training program, Designing Human-Centered Data Products, might help. It comes in two formats: a Self-Guided Video Course and a bi-annual Instructor-Led Seminar.

Quotes from Today’s Episode

“‘How do we visualize the data?’ is the wrong starting question for designing a useful decision support application. That makes all kinds of assumptions: that we have the right information, that we know what the users’ goals and downstream decisions are, and that we know how our solution will make a positive change in the customer’s or user’s life.” - Brian (@rhythmspice) (02:07)

“The CED is a UX framework for designing analytics tools that drive decision-making. Three letters, three parts: Conclusions (C), Evidence (E), and Data (D). The tough pill for some technical leaders to swallow is that the application, tool, or product they are making may need to present what I call a ‘conclusion’—or, if you prefer, an ‘opinion.’ Why? Because many users do not want an ‘exploratory’ tool—even when they say they do. They often need an insight to start with, before exploration time becomes valuable.” - Brian (@rhythmspice) (04:00)

“CED requires you to do customer and user research to understand what the meaningful changes, insights, and things that people want or need actually are. Well designed ‘Conclusions’—when experienced in an analytics tool using the CED framework—often manifest themselves as insights such as unexpected changes, confirmation of expected changes, meaningful change versus meaningful benchmarks, scoring how KPIs track to predefined and meaningful ranges, actionable recommendations, and next best actions. Sometimes these Conclusions are best experienced as charts and visualizations, but not always—and this is why visualizing the data rarely is the right place to begin designing the UX.” - Brian (@rhythmspice) (08:54)

“If I see another analytics tool that promises ‘actionable insights’ but is primarily experienced as a collection of gigantic data tables with 10, 20, or 30+ columns of data to parse, your design is almost certainly going to frustrate, if not alienate, your users. Not because all table UIs are bad, but because you’ve put a gigantic tool-time tax on the user, forcing them to derive what the meaningful conclusions should be.” - Brian (@rhythmspice) (20:20)