talk-data.com

Topic: Jira
Tags: project_management, issue_tracking, agile
2 tagged activities

Activity Trend: 5 peak/qtr (2020-Q1 to 2026-Q1)

Activities

Showing filtered results
Filtering by: Brian O'Neill

Today I am going to try to answer a fundamental question: how should you actually measure user experience, especially with data products—and tie this to business value? It's easy to get lost in analytics and think we're seeing the whole picture, but I argue that this is far from the truth. Product leaders need to understand the subjective experience of our users—and unfortunately, analytics does not tell us this.

The map is not the territory.

In this episode, I discuss why qualitative data and subjective experience are the data that will most help you make product decisions that lead to increased business value. If users aren't getting value from your product(s), and their lives aren't improving, business value will be extremely difficult to create. So today, I share my thoughts on how to move beyond thinking that analytics is the only way to track UX, and how this shift helps product leaders uncover opportunities to produce better organizational value.

Ultimately, it’s about creating indispensable solutions and building trust, which is key for any product team looking to make a real impact. Hat tip to UX guru Jared Spool who inspired several of the concepts I share with you today.

Highlights/ Skip to 

Don't target adoption for adoption's sake, because product usage can be a tax or a benefit (3:00)
Why your analytical mind may bias you—and what changes you might have to make to do this type of product and user research work (7:31)
How "making the user's life better" translates to organizational value (10:17)
Using Jared Spool's roller coaster chart to measure your product's user experience and find your opportunities and successes (13:05)
How do you measure that you have done a good job with your UX? (17:28)
Conclusions and final thoughts (21:06)

Quotes from Today’s Episode

Usage doesn't automatically equal value. Analytics on your analytics is not telling you useful things about user experience or satisfaction. Why? "The map is not the territory." Analytics measure computer metrics, not feelings, and let's face it, users aren't always rational. To truly gauge user value, we need qualitative research: we need to talk to users and hear what their subjective experience is. Want meaningful adoption? Talk to and observe your users. That's how you know you are actually making things better. When it's better for them, the business value will follow. (3:12)

Make better things—where "better" is a measurement based on the subjective experience of the user, not analytics. Usable doesn't mean they will necessarily want it. Sessions and page views don't tell you how people feel about your product. (7:39)

Think about the dreadful tools you and so many have been forced to use: the things that waste your time and don't let you focus on what's really important. Ever talked to a data scientist who is sick of doing data prep instead of building models, and wondering, "why am I here? This isn't what I went to school for"? Ignoring these personal frustrations and feelings and focusing only on your customers' feature requests, JIRA tickets, stakeholder orders, requirements docs, and backlog items is why many teams end up building technically right, effectively wrong solutions. These end-user frustrations are where we find our opportunities to delight—and create products and UXs that matter. To improve their lives, we need to dig into their workflows, identify frustrations, and understand the context around our data product solutions. Product leaders need to fall in love with the problems and the frustrations—these are the magic keys to the value kingdom. However, to do this well, you probably need to be doing less delivery and more discovery. (10:27)

Imagine a line chart with a Y-axis that runs from "frustration" at the bottom to "delight" at the top. The X-axis is the user experience, taking place over time. As somebody uses your data product to do their job or task, you can plot their emotional journey: get the data, format the data, include the data in a tool, derive some conclusion, challenge the data, share it, make a decision, and so on. As a product manager, you probably know what a use case looks like. Your first job is to plot the existing experience of doing that use case with your data product. Where are they frustrated? Where are they delighted? Celebrate your peaks/delighters, and fall in love with the valleys where satisfaction work needs to be done. Connect the dots between these valleys and business value. Address the valleys—especially the ones that impede business value—and you'll be on your way to "showing the value of your data product." Analytics on your data product won't tell you this information; the map is not the territory. (13:22)

Analytics about your data product are lying to you. They give you the facts about the product, but not about the user. An example? "Time spent" doing a task. How long is too long? 5 minutes? 50? Analytics will tell you precisely how long it took. The problem is, it won't tell you how long it FELT like it took. And guess what? Your customers and users only care about how long it felt like it took versus their expectation. Sure, at some point, analytics might help you understand—at scale—how your data product is doing, but first you have to understand how people FEEL about it. Only then will you know whether 5 minutes or 50 minutes is telling you anything meaningful about what, if anything, needs to change. (16:17)
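The roller-coaster exercise described above can be sketched in a few lines of code. Everything below is a hypothetical illustration: the journey steps and satisfaction scores are invented, and `find_valleys` is just a toy local-minimum finder for spotting frustration points, not a method from the episode.

```python
# Hypothetical emotional-journey data: each step of a use case paired with a
# satisfaction score from -5 (deep frustration) to +5 (delight).
# These step names and scores are illustrative only.
journey = [
    ("get the data", -3),
    ("format the data", -4),
    ("load into tool", 1),
    ("derive conclusion", 3),
    ("challenge the data", -2),
    ("share it", 2),
    ("make a decision", 4),
]

def find_valleys(journey, threshold=0):
    """Return steps whose score is a local minimum below `threshold` --
    the frustration valleys a product team should investigate first."""
    valleys = []
    for i, (step, score) in enumerate(journey):
        left = journey[i - 1][1] if i > 0 else float("inf")
        right = journey[i + 1][1] if i < len(journey) - 1 else float("inf")
        if score < threshold and score <= left and score <= right:
            valleys.append(step)
    return valleys

print(find_valleys(journey))  # → ['format the data', 'challenge the data']
```

The valleys this surfaces are candidates, not answers: the next step in the episode's framing is to connect each valley to business value and prioritize the ones that impede it.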

A challenge I frequently hear about from subscribers to my insights mailing list is how to design B2B data products for multiple user types with differing needs. From dashboards to custom apps and commercial analytics / AI products, data product teams often struggle to create a single solution that meets the diverse needs of technical and business users in B2B settings. If you're encountering this issue, you're not alone!

In this episode, I share my advice for tackling this challenge, including the gift of saying "no." What are the patterns you should be looking out for in your customer research? How can you choose what to focus on with limited resources? What are the design choices you should avoid when trying to build these products? I'm hoping that by the end of this episode, you'll have some strategies to help reduce the size of this challenge, particularly if you lack a dedicated UX team to help you sort through your various user/stakeholder demands.

Highlights/ Skip to 

The importance of proper user research and clustering "jobs to be done" around business importance vs. task frequency—ignoring the rest until your solution can show measurable value (4:29)
What "level" of skill to design for, and why "as simple as possible" isn't what I generally recommend (13:44)
When it may be advantageous to use role- or feature-based permissions to hide/show/change certain aspects, UI elements, or features (19:50)
Leveraging AI and LLMs in-product to allow learning about the user and progressive disclosure and customization of UIs (26:44)
Leveraging the "old" solution of rapid prototyping—which is now faster than ever with AI, and can accelerate learning (capturing user feedback) (31:14)
5 things I do not recommend doing when trying to satisfy multiple user types in your B2B AI or analytics product (34:14)

Quotes from Today’s Episode

If you're not talking to your users and stakeholders sufficiently, you're going to have a really tough time building a successful data product for one user, let alone for multiple personas. Listen for repeating patterns in what your users are trying to achieve (the tasks they are doing). Focus on the jobs and tasks they do most frequently, or the ones that bring the most value to their business. Forget about the rest until you've proven that your solution delivers real value for those core needs. It's more about understanding the problems and needs, not just the solutions. The solutions tend to be easier to design when the problem space is well understood. Users often suggest solutions, but it's our job to focus on the core problem we're trying to solve; simply entering inbound requests verbatim into JIRA and then "eating away" at the list is not usually a reliable strategy. (5:52)

I generally recommend not going for "as easy as possible" at the cost of shallow value. Instead, you're going to want to design for some "mid-level" ability, understanding that this may make early user experiences with the product more difficult. Why? Oversimplification can mislead, because data is complex, problems are multivariate, and data isn't always ideal. There are also "n" number of "not-first" impressions users will have with your product, which means there is only one "first impression." The idea, conceptually, is to design an amazing experience for those "n" experiences, but not to the point that users never realize value and give up on the product. While I'd prefer no friction, technical products sometimes have to have a little friction up front. However, don't use this as an excuse for poor design. This is hard to get right, even when you have design resources, and it's why UX design matters: thinking this through ends up determining, in part, whether users obtain the promise of value you made to them. (14:21)

As an alternative to rigid role- and feature-based permissions in B2B data products, you might consider leveraging AI and/or LLMs in your UI as a means of simplifying and customizing the UI for particular users. This approach allows users to interrogate the product about the UI and customize it, and it can even learn about the user's questions (jobs to be done) such that the UI becomes organically customized over time to their needs. This is in contrast to the rigid buckets that role- and permission-based customization present. However, as discussed in my previous episode (164 - "The Hidden UX Taxes that AI and LLM Features Impose on B2B Customers Without Your Knowledge"), designing effective AI features and capabilities can also make things worse, due to the probabilistic nature of the responses GenAI produces. As such, this approach may benefit from a UX designer or researcher familiar with designing data products. Understanding what "quality" means to the user, and how to measure it, is especially critical if you're going to leverage AI and LLMs to make the product UX better. (20:13)

The old solution of rapid prototyping is even more valuable now, because it's possible to prototype even faster. However, prototyping is not just about learning whether your solution is on track. Whether you use AI or pencil and paper, prototyping early in the product development process should be framed as a "prop to get users talking." In other words, it is a prop to facilitate problem and need clarity, not solution clarity. Its purpose is to spark conversation and determine if you're solving the right problem. As you iterate, your need to continually validate the problem should shrink, which will present itself in the form of consistent feedback you hear from end users. That is the point where you know you can focus on the design of the solution. Innovation happens when we learn, so the goal is to increase your learning velocity. (31:35)

Have you ever been caught in the trap of prioritizing feature requests based on volume? I get it. It's tempting to give the people what they think they want. For example, imagine ten users clamoring for control over specific parameters in your machine learning forecasting model. You could give them that control, thinking you're solving the problem because, hey, that's what they asked for! But did you stop to ask why they want that control? The reasons behind those requests could be wildly different. By simply handing over the keys to all the model parameters, you might be creating a whole new set of problems. Users now face a "usability tax," trying to figure out which parameters to lock and which to let float. The key takeaway? Focus on the frequency with which the same problems occur across your users, not just the frequency with which a given tactic or "solution" method (e.g., a "model," "dashboard," or "feature") appears in a stakeholder or user request. Remember, problems are often disguised as solutions. We've got to dig deeper and uncover the real needs, not just address the symptoms. (36:19)
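The "count problems, not requested solutions" idea above can be made concrete with a toy sketch. The requests and underlying problems below are entirely invented for illustration; the point is only that tallying by requested solution and tallying by underlying problem can rank priorities very differently.

```python
from collections import Counter

# Hypothetical inbound requests, each paired with the underlying problem
# uncovered by asking "why?". Both columns are invented examples.
requests = [
    ("expose model parameters", "forecast looks wrong for my region"),
    ("expose model parameters", "need to explain the forecast to my boss"),
    ("add a dashboard filter", "forecast looks wrong for my region"),
    ("expose model parameters", "want to test what-if scenarios"),
    ("export to CSV", "need to explain the forecast to my boss"),
]

by_solution = Counter(solution for solution, _ in requests)
by_problem = Counter(problem for _, problem in requests)

# Ranking by requested solution would put "expose model parameters" first
# with 3 votes...
print(by_solution.most_common(1))  # → [('expose model parameters', 3)]

# ...but the problem view shows two distinct underlying needs tied at 2 each,
# neither of which is necessarily solved by exposing parameters.
print(by_problem.most_common(2))
```

The design choice here mirrors the quote: the unit you count determines what you prioritize, so tally the problem behind each request rather than the solution named in it.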