talk-data.com

Speaker: Brian O’Neill (25 talks)
Podcast host, Designing for Analytics

Filtering by: Experiencing Data w/ Brian T. O’Neill (AI & data product management leadership—powered by UX design)

Talks & appearances

Showing 100 of 102 activities

Today I’m chatting with Bruno Aziza, Head of Data & Analytics at Google Cloud. Bruno leads a team of outbound product managers in charge of BigQuery, Dataproc, Dataflow, and Looker, and we dive deep into what Bruno looks for in terms of skills for these leaders. Bruno describes the three patterns of operational alignment he’s observed in data product management, as well as why he feels ownership and customer obsession are two of the most important qualities a good product manager can have. Bruno and I also dive into how to effectively abstract the core problem you’re solving, as well as how to determine whether a problem might be solved in a better way.

Highlights / Skip to:

Bruno introduces himself and explains how he created his “CarCast” podcast (00:45)
Bruno describes his role at Google, the product managers he leads, and the specific Google Cloud products in his portfolio (02:36)
What Bruno feels are the most important attributes to look for in a good data product manager (03:59)
Bruno details how a good product manager focuses on not only the core problem, but how the problem is currently solved and whether or not that’s acceptable (07:20)
What effectively abstracting the problem looks like in Bruno’s view, and why he positions product management as a way to help users move forward in their careers (12:38)
Why Bruno sees extracting value from data as the number one pain point for data teams and their respective companies (17:55)
Bruno gives his definition of a data product (21:42)
The three patterns Bruno has observed of operational alignment when it comes to data product management (27:57)
Bruno explains the best practices he’s seen for cross-team goal setting and problem-framing (35:30)

Quotes from Today’s Episode  

“What’s happening in the industry is really interesting. For people that are running data teams today and listening to us, the makeup of their teams is starting to look more like what we do [in] product management.” — Bruno Aziza (04:29)

“The problem is the problem, so focus on the problem, decompose the problem, look at the frictions that are acceptable, look at the frictions that are not acceptable, and look at how by assembling a solution, you can make it most seamless for the individual to go out and get the job done.” – Bruno Aziza (11:28)

“As a product manager, yes, we’re in the business of software, but in fact, I think you’re in the career management business. Your job is to make sure that whatever your customer’s job is that you’re making it so much easier that they, in fact, get so much more done, and by doing so they will get promoted, get the next job.” – Bruno Aziza (15:41)

“I think that is the task of any technology company, of any product manager that’s helping these technology companies: don’t be building a product that’s looking for a problem. Just start with the problem back and solution from that. Just make sure you understand the problem very well.” – Bruno Aziza (19:52)

“If you’re a data product manager today, you look at your data estate and you ask yourself, ‘What am I building to save money? When am I building to make money?’ If you can do both, that’s absolutely awesome. And so, the data product is an asset that has been built repeatedly by a team and generates value out of data.” – Bruno Aziza (23:12)

“[Machine learning is] hard because multiple teams have to work together, right? You got your business analyst over here, you’ve got your data scientists over there, they’re not even the same team. And so, sometimes you’re struggling with just the human aspect of it.” – Bruno Aziza (30:30)

“As a data leader, an IT leader, you got to think about those soft ways to accomplish the stuff that’s binary, that’s the hard [stuff], right? I always joke, the hard stuff is the soft stuff for people like us because we think about data, we think about logic, we think, ‘Okay if it makes sense, it will be implemented.’ For most of us, getting stuff done is through people. And people are emotional, how can you express the feeling of achieving that goal in emotional value?” – Bruno Aziza (37:36)

Links
“Good Product Manager/Bad Product Manager” (as referenced by Bruno): https://a16z.com/2012/06/15/good-product-managerbad-product-manager/
LinkedIn: https://www.linkedin.com/in/brunoaziza/
Bruno’s Medium article on Competing Against Luck by Clayton M. Christensen: https://brunoaziza.medium.com/competing-against-luck-3daeee1c45d4
The Data CarCast on YouTube: https://www.youtube.com/playlist?list=PLRXGFo1urN648lrm8NOKXfrCHzvIHeYyw

Today I’m chatting with returning guest Tom Davenport, who is a Distinguished Professor at Babson College, a Visiting Professor at Oxford, a Research Fellow at MIT, and a Senior Advisor to Deloitte’s AI practice. He is also the author of three new books (!) on AI, and in this episode we’re discussing the role of product orientation in enterprise data science teams, the skills required, what he’s seeing in the wild in terms of teams adopting this approach, and the value it can create. Back in episode 26, Tom was a guest on my show and gave the data science/analytics industry an approximate “2 out of 10” rating in terms of its ability to generate value with data. So, naturally, I asked him for an update on that rating, and he kindly obliged. How are you all doing? Listen in to find out!

Highlights / Skip to:

Tom provides an updated rating (between 1-10) as to how well he thinks data science and analytics teams are doing these days at creating economic value (00:44)
Why Tom believes that “motivation is not enough for data science work” (03:06)
Tom provides his definition of what data products are and some opinions on other industry definitions (04:22)
How Tom views the rise of taking a product approach to data roles and why data products must be tied to value (07:55)
Tom explains why he feels top-down executive support is needed to drive a product orientation (11:51)
Brian and Tom discuss how they feel companies should prioritize true data products versus more informal AI efforts (16:26)
The trends Tom sees in the companies and teams that are implementing a data product orientation (19:18)
Brian and Tom discuss the models they typically see for data teams and their key components (23:18)
Tom explains the value and necessity of data product management (34:49)
Tom describes his three new books (39:00)

Quotes from Today’s Episode

“Data science in general, I think has been focused heavily on motivation to fit lines and curves to data points, and that particular motivation certainly isn’t enough in that even if you create a good model that fits the data, it doesn’t mean at all that is going to produce any economic value.” – Tom Davenport (03:05)

“If data scientists don’t worry about deployment, then they’re not going to be in their jobs for terribly long because they’re not providing any value to their organizations.” – Tom Davenport (13:25)

“Product also means you got to market this thing if it’s going to be successful. You just can’t assume because it’s a brilliant algorithm with capturing a lot of area under the curve that it’s somehow going to be great for your company.” – Tom Davenport (19:04)

“[PM is] a hard thing, even for people in non-technical roles, because product management has always been a sort of ‘minister without portfolio’ sort of job, and you know, influence without formal authority, where you are responsible for a lot of things happening, but the people don’t report to you, generally.” – Tom Davenport (22:03)

“This collaboration between a human being making a decision and an AI system that might in some cases come up with a different decision but can’t explain itself, that’s a really tough thing to do [well].” – Tom Davenport (28:04)

“This idea that we’re going to use externally-sourced systems for ML is not likely to succeed in many cases because, you know, those vendors didn’t work closely with everybody in your organization” – Tom Davenport (30:21)

“I think it’s unlikely that [organizational gaps] are going to be successfully addressed by merging everybody together in one organization. I think that’s what product managers do is they try to address those gaps in the organization and develop a process that makes coordination at least possible, if not true, all the time.” – Tom Davenport (36:49)

Links
Tom’s LinkedIn: https://www.linkedin.com/in/davenporttom/
Tom’s Twitter: https://twitter.com/tdav
All-in On AI by Thomas Davenport & Nitin Mittal, 2023
Working With AI by Thomas Davenport & Steven Miller, 2022
Advanced Introduction to AI in Healthcare by Thomas Davenport, John Glaser, & Elizabeth Gardner, 2022
Competing On Analytics by Thomas Davenport & Jeanne G. Harris, 2007

Today I’m chatting with former-analyst-turned-design-educator Jeremy Utley of the Stanford d.school and co-author of Ideaflow. Jeremy reveals the psychology behind great innovation, and the importance of creating psychological safety for a team to generate what they may view as bad ideas. Jeremy speaks to the critical collision of unrelated frames of reference when problem-solving, as well as why creativity is actually more of a numbers game than awaiting that singular stroke of genius. Listen as Jeremy gives real-world examples of how to practice and measure (!) your innovation efforts and apply them to data products.

Highlights/ Skip to:

Jeremy explains the methodology of thinking he’s adopted after moving from highly analytical roles to the role he’s in now (01:38)
The approach Jeremy takes to the existential challenge of balancing innovation with efficiency (03:54)
Brian shares a story of a creative breakthrough he had recently, and Jeremy uses that to highlight how innovation often comes in a way contrary to normalcy and professionalism (09:37)
Why Jeremy feels innovation and creativity demand multiple attempts at finding solutions (16:13)
How to take an innovation-forward approach like the ones Jeremy has described when working on internal tool development (19:33)
Jeremy’s advice for accelerating working through bad ideas to get to the good ideas (25:18)
The approach Jeremy takes to generate a large volume of ideas, rather than focusing only on “good” ideas, including a real-life example (31:54)
Jeremy’s beliefs on the importance of creating psychological safety to promote innovation and creative problem-solving (35:11)

Quotes from Today’s Episode

“I’m in spreadsheets every day to this day, but I recognize that there’s a time and place when that’s the tool that’s needed, and then specifically, there’s a time and a place where that’s not going to help me and the answer is not going to be found in the spreadsheet.” – Jeremy Utley (03:13)

“There’s the question of, ‘Are we doing it right?’ And then there’s a different question, which is, ‘Are we doing the right “it”?’ And I think a lot of us tend to fixate on, ‘Are we doing it right?’ And we have an ability to perfectly optimize that what should not be done.” – Jeremy Utley (05:05)

“I think a vendetta that I have is against this wrong placement of—this exaltation of efficiency is the end-all, be-all. Innovation is not efficient. And the question is not how can I be efficient. It’s what is effective. And effectiveness, oftentimes when it comes to innovation and breaking through, doesn’t feel efficient.” – Jeremy Utley (09:17)

“The way the brain works, we actually understand it. The way breakthroughs work we actually understand them. The difficulty is it challenges our definitions of efficiency and professionalism and all of these things.” – Jeremy Utley (15:13)

“What’s the a priori probability that any solution is the right solution? Or any idea is a good idea? It’s exceptionally low. You have to be exceptionally arrogant to think that most of your ideas are good. They’re not. That’s fine, we don’t mind because then what’s efficient is actually to generate a lot.” – Jeremy Utley (26:20)

“If you don’t learn that nothing happens when the ball hits the floor, you can never learn how to juggle. And to me, it’s a really good metaphor. The teams that don’t learn nothing happens when they have a bad idea. Literally, the world does not end. They don’t get fired. They don’t get ridiculed. Now, if they do get fired or ridiculed, that’s a leadership problem.” – Jeremy Utley (35:59)

“[The following] is an essential question for a team leader to ask. Do people on my team have the freedom, at least with me, to share what they truly fear could be an incredibly stupid idea?” – Jeremy Utley (41:52)

Links
Ideaflow: https://www.amazon.com/Ideaflow-Only-Business-Metric-Matters-ebook/dp/B09R6M3292
Ideaflow website: https://ideaflow.design
Personal webpage: https://jeremyutley.design
LinkedIn: https://www.linkedin.com/in/jeremyutley/
Twitter: https://twitter.com/jeremyutley/
Brian’s musical arrangement of Gershwin’s “Prelude for Piano II featuring the Siamese Cat Song,” performed by Mr. Ho’s Orchestrotica - listen on Spotify

Today I’m discussing something we’ve been talking about a lot on the podcast recently - the definition of a “data product.” While my definition is still a work in progress, I think it’s worth putting out into the world at this point to get more feedback. In addition to sharing my definition of data products (defined the “producty” way), in today’s episode I also discuss some of the non-technical skills that data product managers (DPMs) in the ML and AI space need if they want to achieve good user adoption of their solutions. I’ll also share my thoughts on whether data scientists can make good data product managers, what a DPM can do to better understand your users and stakeholders, and how product and UX design factor into this role.

Highlights/ Skip to:

I introduce my reasons for sharing my definition of a data product (0:46)
My definition of data product (7:26)
Thinking the “producty” way (8:14)
My thoughts on necessary skills for data PMs (in particular, AI & machine learning product management) (12:21)
How data scientists can become good data product managers (DPMs) by taking off the data science hat (13:42)
Understanding the role of UX design within the context of DPM (16:37)
Crafting your sales and marketing strategies to emphasize the value of your product to the people who can use or purchase it (23:07)
How to build a team that will help you increase adoption of your data product (30:01)
How to build relationships with stakeholders/customers that allow you to find the right solutions for them (33:47)
Letting go of a technical identity to develop a new identity as a DPM who can lead a team to build a product that actually gets used (36:32)

Quotes from Today’s Episode

“This is what’s missing in some of the other definitions that I see around data products [...] they’re not talking about it from the customer of the data product lens. And that orientation sums up all of the work that I’m doing and trying to get you to do as well, which is to put the people at the center of the work that you’re doing and not the data science, engineering, tech, or design. I want you to put the people at the center.” (6:12)

“A data product is a data-driven, end-to-end, human-in-the-loop decision support solution that’s so valuable, users would potentially pay to use it.” (7:26)

“I want to plunge all the way in and say, ‘if you want to do this kind of work, then you need to be thinking the product-y way.’ And this means inherently letting go of some of the data science-y way of thinking and the data-first kinds of ways of thinking.” (11:46)

“I’ve read in a few places that data scientists don’t make for good data product managers. [While it may be true that they’re more introverted,] I don’t think that necessarily means that there’s an inherent problem with data scientists becoming good data product managers. I think the main challenge will be—and this is the same thing for almost any career transitioning into product management—is knowing when to let go of your former identity and wear the right hat at the right time.” (14:24)

“Make better things for people that will improve their life and their outcomes and the business value will follow if you’ve properly aligned those two things together.” (17:21)

“The big message here is this: there is always a design and experience, whether it is an API, or a platform, a dashboard, a full application, etc. Since there are no null design choices, how much are you going to intentionally shape that UX, or just pray that it comes out good on the other end? Prayer is not really a reliable strategy. If you want to routinely do this work right, you need to put intention behind it.” (22:33)

“Relationship building is a must, and this is where applying user experience research can be very useful—not just for users, but also with stakeholders. It’s learning how to ask really good questions and learning the feelings, emotions, and reasons why people ask your team to build the thing that they’ve asked for. Learning how to dig into that is really important.” (26:26)

Links
Designing for Analytics Community
Work With Me
Email
Record a question

Today I’m chatting with Indi Young, independent qualitative data scientist and author of Time to Listen. Indi explains how it is possible to gather and analyze qualitative data in a way that is meaningful to the desired future state of your users, and that learning how to listen and not just interview users is much like learning to ride a bicycle. Listen (!) to find out why pushing back is a necessary part of the design research process, how to build an internal sensor that allows you to truly uncover the nuggets of information that are critical to your projects, and the importance of understanding thought processes to prevent harmful outcomes.

Highlights/ Skip to:

Indi introduces her perspective on analyzing qualitative data sets (00:51)
Indi’s motivation for working in design research and the importance of being able to capture and understand patterns to prevent harmful outcomes (05:09)
The process Indi goes through for problem framing and understanding a user’s desired future state (11:11)
Indi explains how to listen effectively in order to understand the thinking style of potential end users (15:42)
Why Indi feels pushing back on problems within projects is a vital part of taking responsibility and her recommendations for doing so effectively (21:45)
The importance Indi sees in building up a sensor in order to be able to detect nuggets clients give you for their upcoming projects (28:25)
The difference in techniques Indi observes between an interview, a listening session, and a survey (33:13)
Indi describes her published books and reveals which one she’d recommend listeners start with (37:34)

Quotes from Today’s Episode

“A lot of qualitative data is not trusted, mainly because the people who are doing the not trusting have encountered bad qualitative data.” — Indi Young (03:23)

“When you’re learning to ride a bike, when you’re learning to decide what knowledge is needed, you’re probably going to burn through a bunch of money-making knowledge that never gets used. So, that’s when you start to learn, ‘I need to frame this better, and to frame it, I can’t do it by myself.’” – Indi Young (11:57)

“What you want to do is get beyond the exterior and get to the interior, which is where somebody tells you what actually went through their mind when they did that thing in the past, not what’s going through their mind right now. And that’s a very important distinction.” – Indi Young (20:28)

“Re: dealing with stakeholders: You’re not doing your job if you don’t push back. You built up a lot of experience, you got hired, they hired you and your thinking and your experience, and if what went through your mind is, like, ‘This is wrong,’ but you don’t act on it, then they should not pay you a salary.” – Indi Young (22:45)

“I’ve seen a lot of people leave their perfectly promising career because it was too hard to get to the point of accepting that you have to network, that I’m not going to be that one-in-a-million person who’s the brilliant person with a brilliant idea and get my just rewards that way.” – Indi Young (25:13)

“What’s really interesting about a listening session is that it doesn’t—aside from building this sensor and learning what the techniques are for helping a person get to their interior cognition rather than that exterior … to get past that into the inner thinking, the emotional reactions, and the guiding principles, aside from the sensor and those techniques, there’s not much to it.” – Indi Young (32:45)

“And once you start building that [sensor], and this idea of just having one generative question about the purpose—because the whole thing is framed by the purpose—there you go. Get started. You have to practice. So, it’s like riding a bike. Go for it. You won’t have those sensors at first, but you’ll start to learn how to build them.” – Indi Young (36:41)

Links Referenced:
Time to Listen: https://www.amazon.com/Time-Listen-Invention-Inclusion-Assumptions/dp/1944627111
Mental Models: https://www.amazon.com/Mental-Models-Aligning-Strategy-Behavior/dp/1933820063
Practical Empathy: https://www.amazon.com/Practical-Empathy-Collaboration-Creativity-Your/dp/1933820489
indiyoung.com: https://indiyoung.com
LinkedIn: https://www.linkedin.com/in/indiyoung/
Instagram: https://www.instagram.com/indiyoung_/

Today I’m chatting with Eugenio Zuccarelli, Research Scientist at MIT Media Lab and Manager of Data Science at CVS. Eugenio explains how he has created multiple algorithms designed to help shape decisions made in life-or-death situations, such as pediatric cardiac surgery and during the COVID-19 pandemic. Eugenio shares the lessons he’s learned about how to build trust in data when the stakes are life and death. Listen and learn how culture can affect adoption of decision support and ML tools, the impact the delivery of information has on the user’s ability to understand and use data, and why Eugenio feels that design is more important than the inner workings of ML algorithms.

Highlights/ Skip to:

Eugenio explains why he decided to work on machine learning models for cardiologists and healthcare workers involved in the COVID-19 pandemic (01:53)
The workflow surgeons would use when incorporating the predictive algorithm and application Eugenio helped develop (04:12)
The question Eugenio’s predictive algorithm helps surgeons answer when evaluating whether to use various pediatric cardiac surgical procedures (06:37)
The path Eugenio took to build trust with experienced surgeons and drive product adoption, and the role of UX (09:42)
Eugenio’s approach to identifying key problems and finding solutions using data (14:50)
How Eugenio has tracked value delivery and adoption success for a tool that relies on more than just accurate data & predictions, but also surgical skill and patient case complexity (22:26)
The design process Eugenio started early on to optimize user experience and adoption (28:40)
Eugenio’s key takeaways from a different project that helped government agencies predict what resources would be needed in which areas during the COVID-19 pandemic (34:45)

Quotes from Today’s Episode

“So many people today are developing machine-learning models, but I truly find the most difficult parts to be basically everything around machine learning … culture, people, stakeholders, products, and so on.” — Eugenio Zuccarelli (01:56)

“Developing machine-learning components, clean data, developing the machine-learning pipeline, those were the easy steps. The difficult ones who are gaining trust, as you said, developing something that was useful. And talking about trust, it’s especially tricky in the healthcare industry.” — Eugenio Zuccarelli (10:42)

“Because this tennis match, this ping-pong match between what can be done and what’s [the] problem [...] thankfully, we know, of course, it is not really the route to go. We don’t want to develop technology for the sake of it.” — Eugenio Zuccarelli (14:49)

“We put so much effort on the machine-learning side and then the user experience is so key, it’s probably even more important than the inner workings.” — Eugenio Zuccarelli (29:22)

“It was interesting to see exactly how the doctor is really focused on their job and doing it as well as they can, not really too interested in fancy [...] solutions, and so we were really able to not focus too much on appearance or fancy components, but more on usability and readability.” — Eugenio Zuccarelli (33:45)

“People’s ability to trust data, and how this varies from a lot of different entities, organizations, countries, [etc.] This really makes everything tricky. And of course, when you have a pandemic, this acts as a catalyst and enhances all of these cultural components.” — Eugenio Zuccarelli (35:59)

“I think [design success] boils down to delivery. You can package the same information in different ways [so that] it actually answers their questions in the ways that they’re familiar with.” — Eugenio Zuccarelli (37:42)

Links
LinkedIn: https://www.linkedin.com/in/jayzuccarelli
Twitter: twitter.com/jayzuccarelli
Personal website: https://eugeniozuccarelli.com
Medium: jayzuccarelli.medium.com

Today I’m chatting with Iván Herrero Bartolomé, Chief Data Officer at Grupo Intercorp. Iván describes how he was prompted to write his new article in CDO Magazine, “CDOs, Let’s Get Out of Our Comfort Zone” as he recognized the importance of driving cultural change within organizations in order to optimize the use of data. Listen in to find out how Iván is leveraging the role of the analytics translator to drive this cultural shift, as well as the challenges and benefits he sees data leaders encounter as they move from tactical to strategic objectives. Iván also reveals the number one piece of advice he’d give CDOs who are struggling with adoption. 

Highlights / Skip to:

Iván explains what prompted him to write his new article, “CDOs, Let’s Get Out of Our Comfort Zone” (01:08)
What Iván feels is necessary for data leaders to close the gap between data and the rest of the business, and why (03:44)
Iván dives into who he feels really owns delivery of value when taking on new data science and analytics projects (09:50)
How Iván’s team went from managing technical projects that often didn’t make it to production to working on strategic projects that almost always make it to production (13:06)
The framework Iván has developed to upskill technical and business roles to be effective data / analytics translators (16:32)
The challenge Iván sees data leaders face as they move from setting and measuring tactical goals to moving towards strategic goals and initiatives (24:12)
Iván explains how the C-Suite’s attitude impacts the cross-functional role of data & analytics leadership (28:55)
The number one piece of advice Iván would give new CDOs struggling with low adoption of their data products and solutions (31:45)

Quotes from Today’s Episode

“We’re going to do all our best to ensure that [...] everything that is expected from us is done in the best possible way. But that’s not going to be enough. We need a sponsorship and we need someone accountable for the project and someone who will be pushing and enabling the use of the solution once we are gone. Because we cannot stay forever in every company.” – Iván Herrero Bartolomé (10:52)

“We are trying to upskill people from the business to become data translators, but that’s going to take time. Especially what we try to do is to take product owners and give them a high-level immersion on the state-of-the-art and the possibilities that data analytics bring to the table. But as we can’t rely on our companies having this kind of talent and these data translators, they are one of the profiles that we bring in for every project that we work on.” – Iván Herrero Bartolomé (13:51)

“There’s a lot to do, not just between data and analytics and the other areas of the company, but aligning the incentives of all the organization towards the same goals in a way that there’s no friction between the goals of the different areas, the people, [...] and the final goals of the organization.” – Iván Herrero Bartolomé (23:13)

“Deciding which goals are you going to be co-responsible for, I think that is a sophisticated process that it’s not mastered by many companies nowadays. That probably is one of the main blockers keeping data analytics areas working far from their business counterparts.” – Iván Herrero Bartolomé (26:05)

“When the C-suite looks at data and analytics, if they think these are just technical skills, then the data analytics team are just going to behave as technical people. And many, many data analytics teams are set up as part of the IT organization. So, I think it all begins somehow with how the C-suite of our companies look at us.” – Iván Herrero Bartolomé (28:55)

“For me, [digital] means much more than the technical development of solutions; it should also be part of the transformation of the company, both in how companies develop relationships with their customers, but also inside how every process in the companies becomes more nimble and can react faster to the changes in the market.” – Iván Herrero Bartolomé (30:49)

“When you feel that everyone else [is] not doing what you think they should be doing, think twice about whether it is they who are not doing what they should be doing or if it’s something that you are not doing properly.” – Iván Herrero Bartolomé (31:45)

Links “CDOs, Let’s Get Out of Our Comfort Zone”: https://www.cdomagazine.tech/cdo_magazine/topics/opinion/cdos-lets-get-out-of-our-comfort-zone/article_dce87fce-2479-11ed-a0f4-03b95765b4dc.html LinkedIn: https://www.linkedin.com/in/ivan-herrero-bartolome/

Today I’m chatting with Katy Pusch, Senior Director of Product and Integration for Cox2M. Katy describes the lessons she’s learned around making sure that the “juice is always worth the squeeze” for new users to adopt data solutions into their workflow. She also explains the methodologies she’d recommend to data & analytics professionals to ensure their IoT and data products are widely adopted. Listen in to find out why this former analyst turned data product leader feels it’s crucial to focus on more than just delivering data or AI solutions, and how spending more time upfront performing qualitative research on users can wind up being more efficient in the long run than jumping straight into development.

Highlights/ Skip to:

What Katy does at Cox2M, and why the data product manager role is so hard to define (01:07)
Defining the value of the data in workflows and how that’s approached at Cox2M (03:13)
Who buys from Cox2M and the customer problems that Katy’s product solves (05:57)
How Katy approaches the zero-to-one process of taking IoT sensor data and turning it into a customer experience that provides a valuable solution (08:00)
What Katy feels best motivates the adoption of a new solution for users (13:21)
Katy describes how she spends more time upfront before development to ensure she’s solving the right problems for users (16:13)
Katy’s views on the importance of data science & analytics pros being able to communicate in the language of their audience (20:47)
The differences Katy sees between designing data products for sophisticated data users vs a broader audience (24:13)
The methods Katy uses to effectively perform qualitative research and her triangulation method to surface the real needs of end users (27:29)
Katy’s views on the most valuable skills for future data product managers (35:24)

Quotes from Today’s Episode

“I’ve had the opportunity to get a little bit closer to our customers than I was in the beginning parts of my tenure here at Cox2M. And it’s just like a SaaS product in the sense that the quality of your data is still dependent on your customers’ workflows and their ability to engage in workflows that supply accurate data. And it’s been a little bit enlightening to realize that the same is true for IoT.” – Katy Pusch (02:11)

“Providing insights to executives that are [simply] interesting is not really very impactful. You want to provide things that are actionable and that drive the business forward.” – Katy Pusch (4:43)

“So, there’s one side of it, which is [the] happy path: figure out a way to embed your product in the customer’s existing workflow. That’s where the most success happens. But in the situation we find ourselves in right now with [this IoT solution], we do have to ask them to change their workflow.” – Katy Pusch (12:46)

“And the way to communicate [the insight to other stakeholders] is not with being more precise with your numbers [or adding] statistics. It’s just to communicate the output of your analysis more clearly to the person who needs to be able to make a decision.” – Katy Pusch (23:15)

“You have to define ‘What decision is my user making on a repeated basis that is worth building something that it does automatically?’ And so, you say, ‘What are the questions that my user needs answers to on a repeated basis?’ … At its essence, you’re answering three or four questions for that user [that] have to be the most important [...] questions for your user to add value. And that can be a difficult thing to derive with confidence.” – Katy Pusch (25:55)

“The piece of workflow [on the IoT side] that’s really impactful there is we’re asking for an even higher degree of change management in that case because we’re asking them to attach this device to their vehicle, and then detach it at a different point in time and there’s a procedure in the solution to allow for that, but someone at the dealership has to engage in that process. So, there’s a change management in the workflow that the juice has to be worth the squeeze to encourage a customer to embark in that journey with you.” – Katy Pusch (12:08)

“Finding people in your organization who have the appetite to be cross-functionally educated, particularly in a data arena, is very important [to] help close some of those communication gaps.” – Katy Pusch (37:03)

Today I’m chatting with Vin Vashishta, Founder of V Squared. Vin believes that with methodical strategic planning, companies can prepare for continuous transformation by removing the silos that exist between leadership, data, AI, and product teams. How can these barriers be overcome, and what is the impact of doing so? Vin answers those questions and more, explaining why process disruption is necessary for long-term success and gives real-world examples of companies who are adopting these strategies.

Highlights/ Skip to:

What the AI ‘Last Mile’ Problem is (03:09) Why Vin sees so many businesses reevaluating their offerings and realigning with their core business model (09:01) Why every company today is struggling to figure out how to bridge the gap between data, product, and business value (14:25) How the skillsets needed for success are evolving for data, product, and business leaders (14:40) Vin’s process when he’s helping a team with a data strategy, and what the end result looks like (21:53) Why digital transformation is dead, and how to reframe what business transformation means today (25:03) How Airbnb used data to inform their overall strategy to survive during a time of massive industry disruption, and how those strategies can be used by others as a preventative measure (29:03) Unpacking how a data strategy leader can work backward from a high-level business strategy to determining actionable steps and use cases for ML and analytics (32:52) Who (what roles) are ultimately responsible in an ideal strategy planning session? (34:41) How the C-Suite can bridge business & data strategy and the impact the world’s largest companies are seeing as a result (36:01)

Quotes from Today’s Episode “And when you have that [core business & technology strategy] disconnect, technology goes in one direction, what the business needs and what customers need sort of lives outside of the silo.” – Vin Vashishta (06:06)

“Why are we doing data and not just traditional software development? Why are we doing data science and not analytics? There has to be a justification because each one of these is more expensive than the last, each one is, you know, less certain.” – Vin Vashishta (10:36)

“[The right people to train] are smart about the technology, but have also lived with the users, have some domain expertise, and the interest in making a bigger impact. Let’s put them in strategy roles.” – Vin Vashishta (18:58)

“You know, this is never going to end. Transformation is continuous. I don’t call it digital transformation anymore because that’s making you think that this thing is somehow a once-in-a-generation change. It’s not. It’s once every five years now.” – Vin Vashishta (25:03)

“When do you want to have those [business] opportunities done by? When do you want to have those objectives completed by? Well, then that tells you how fast you have to transform if you want to use each one of these different technologies.” – Vin Vashishta (25:37)

“You’ve got to disrupt the process. Strategy planning is not the same anymore. Look at how Amazon does it. ... They are destroying their competitors because their strategy planning process is both expert and data model-driven.” – Vin Vashishta (33:44)

“And one of the critical things for CDOs to do is tell stories with data to the board. When they sit in and talk to the board, they need to tell those stories about how one data point hit this one use case and the company made $4 million.” – Vin Vashishta (39:33)

Links HumblePod: https://humblepod.com V Squared: https://datascience.vin LinkedIn: https://www.linkedin.com/in/vineetvashishta/ Twitter: https://twitter.com/v_vashishta YouTube channel: https://www.youtube.com/c/TheHighROIDataScientist Substack: https://vinvashishta.substack.com/

Today I’m sitting down with Jon Cooke, founder and CTO of Dataception, to learn his definition of a data product and his views on generating business value with your data products. In our conversation, Jon explains his philosophy on data products and where design and UX fit in. We also review his conceptual model for data products (which he calls the data product pyramid), and discuss how, together, these concepts allow teams to more quickly ship working solutions that actually produce value.

Highlights/ Skip to:

Jon’s definition of a data product (1:19) Brian explains how UX research and design planning can and should influence data architecture—so that last mile solutions are useful and usable (9:47) The four characteristics of a data product in Jon’s model (16:16) The idea of products having a lifecycle with direct business/customer interaction/feedback (17:15) Understanding Jon’s data product pyramid (19:30) The challenges when customers/users don’t know what they want from data product teams - and who should be doing the work to surface requirements (24:44) Mitigating risk and the importance of having management buy-in when adopting a product-driven approach (33:23) Does the data product pyramid account for UX? (35:02) What needs to change in an org model that produces data products that aren’t delivering good last mile UXs (39:20)

Quotes from Today’s Episode “A data product is something that specifically solves a business problem, a piece of analytics, data use case, a pipeline, datasets, dashboard, that type that solves a business use case, and has a customer, and [has] a product lifecycle to it.” - Jon (2:15)

“I’m a fan of any definition that includes some type of deployment and use by some human being. That’s the end of the cycle, because the idea of a product is a good that has been made, theoretically, for sale.” - Brian (5:50)

“We don’t build a lot of stuff around cloud anymore. We just don’t build it from scratch. It’s like, you know, we don’t generate our own electricity, we don’t mill our own flour. You know, the cloud—there’s a bunch of composable services, which I basically pull together to build my application, whatever it is. We need to apply that thinking all the way through the stack, fundamentally.” - Jon (13:06)

“It’s not a data science problem, it’s not a business problem, it’s not a technology problem, it’s not a data engineering problem, it’s an everyone problem. And I advocate small, multidisciplinary teams, which have a business value person in it, have an SME, have a data scientist, have a data architect, have a data engineer, as a small pod that goes in and answer those questions.” - Jon (26:28)

“The idea is that you’re actually building the data products, which are the back-end, but you’re actually then also doing UX alongside that, you know? You’re doing it in tandem.” - Jon (37:36)

“Feasibility is one of the legs of the stools. There has to be market need, and your market just may be the sales team, but there needs to be some promise of value there that this person is really responsible for at the end of the day, is this data product going to create value or not?” - Brian (42:35)

“The thing about data products is sometimes you don’t know how feasible it is until you actually look at the data…You’ve got to do what we call data archaeology. You got to go and find the data, you got to brush it off, and you’re looking at and go, ‘Is it complete?’” - Jon (44:02)

Links Referenced: Dataception Data Product Pyramid Email: [email protected] LinkedIn: https://www.linkedin.com/in/jon-cooke-096bb0/

Today I’m chatting with Emilie Schario, a Data Strategist in Residence at Amplify Partners. Emilie thinks data teams should operate like product teams. But what led her to that conclusion, and how has she put the idea into practice? Emilie answers those questions and more, delving into what kind of pushback and hiccups someone can expect when switching from being data-driven to product-driven and sharing advice for data scientists and analytics leaders.

Highlights / Skip to:

Answering the question “whose job is it” (5:18) Understanding and solving problems instead of just building features people ask for (9:05) Emilie explains what Amplify Partners is and talks about her work experience and how it fuels her perspectives on data teams (11:04) Emilie and I talk about the definition of data product (13:00) Emilie talks about her approach to building and training a data team (14:40) We talk about UX designers and how they fit into Emilie’s data teams (18:40) Emilie talks about the book and blog “Storytelling with Data” (21:00) We discuss the pushback you can expect when trying to switch a team from being data-driven to being product-driven (23:18) What hiccups can people expect when switching to a product-driven model (30:36) Emilie’s advice for data scientists and analytics leaders (35:50) Emilie explains what Locally Optimistic is (37:34)

Quotes from Today’s Episode “Our thesis is…we need to understand the problems we’re solving before we start building solutions, instead of just building the things people are asking for.” — Emilie (2:23)

“I’ve seen this approach of flipping the ask on its head—understanding the problem you’re trying to solve—work and be more successful at helping drive impact instead of just letting your data team fall into this widget builder service trap.” — Emilie (4:43)

“If your answer to any problem to me is, ‘That’s not my job,’ then I don’t want you working for me because that’s not what we’re here for. Your job is whatever the problem in front of you that needs to be solved.” — Emilie (7:14)

“I don’t care if you have all of the data in the world and the most talented machine learning engineers and you’ve got the ability to do the coolest new algorithm fancy thing. If it doesn’t drive business impact, it doesn’t matter.” — Emilie (7:52)

“Data is not just a thing that anyone can do. It’s not just about throwing numbers in a spreadsheet anymore. It’s about driving business impact. But part of how we drive business impact with data is making it accessible. And accessible isn’t just giving people the numbers, it’s also communicating with it effectively, and UX is a huge piece of how we do that.” — Emilie (19:57)

“There are no null choices in design. Someone is deciding what some other human—a customer, a client, an internal stakeholder—is going to use, whether it’s a React app, or a Power BI dashboard, or a spreadsheet dump, or whatever it is, right? There will be an experience that is created, whether it is intentionally created or not.” — Brian (20:28)

“People will think design is just putting in colors that match together, like, or spinning the color wheel and seeing what lands. You know, there’s so much more to it. And it is an expertise; it is a domain that you have to develop.” — Emilie (34:58)

Links Referenced: Blog post by Rifat Majumder storytellingwithdata.com Experiencing Data Episode 28 with Cole Nussbaumer Knaflic locallyoptimistic.com Twitter: @emilieschario

Today, I chat with Manav Misra, Chief Data and Analytics Officer at Regions Bank. I begin by asking Manav what it was like to come in and implement a user-focused mentality at Regions, driven by his experience in the software industry. Manav details his approach, which included developing a new data product partner role and using effective communication to gradually gain trust and cooperation from all the players on his team. 

Manav then talks about how, over time, he solidified a formal framework for his team to be trained to use this approach and how his hiring is influenced by a product orientation. We also discuss his definition of data product at Regions, which I find to be one of the best I’ve heard to date. Today, Regions Bank’s data products are delivering tens of millions of dollars in additional revenue to the bank. Given those results, I also dig into the role of design and designers to better understand who is actually doing the designing of Regions’ data products to make them so successful. Later, I ask Manav what it’s like when designers and data professionals work on the same team and how UX and data visualization design are handled at the bank.

Towards the end, Manav shares what he has learned from his time at Regions and what he would implement in a new organization if starting over. He also expounds on the importance of empowering his team to ask customers the right questions and how a true client/stakeholder partnership has led to Manav’s most successful data products.

Highlights / Skip to:

Brief history of decision science and how it influenced the way data science and analytics work has been done (and unfortunately still is in many orgs) (1:47) Manav’s philosophy and methods for changing the data science culture at Regions Bank to being product and user-driven (5:19) Manav talks about the size of his team and the data product role within the team as well as what he had to do to convince leadership to buy in to the necessity of the data product partner role (10:54) Quantifying and measuring the value of data products at Regions and some of his results (which include tens of millions of dollars in additional revenue) (13:05) What’s a “data product” at Regions? Manav shares his definition (13:44) Who does the designing of data products at Regions? (17:00) The challenges and benefits of having a team comprised of both designers and data scientists (20:10) Lessons Manav has learned from building his team and culture at Regions (23:09) How Manav coaches his team and gives them the confidence to ask the right questions (27:17) How true partnership has led to Manav’s most successful data products (31:46)

Quotes from Today’s Episode Re: how traditional, non-product oriented enterprises do data work: “As younger people come out of data science programs…that [old] culture is changing. The folks coming into this world now are looking to make an impact and then they want to see what this can do in the real world.” — Manav 

On the role of the Data Product Partner: “We brought in people that had both business knowledge as well as the technical knowledge, so with a combination of both they could talk to the ‘Internal customers,’ of our data products, but they could also talk to the data scientists and our developers and communicate in both directions in order to form that bridge between the two.” — Manav

“There are products that are delivering tens of millions of dollars in terms of additional revenue, or stopping fraud, or any of those kinds of things that the products are designed to address, they’re delivering and over-delivering on the business cases that we created.” — Manav 

“The way we define a data product is this: an end-to-end software solution to a problem that the business has. It leverages data and advanced analytics heavily in order to deliver that solution.” — Manav 

“The deployment and operationalization is simply part of the solution. They are not something that we do after; they’re something that we design in from the start of the solution.” — Brian 

“Design is a team sport. And even if you don’t have a titled designer doing the work, if someone is going to use the solution that you made, whether it’s a dashboard, or report, or an email, or notification, or an application, or whatever, there is a design, whether you put intention behind it or not.” — Brian

“As you look at interactive components in your data product, which are, you know, allowing people to ask questions and then get answers, you really have to think through what that interaction will look like, what’s the best way for them to get to the right answers and be able to use that in their decision-making.” — Manav 

“I have really instilled in my team that tools will come and go, technologies will come and go, [and so] you’ll have to have that mindset of constantly learning new things, being able to adapt and take on new ideas and incorporate them in how we do things.” — Manav

Links Regions Bank: https://www.regions.com/ LinkedIn: https://www.linkedin.com/in/manavmisra/

Today I chat with Chad Sanderson, Head of Product for Convoy’s data platform. I begin by having Chad explain why he calls himself a “data UX champion” and what inspired his interest in UX. Coming from a non-UX background, Chad explains how he came to develop a strategy for addressing the UX pain points at Convoy—a digital freight network. They “use technology to make freight more efficient, reducing costs for some of the nation’s largest brands, increasing earnings for carriers, and eliminating carbon emissions from our planet.” We also get into the metrics of success that Convoy uses to measure UX and why Chad is so heavily focused on user workflow when making the platform user-centered.

Later, Chad shares his definition of a data product, and how his experience with building software products has overlapped with data products. He also shares what he thinks is different about creating data products vs. traditional software products. Chad then explains Convoy’s approach to prototyping and the value of partnering with users in the design process. We wrap up by discussing how UX work gets accomplished on Chad’s team, given it doesn’t include any titled UX professionals. 

Highlights:

Chad explains how he became a data UX champion and what prompted him to care about UX (1:23) Chad talks about his strategy for beginning to address the UX issues at Convoy (4:42) How Convoy measures UX improvement (9:19) Chad talks about troubleshooting user workflows and its relevance to design (15:28) Chad explains what Convoy is and the makeup of his data platform team (21:00) What is a data product? Chad gives his definition and the similarities and differences between building software versus data products (23:21) Chad talks about using low fidelity work and prototypes to optimize solutions and resources in the long run (27:49) We talk about the value of partnering with users in the design process (30:37) Chad talks about the distribution of UX labor on his team (32:15)

Quotes from Today’s Episode  

Re: user research: "The best content that you get from people is when they are really thinking about what to say next; you sort of get into a free-flowing exchange of ideas. So it’s important to find the topic where someone can just talk at length without really filtering themselves. And I find a good place to start with that is to just talk about their problems. What are the painful things that you’ve experienced in data in the last month or in the last week?" - Chad 

Re: UX research: "I often recommend asking users to show you something they were working on recently, particularly when they were having a problem accomplishing their goal. It’s a really good way to surface UX issues because the frustration is probably fresh." - Brian

Re: user feedback, “One of the really great pieces of advice that I got is, if you’re getting a lot of negative feedback, this is actually a sign that people care. And if people care about what you’ve built, then it’s better than overbuilding from the beginning.” - Chad

“What we found [in our research around workflow], though, sometimes counterintuitively, is that the steps that are the easiest and simplest for a customer to do that I think most people would look at and say, ‘Okay, it’s pretty low ROI to invest in some automated solution or a product in this space,’ are sometimes the most important things that you can [address in your data product] because of the impacts that it has downstream.” - Chad 

Re: user feedback, “The amazing thing about building data products, and I guess any internal products is that 100% of your customers sit ten feet away from you. [...] When you can talk to 100% of [your users], you are truly going to understand [...] every single persona. And that is tremendously effective for creating compelling narratives about why we need to build a particular thing.” - Chad 

“If we can get people to really believe that this data product is going to solve the problem, then usually, we like to turn those people into advocates and evangelists within the company, and part of their job is to go out and convince other people about why this thing can solve the problem.” - Chad 

Links: Convoy: https://convoy.com/ Chad on LinkedIn: https://www.linkedin.com/in/chad-sanderson/ Chad’s Data Products newsletter: https://dataproducts.substack.com

Today I am bringing you a recording of a live interview I did at the TDWI Munich conference for data leaders, and this episode is a bit unique as I’m in the “guest” seat being interviewed by the VP of TDWI Europe, Christoph Kreutz. 

Christoph wanted me to explain the new workshop I was giving later that day, which focuses on helping leaders increase user adoption of data products through design. In our chat, I explained the three main areas I pulled out of my full 4-week seminar to create this new half-day workshop as well as the hands-on practice that participants would be engaging in. The three focal points for the workshop were: measuring usability via usability studies, identifying the unarticulated needs of stakeholders and users, and sketching in low fidelity to avoid overcommitting to solutions that users won’t value.

Christoph also asks about the format of the workshop, and I explain how I believe data leaders will best learn design by doing it. As such, the new workshop was designed to use small group activities, role-playing scenarios, peer review…and minimal lecture! After discussing the differences between the abbreviated workshop and my full 4-week seminar, we talk about my consulting and training business “Designing for Analytics,” and conclude with a fun conversation about music and my other career as a professional musician. 

In a hurry? Skip to: 

I summarize the new workshop version of “Designing Human-Centered Data Products” I was premiering at TDWI (4:18) We talk about the format of my workshop (7:32) Christoph and I discuss future opportunities for people to participate in this workshop (9:37) I explain the format of the main 8-week seminar versus the new half-day workshop (10:14) We talk about one-on-one coaching (12:22) I discuss my background, including my formal music training and my other career as a professional musician (14:03)

Quotes from Today’s Episode “We spend a lot of time building outputs and infrastructure and pipelines and data engineering and generating stuff, but not always generating outcomes. Users only care about how does this make my life better, my job better, my job easier? How do I look better? How do I get a promotion? How do I make the company more money? Whatever those goals are. And there’s a gap there sometimes, between the things that we ship and delivering these outcomes.” (4:36)

“In order to run a usability study on a data product, you have to come up with some type of learning goals and some kind of scenarios that you’re going to give to a user and ask them to go show me how you would do x using the data thing that we built for you.” (5:54)

“The reality is most data users and stakeholders aren’t designers and they’re not thinking about the user’s workflow and how a solution fits into their job. They don’t have that context. So, how do we get the really important requirements out of a user or stakeholder’s head? I teach techniques from qualitative UX interviewing, sales, and even hostage negotiation to get unarticulated needs out of people’s heads.” (6:41)

“How do we work in low fidelity to get data leaders on the same page with a stakeholder or a user? How do we design with users instead of for them? Because most of the time, when we communicate visually, it starts to click (or you’ll know it’s not clicking!)” (7:05)

“There’s no right or wrong [in the workshop]. [The workshop] is really about the practice of using these design methods and not the final output that comes out of the end of it.” (8:14)

“You learn design by doing design, so I really like to get data people going by trying it instead of talking about trying it. More design doing and less design thinking!” (8:40)

“The tricky thing [for most of my training clients], [and perhaps this is true with any type of adult education] is, ‘Yeah, I get the concept of what Brian’s talking about, but how do I apply these design techniques to my situation? I work in this really weird domain, or on this particularly hard data space.’ Working on an exercise or real project, together, in small groups, is how I like to start to make the conceptual idea of design into a tangible tool for data leaders.” (12:26)

Links Brian’s training seminar

Today I sit down with Vijay Yadav, head of the data science team at Merck Manufacturing Division. Vijay begins by relating his own path to adopting a data product and UX-driven approach to applied data science, and our chat quickly turns to the ever-present challenge of user adoption. Vijay discusses his process of designing data products with customers, as well as the impact that building user trust has on delivering business value. We go on to talk about what metrics can be used to quantify adoption and downstream value, and then Vijay discusses the financial impact he has seen at Merck using this user-oriented perspective. While we didn’t see eye to eye on everything, Vijay was able to show how focusing on the last mile UX has had a multi-million dollar impact on Merck. The conversation concludes with Vijay’s words of advice for other data science directors looking to get started with a design and user-centered approach to building data products that achieve adoption and have measurable impact.

In our chat, we covered Vijay’s design process, metrics, business value, and more: 

Vijay shares how he came to approach data science with a data product management approach and how UX fits in (1:52) We discuss overcoming the challenge of user adoption by understanding user thinking and behavior (6:00) We talk about the potential problems and solutions when users self-diagnose their technology needs (10:23) Vijay delves into what his process of designing with a customer looks like (17:36) We discuss the impact “solving on the human level” has on delivering real world benefits and building user trust (21:57) Vijay talks about measuring user adoption and quantifying downstream value—and Brian discusses his concerns about tool usage metrics as means of doing this (25:35) Brian and Vijay discuss the multi-million dollar financial and business impact Vijay has seen at Merck using a more UX-driven approach to data product development (31:45) Vijay shares insight on what steps a head of data science might wish to take to get started implementing a data product and UX approach to creating ML and analytics applications that actually get used (36:46)

Quotes from Today’s Episode “They will adopt your solution if you are giving them everything they need so they don’t have to go look for a workaround.” - Vijay (4:22)

“It’s really important that you not only capture the requirements, you capture the thinking of the user, how the user will behave if they see a certain way, how they will navigate, things of that nature.” - Vijay (7:48)

“When you’re developing a data product, you want to be making sure that you’re taking the holistic view of the problem that can be solved, and the different group of people that we need to address. And, you engage them, right?” - Vijay (8:52)

“When you’re designing in low fidelity, it allows you to design with users because you don’t spend all this time building the wrong thing upfront, at which point it’s really expensive in time and money to go and change it.” - Brian (17:11)

"People are the ones who make things happen, right? You have all the technology, everything else looks good, you have the data, but the people are the ones who are going to make things happen.” - Vijay (38:47)

“You want to make sure that you [have] a strong team and motivated team to deliver. And the human spirit is something, you cannot believe how stretchable it is. If the people are motivated, [and even if] you have less resources and less technology, they will still achieve [your goals].” - Vijay (42:41)

“You’re trying to minimize any type of imposition on [the user], and make it obvious why your data product is better—without disruption. That’s really the key to the adoption piece: showing how it is going to be better for them in a way they can feel and perceive. Because if they don’t feel it, then it’s just another hoop to jump through, right?” - Brian (43:56)

Resources and Links: LinkedIn: https://www.linkedin.com/in/vijyadav/

In one of my past memos to my list subscribers, I addressed some questions about agile and data products. Today, I expound on each of these and share some observations from my consulting work. In some enterprise orgs, mostly outside of the software industry, agile is still new and perceived as a panacea. In reality, it can just become a factory for shipping features and outputs faster–with positive outcomes and business value being mostly absent. To increase the adoption of enterprise data products that have humans in the loop, it’s great to have agility in mind, but poor technology shipped faster isn’t going to serve your customers any better than what you’re doing now.

Here are the 10 reflections I’ll dive into on this episode: 

You can't project manage your way out of a [data] product problem. 

The more you try to deploy agile at scale, take the trainings, and hire special "agilists", the more you're going to tend to measure success by how well you followed the Agile process.

Agile is great for software engineering, but nobody really wants "software engineering" given to them. They do care about the perceived reality of your data product.

Run from anyone that tells you that you shouldn't ever do any design, user research, or UX work "up front" because "that is waterfall." 

Everybody else is also doing modified scrum (or modified _).

Marty Cagan talks about this a lot, but in short: while the PM (product managers) may own the backlog and priorities, what’s more important is that these PMs “own the problem” space as opposed to owning features or being solution-centered. 

Before Agile can thrive, you will need strong senior leadership buy-in if you're going to do outcome-driven data product work.

There's a huge promise in the word "agile." You've been warned. 

If you don't have a plan for how you'll do discovery work, define clear problem sets and success metrics, and understand customers' feelings, pains, needs, wants, and the like, Agile won't deliver much improvement for data products (probably).

Getting comfortable with shipping half-right, half-quality, half-done is hard. 

Quotes from Today’s Episode  “You can get lost in following the process and thinking that as long as we do that, we’re going to end up with a great data product at the end.” - Brian (3:16) “The other way to define clear success criteria for data products and hold yourself accountable to those on the user and business side is to really understand what does a positive outcome look like? How would we measure it?” - Brian (5:26) “The most important thing is to know that the user experience is the perceived reality of the technology that you built. Their experience is the only reality that matters.” - Brian (9:22) “Do the right amount of planning work upfront, have a strategy in place, make sure the team understands it collectively, and then you can do the engineering using agile.” - Brian (18:15) “If you don’t have a plan for how you’ll do discovery work, defining clear problem sets and success metrics, and understanding customers’ feelings, pains, needs, wants, and all of that, then agile will not deliver increased adoption of your data products.” - Brian (36:07)

Links: designingforanalytics.com: https://designingforanalytics.com designingforanalytics.com/list: https://designingforanalytics.com/list

Today I’m talking about how to measure data product value from a user experience and business lens, and where leaders sometimes get it wrong. Today’s first question came from an attendee at my recent talk at the Data Summit conference, who asked how UX design fits into agile data product development. Additionally, I recently had a subscriber to my Insights mailing list ask how to measure adoption, utilization, and satisfaction of data products. So, we’ll jump into that juicy topic as well.

Answering these inquiries also got me on a related tangent about the UX challenges associated with abstracting your platform to support multiple, but often theoretical, user needs—and the importance of collaboration to ensure your whole team is operating from the same set of assumptions or definitions about success. I conclude the episode with the concept of “game framing” as a way to conceptualize these ideas at a high level. 

Key topics and cues in this episode include: 

An overview of the questions I received (0:45) Measuring change once you’ve established a benchmark (7:45) The challenges of working in abstractions (abstracting your platform to facilitate theoretical future user needs) (10:48) The value of having shared definitions and understanding the needs of different stakeholders/users/customers (14:36) The importance of starting from the “last mile” (19:59) The difference between success metrics and progress metrics (24:31) How measuring feelings can be critical to measuring success (29:27) “Game framing” as a way to understand tracking progress and success (31:22)

Quotes from Today’s Episode “Once you’ve got your benchmark in place for a data product, it’s going to be much easier to measure what the change is because you’ll know where you’re starting from.” - Brian (7:45)

“When you’re deploying technology that’s supposed to improve people’s lives so that you can get some promise of business value downstream, this is not a generic exercise. You have to go out and do the work to understand the status quo and what the pain is right now from the user's perspective.” - Brian (8:46)

“That user perspective—perception even—is all that matters if you want to get to business value. The user experience is the perceived quality, usability, and utility of the data product.” - Brian (13:07)

“A data product leader’s job should be to own the problem and not just the delivery of data product features, applications, or technology outputs.” - Brian (26:13)

“What are we keeping score of? Different stakeholders are playing different games so it’s really important for the data product team not to impose their scoring system (definition of success) onto the customers, or the users, or the stakeholders.” - Brian (32:05)

“We always want to abstract once we have a really good understanding of what people do, as it’s easier to create more user-centered abstractions that will actually answer real data questions later on.” - Brian (33:34)

Links https://designingforanalytics.com/community

Today I talked with João Critis from Oi. Oi is a Brazilian telecommunications company that is a pioneer in convergent broadband services, pay TV, and local and long-distance voice transmission. They operate the largest fiber optics network in Brazil which reaches remote areas to promote digital inclusion of the population. João manages a design team at Oi that is responsible for the front end of data products including dashboards, reports, and all things data visualization. 

We begin by discussing João’s role leading a team of data designers. João then explains what data products actually are, and who makes up his team’s users and customers. João goes on to discuss user adoption challenges at Oi and the methods they use to uncover what users need in the last mile. He then explains the specific challenges his team has faced, particularly with middle management, and how his team builds credibility with senior leadership. In conclusion, João reflects on the value of empathy in the design process. 

In this episode, João shares:  

His definition of a data product (4:48) The research process used by his data teams to build journey maps for clients (7:31) User adoption challenges for Oi (15:27) His answer to the question “how do you decide which mouths to feed?” (16:56) The unique challenges of middle management in delivering useful data products (20:33) The importance of empathy in innovation (25:23) What data scientists need to learn about design and vice versa (27:55)

Quotes from Today’s Episode

“We put the final user in the center of our process. We [conduct] workshops involving co-creation and prototyping, and we test how people work with data.” - João (8:22)

“My first responsibility here is value generation. So, if you have to take two or three steps back, another brainstorm, rethink, and rebuild something that works…. [well], this is very common for us.” - João (19:28)

“If you don’t make an impact on the individuals, you’re not going to make an impact on the business. Because as you said, if they don’t use any of the outputs we make, then they really aren’t solutions and no value is created.” - Brian (25:07)

“It’s really important to do what we call primary research where you’re directly interfacing as much as possible with the horse’s mouth, no third parties, no second parties. You’ve really got to develop that empathy.” - Brian (25:23)

“When we are designing some system or screen or other digital artifact, [we have to understand] this is not only digital, but a product. We have to understand people, how people interact with systems, with computers, and how people interact with visual presentations.” - João (28:16)

Links Oi: https://www.oi.com.br/ LinkedIn: https://www.linkedin.com/in/critis/ Instagram: https://www.instagram.com/critis/

Michelle Carney began her career in the worlds of neuroscience and machine learning where she worked on the original Python Notebooks. As she fine-tuned ML models and started to notice discrepancies in the human experience of using these models, her interest turned towards UX. Michelle discusses how her work today as a UX researcher at Google impacts her work with teams leveraging ML in their applications. She explains how her interest in the crossover of ML and UX led her to start MLUX, a collection of meet-up events where professionals from both data science and design can connect and share methods and ideas. MLUX now hosts meet-ups in several locations as well as virtually. 

Our conversation begins with Michelle’s explanation of how she teaches data scientists to integrate UX into the development of their products. As a teacher, Michelle utilizes the IDEO Design Kit with her students at the Stanford School of Design (d.school). In her course, Designing Machine Learning, she shares some of the unlearning that data scientists need to do when trying to approach their work with a UX perspective.

Finally, we also discussed what UX designers need to know about designing for ML/AI. Michelle also talks about how model interpretability is a facet of UX design and why model accuracy isn’t always the most important element of a ML application. Michelle ends the conversation with an emphasis on the need for more interdisciplinary voices in the fields of ML and AI. 

Skip to a topic here:

Michelle talks about what drove her career shift from machine learning and neuroscience to user experience (1:15) Michelle explains what MLUX is (4:40) How to get ML teams on board with the importance of user experience (6:54) Michelle discusses the “unlearning” data scientists might have to do as they reconsider ML from a UX perspective (9:15) Brian and Michelle talk about the importance of considering the UX from the beginning of model development (10:45) Michelle expounds on different ways to measure the effectiveness of user experience (15:10) Brian and Michelle talk about what is driving the increase in the need for designers on ML teams (19:59) Michelle explains the role of design around model interpretability and explainability (24:44)

Quotes from Today’s Episode “The first step to business value is the hurdle of adoption. A user has to be willing to try—and care—before you ever will get to business value.” - Brian O’Neill (13:01)

“There’s so much talk about business value and there’s very little talk about adoption. I think providing value to the end-user is the gateway to getting any business value. If you’re building anything that has a human in the loop that’s not fully automated, you can’t get to business value if you don’t get through the first gate of adoption.” - Brian O’Neill (13:17)

“I think that designers who are able to design for ambiguity are going to be the ones that tackle a lot of this AI and ML stuff.” - Michelle Carney (19:43)

“That’s something that we have to think about with our ML models. We’re coming into this user’s life where there’s a lot of other things going on and our model is not their top priority, so we should design it so that it fits into their ecosystem.” - Michelle Carney (3:27)

“If we aren’t thinking about privacy and ethics and explainability and usability from the beginning, then it’s not going to be embedded into our products. If we just treat usability of our ML models as a checkbox, then it just plays the role of a compliance function.” - Michelle Carney (11:52)

“I don’t think you need to know ML or machine learning in order to design for ML and machine learning. You don’t need to understand how to build a model, you need to understand what the model does. You need to understand what the inputs and the outputs are.” - Michelle Carney (18:45)

Links Twitter @mluxmeetup: https://twitter.com/mluxmeetup MLUX LinkedIn: https://www.linkedin.com/company/mlux/ MLUX YouTube channel: https://bit.ly/mluxyoutube Twitter @michelleRcarney: https://twitter.com/michelleRcarney IDEO Design Kit - https://tinyurl.com/2p984znh 

Dashboards are at the forefront of today’s episode, so I will be responding to some questions from readers who wrote in to one of my weekly mailing list missives about this topic. I’ve not talked much about dashboards despite their frequent appearance in data product UIs, and in this episode, I’ll explain why. Here are some of the key points and the original questions asked in this episode:

My introduction to dashboards (00:00) Some overall thoughts on dashboards (02:50) What the risk is to the user if the insights are wrong or misinterpreted (4:56) Your data outputs create an experience, whether intentional or not (07:13) John asks: How do we figure out exactly what the jobs are that the dashboard user is trying to do? Are they building next year's budget or looking for broken widgets? What does this user value today? Is a low resource utilization percentage something to be celebrated or avoided for this dashboard user today? (13:05) Value is not intrinsically in the dashboard (18:47) Mareike asks: How do we provide information in a way that people are able to act upon the presented information? How do we translate the presented information into action? What can we learn about user expectation management when designing dashboard/analytics solutions? (22:00) The change towards predictive and prescriptive analytics (24:30) The upfront work that needs to get done before the technology is in front of the user (30:20) James asks: How can we get people to focus less on the assumption-laden and often restrictive term "dashboard", and instead worry about designing solutions focused on outcomes for particular personas and workflows that happen to have some or all of the typical ingredients associated with the catch-all term "dashboards?” (33:30) Stop measuring the creation of outputs and focus on the user workflows and the jobs to be done (37:00) The data product manager shouldn’t just be focused on deliverables (42:28)

Quotes from Today’s Episode “The term ‘dashboards’ is almost meaningless today; it seems to mean almost any home default screen in a data product. It can also just mean a report. For others, it means an entire monitoring tool; for some, it means the summary of a bunch of data that lives in some other reports. The terms are all over the place.” - Brian (@rhythmspice) (01:36)

“The big idea here that I really want leaders to be thinking about is you need to get your teams focused on workflows—sometimes called jobs to be done—and the downstream decisions that users want to make with machine-learning or analytical insights.” - Brian (@rhythmspice) (06:12)

“This idea of human-centered design and user experience is really about trying to fit the technology into their world, from their perspective, as opposed to building something in isolation where we then try to get them to adopt our thing. This may be out of phase with the way people like to do their work and may lead to a much higher barrier to adoption.” - Brian (@rhythmspice) (14:30)

“Leaders who want their data science and analytics efforts to show value really need to understand that value is not intrinsically in the dashboard or the model or the engineering or the analysis.” - Brian (@rhythmspice) (18:45)

“There's a whole bunch of plumbing that needs to be done, and it’s really difficult. The tool that we end up generating in those situations tends to be a tool that’s modeled around the data and not modeled around [the customer’s] mental model of this space, the customer purchase space, the marketing spend space, the sales conversion, or propensity-to-buy space.” - Brian (@rhythmspice) (27:48)

“Data product managers should be these problem owners, if there has to be a single entity for this. When we’re talking about different initiatives in the enterprise or for a commercial software company, it really sits with this product management function.” - Brian (@rhythmspice) (34:42)

“It’s really important that [data product managers] are not just focused on deliverables; they need to really be the ones that summarize the problem space for the entire team, and help define a strategy with the entire team that clarifies the direction the team is going in. They are not a project manager; they are someone responsible for delivering value.” - Brian (@rhythmspice) (42:23)

Links Referenced:

Mailing List: https://designingforanalytics.com/list CED UX Framework for Advanced Analytics: Original Article: https://designingforanalytics.com/ced Podcast/Audio Episode: https://designingforanalytics.com/resources/episodes/086-ced-my-ux-framework-for-designing-analytics-tools-that-drive-decision-making/ 

My LinkedIn Live about Measuring the Usability of Data Products: https://www.linkedin.com/video/event/urn:li:ugcPost:6911800738209800192/ Work With Me / My Services: https://designingforanalytics.com/services

Mike Oren, Head of Design Research at Klaviyo, joins today’s episode to discuss how we do UX research for data products—and why qualitative research matters. Mike and I recently met in Lou Rosenfeld’s Quant vs. Qual group, which is for people interested in both qualitative and quantitative methods for conducting user research. Mike goes into the details on how Klaviyo and his teams are identifying what customers need through research, how they use data to get to that point, what data scientists and non-UX professionals need to know about conducting UX research, and some tips for getting started quickly. He also explains how Klaviyo’s data scientists—not just the UX team—are directly involved in talking to users to develop an understanding of their problem space.

Klaviyo is a communications platform that allows customers to personalize email and text messages powered by data. In this episode, Mike talks about how to ask research questions to get at what customers actually need. Mike also offers some excellent “getting started” techniques for conducting interviews (qualitative research), the kinds of things to be aware of and avoid when interviewing users, and some examples of the types of findings you might learn. He also gives us some examples of how these research insights become features or solutions in the product, and how they interpret whether their design choices are actually useful and usable once a customer interacts with them. I really enjoyed Mike’s take on designing data-driven solutions, his ideas on data literacy (for both designers and users), and hearing about the types of dinner conversations he has with his wife, who is an economist ;-) . Check out our conversation for Mike’s take on the relevance of research for data products and user experience. 

In this episode, we cover:

Using “small data” such as qualitative user feedback to improve UX and data products—and the #1 way qualitative data beats quantitative data (01:45) Mike explains what Klaviyo is, and gives an example of how they use qualitative information to inform the design of this communications product (03:38) Mike discusses Klaviyo data scientists doing research and their methods for conducting research with their customers (09:45) Mike’s tips on what to avoid when you’re conducting research so you get objective, useful feedback on your data product (12:45) Why dashboards are Mike’s pet peeve (17:45) Mike’s thoughts about data illiteracy, how much design needs to accommodate it, and how design can help with it (22:36) How Mike conveys the research to other teams that help mitigate risk (32:00) Life with an economist! (36:00) What the UX and design community needs to know about data (38:30)

Quotes from Today’s Episode “I actually tell my team never to do any qualitative research around preferences…Preferences are usually something that you’re not going to get a reliable enough sample from if you’re just getting it qualitatively, just because preferences do tend to vary a lot from individual to individual; there’s lots of other factors.” - Mike (@mikeoren) (03:05)

“[Discussing a product design choice influenced by research findings]: Three options gave [the customers a] feeling of more control. In terms of what actual options they wanted, two options was really the most practical, but the thing was that we weren’t really answering the main question that they had, which was what was going to happen with their data if they restarted the test with a new algorithm that was being used. That was something that we wouldn’t have been able to identify if we were only looking at the quantitative data or only surveying them; we had to get them to voice their concerns about it.” - Mike (@mikeoren) (07:00)

“When people create dashboards, they stick everything on there. If a stakeholder within the organization asked for a piece of data, that goes on the dashboard. If one time a piece of information was needed with other pieces of information that are already on the dashboard, that now gets added to the dashboard. And so you end up with dashboards that just have all these different things on them…you no longer have a clear line of signal.” - Mike (@mikeoren) (17:50)

“Part of the experience we need to talk about when we talk about experiencing data is that the experience can happen in additional vehicles besides a dashboard: a text message, an email notification—there are other ways to experience the effects of good, intelligent data product work. Pushing the right information at the right time instead of all the information all the time.” - Brian (@rhythmspice) (20:00)

“[Data illiteracy is] everyone’s problem. Depending upon what type of data we’re talking about, and what that product is doing, if an organization is truly trying to make data-driven decisions, but then they haven’t trained their leaders to understand the data in the right way, then they’re not actually making data-driven decisions; they’re really making instinctual decisions, or they’re pretending that they’re using the data.” - Mike (@mikeoren) (23:50)

“Sometimes statistical significance doesn’t matter to your end-users. More often than not, organizations aren’t looking for 95% significance. Usually, 80% is actually good enough for most business decisions. Depending upon the cost of getting a high level of confidence, they might not even really value that additional 15% significance.” - Mike (@mikeoren) (31:06)

“In order to effectively make software easier for people to use, to make it useful to people, [designers have] to learn a minimum amount about that medium in order to start crafting those different pieces of the experience that we’re preparing to provide value to people. We’re running into the same thing with data applications where it’s not enough to just know that numbers exist and those are a thing, or to know some graphic primitives of line charts, bar charts, et cetera. As a designer, we have to understand that medium well enough that we can have a conversation with our partners on the data science team.” - Mike (@mikeoren) (39:30)

For Danielle Crop, the Chief Data Officer of Albertsons, drawing distinctions between “digital” and “data” only limits the ability of an organization to create useful products. One of the reasons I asked Danielle on the show is her background as a CDO and former SVP of digital at AMEX, where she also managed product and design groups. My theory is that data leaders who have been exposed to the worlds of software product and UX design are prone to approach their data product work differently, and so that’s what we dug into on this episode. It didn’t take long for Danielle to share how she pushes her data science team to collaborate with business product managers for a “cross-functional, collaborative” end result. This also means getting the team to understand what their models are personalizing, and how customers experience the data products they use. In short, for her, it is about getting the data team to focus on “outcomes” vs. “outputs.”

Scaling some of the data science and ML modeling work at Albertsons is a big challenge, and we talked about one of the big use cases she is trying to enable for customers, as well as one “real-life” non-digital experience that her team’s data science efforts are behind.

The big takeaway for me here was hearing how a CDO like Danielle is really putting customer experience and the company’s brand at the center of their data product work, as opposed to solely focusing on ML model development, dashboard/BI creation, and seeing data as a raw ingredient that lives in a vacuum isolated from people.

In this episode, we cover:

Danielle’s take on the “D” in CDO: is the distinction between “digital” and “data” even relevant, especially for a food and drug retailer? (01:25) The role of data product management and design in her org and how UX (i.e. shopper experience) is influenced by and considered in her team’s data science work (06:05) How Danielle’s team thinks about “customers,” particularly in the context of internal stakeholders vs. grocery shoppers (10:20) Danielle’s current and future plans for bringing her data team into stores to better understand shoppers and customers (11:11) How Danielle’s data team works with the digital shopper experience team (12:02) “Outputs” versus “outcomes” for product managers, data science teams, and data products (16:30) Building customer loyalty, in-store personalization, and long term brand interaction with data science at Albertsons (20:40) How Danielle and her team at Albertsons measure the success of their data products (24:04) Finding the problems, building the solutions, and connecting the data to the non-technical side of the company (29:11)

Quotes from Today’s Episode “Data always comes from somewhere, right? It always has a source. And in our modern world, most of that source is some sort of digital software. So, to distinguish your data from its source is not very smart as a data scientist. You need to understand your data very well, where it came from, how it was developed, and software is a massive source of data. [As a CDO], I think it’s not important to distinguish between [data and digital]. It is important to distinguish between roles and responsibilities, you need different skills for these different areas, but to create an artificial silo between them doesn’t make a whole lot of sense to me.”- Danielle  (03:00)

“Product managers need to understand what the customer wants, what the business needs, and how to pass that along to data scientists, and data scientists need to understand how that’s affecting business outcomes. That’s how I see this all working. And it depends on what type of models they’re customizing and building, right? Are they building personalization models that are going to be a digital asset? Are they building automation models that will go directly to some sort of operational activity in the store? What are they trying to solve?” - Danielle (06:30)

“In a company that sells products—groceries—to individuals, personalization is a huge opportunity. How do we make that experience, both in-digital and in-store, more relevant to the customer, more sticky and build loyalty with those customers? That’s the core problem, but underneath that is you got to build a lot of models that help personalize that experience. When you start talking about building a lot of different models, you need scale.”  - Danielle (9:24)

“[Customer interaction in the store] is a true big data problem, right, because you need to use the WiFi devices, et cetera, that you have in store that are pinging the devices at all times, and it’s a massive amount of data. Trying to weed through that and find the important signals that help us to actually drive that type of personalized experience is challenging. No one’s gotten there yet. I hope that we’ll be the first.” - Danielle (19:50)

“I can imagine a checkout clerk who doesn’t want to talk to the customer, despite a data-driven suggestion appearing on the clerk’s monitor as to how to personalize a given customer interaction. The recommendation suggested to the clerk may be ‘accurate’ from a data science point of view, but if the clerk doesn’t actually act on it, then the data product didn’t provide any value. When I train people in my seminar, I try to get them thinking about that last mile. It may not be data science work, and maybe you have a big enough org where that clerk/customer experience is someone else’s responsibility, but being aware that this is a fault point and having a cross-team perspective is key.” - Brian @rhythmspice (24:50)

“We’re going through a moment in time in which trust in data is shaky. What I’d like people to understand and know on a broader philosophical level, is that in order to be able to understand data and use it to make decisions, you have to know its source. You have to understand its source. You have to understand the incentives around that source of data….you have to look at the data from the perspective of what it means and what the incentives were for creating it, and then analyze it, and then give an output. And fortunately, most statisticians, most data scientists, most people in most fields that I know, are incredibly motivated to be ethical and accurate in the information that they’re putting out.” - Danielle (34:15)

Today, I’m flying solo in order to introduce you to CED: my three-part UX framework for designing your ML / predictive / prescriptive analytics UI around trust, engagement, and indispensability. Why this, why now? I have had several people tell me that this has been incredibly helpful to them in designing useful, usable analytics tools and decision support applications. 

I have written about the CED framework before at the following link:

https://designingforanalytics.com/ced

There you will find an example of the framework put into a real-world context. In this episode, I wanted to add some extra color to what is discussed in the article. If you’re an individual contributor, the best part is that you don’t have to be a professional designer to begin applying this to your own data products. And for leaders of teams, you can use the ideas in CED as a “checklist” when trying to audit your team’s solutions in the design phase—before it’s too late or expensive to make meaningful changes to the solutions. 

CED is definitely easier to implement if you understand the basics of human-centered design, including research, problem finding and definition, journey mapping, consulting, facilitation, etc. If you need a step-by-step method to develop these foundational skills, my training program, Designing Human-Centered Data Products, might help. It comes in two formats: a Self-Guided Video Course and a bi-annual Instructor-Led Seminar.

Quotes from Today’s Episode “‘How do we visualize the data?’ is the wrong starting question for designing a useful decision support application. That makes all kinds of assumptions that we have the right information, that we know what the users' goals and downstream decisions are, and we know how our solution will make a positive change in the customer or users’ life.”- Brian (@rhythmspice) (02:07)

“The CED is a UX framework for designing analytics tools that drive decision-making. Three letters, three parts: Conclusions (C), Evidence (E), and Data (D). The tough pill for some technical leaders to swallow is that the application, tool, or product they are making may need to present what I call a ‘conclusion’—or if you prefer, an ‘opinion.’ Why? Because many users do not want an ‘exploratory’ tool—even when they say they do. They often need an insight to start with, before exploration time becomes valuable.” - Brian (@rhythmspice) (04:00)

“CED requires you to do customer and user research to understand what the meaningful changes, insights, and things that people want or need actually are. Well designed ‘Conclusions’—when experienced in an analytics tool using the CED framework—often manifest themselves as insights such as unexpected changes, confirmation of expected changes, meaningful change versus meaningful benchmarks, scoring how KPIs track to predefined and meaningful ranges, actionable recommendations, and next best actions. Sometimes these Conclusions are best experienced as charts and visualizations, but not always—and this is why visualizing the data rarely is the right place to begin designing the UX.” - Brian (@rhythmspice) (08:54)

“If I see another analytics tool that promises ‘actionable insights’ but is primarily experienced as a collection of gigantic data tables with 10, 20, or 30+ columns of data to parse, your design is almost certainly going to frustrate, if not alienate, your users. Not because all table UIs are bad, but because you’ve put a gigantic tool-time tax on the user, forcing them to derive what the meaningful conclusions should be.” - Brian (@rhythmspice) (20:20)

Why design matters in data products is a question that, at first glance, may not be easy to answer for some until they see users try to use ML models and analytics to make decisions. For Bill Báez, a data scientist and VP of Strategy at Ascend Innovations, the realization that design and UX matter in this context grew over the course of a few years. Bill’s origins in the Air Force, and his transition to Ascend Innovations, instilled lessons about the importance of using design thinking with both clients and users.

After observing solutions built in total isolation with zero empathy and knowledge of how they were being perceived in the wild, Bill realized the critical need to bring developers “upstairs” to actually observe the people using the solutions that were being built. 

Currently, Ascend Innovations’ consulting is primarily rooted in healthcare and community services, and in this episode, Bill provides some real-world examples where their machine learning and analytics solutions were informed by approaching the problems from a human-centered design perspective. Bill also dives into where he is on his journey to integrate his UX and data science teams at Ascend so they can create better value for their clients and their clients’ constituents.

Highlights in this episode include:

What caused Bill to notice design for the first time and its importance in data products (03:12) Bridging the gap between data science, UX, and the client’s needs at Ascend (08:07) How to deal with the “presenting problem” and working with feedback (16:00) Bill’s advice for getting designers, UX, and clients on the same page based on his experience to date (23:56) How Bill provides unity for his UX and data science teams (32:40) The effects of UX in medicine (41:00)

Quotes from Today’s Episode “My journey into Design Thinking started in earnest when I started at Ascend, but I didn’t really have the terminology to use. For example, Design Thinking and UX were actually terms I was not personally aware of until last summer. But now that I know and have been exposed to it and have learned more about it, I realize I’ve been doing a lot of that type of work in earnest since 2018.” - Bill (03:37)

“Ascend Innovations has always been product-focused, although again, services is our main line of business. As we started hiring a more dedicated UX team, people who’ve been doing this for their whole career, it really helped me to understand what I had experienced prior to coming to Ascend. Part of the time I was here at Ascend, that UX framework and that Design Thinking lens really brought a lot more firepower to what data science is trying to achieve at the end of the day.” - Bill (08:29)

“Clients were surprised that we were asking such rudimentary questions. They’ll say ‘Well, we’ve already talked about that,’ or, ‘It should be obvious,’ or ‘Well, why are you asking me such a simple question?’ And we had to explain to them that we wanted to start at the bottom to move to the top. We don’t want to start somewhere midway and get to the top. We want to make sure that we are all in alignment with what we’re trying to do, so we want to establish that baseline of understanding. So, we’re going to start off asking very simple questions and work our way up from there...” - Bill (21:09)

“We’re building a thing, but the thing only has value if it creates a change in the world. The world being, in the mind of the stakeholder, in the minds of the users, maybe some third parties that are affected by that stuff, but it’s the change that matters. So what is the better state we want in the future for our client or for our customers and users? That’s the thing we’re trying to create. Not the thing; the change from the thing is what we want, and getting to that is the hard part.” - Brian (@rhythmspice) (26:33)

“This is a gift that you’re giving to [stakeholders] to save time, to save money, to avoid building something that will never get used and will not provide value to them. You do need to push back against this and if they say no, that’s fine. Paint the picture of the risk, though, by not doing design. It’s very easy for us to build a ML model. It’s hard for us to build a model that someone will actually use to make the world better. And in this case, it’s healthcare or support, intervention support for addicts. ‘Do you really want a model, or do you want an improvement in the lives of these addicts?’ That’s ultimately where we’re going with this, and if we don’t do this, the risk of us pushing out an output that doesn’t get used is high. So, design is a gift, not a tax...” - Brian (@rhythmspice) (34:34)

“I’d say to anybody out there right now who’s currently working on data science efforts: the sooner you get your people comfortable with the idea of doing Design Thinking, get them implemented into the projects that are currently going on. [...] I think that will be a real game-changer for your data scientists and your organization as a whole...” - Bill (42:19)

Building a SaaS business that focuses on building a research tool, more than building a data product, is how Jonathan Kay, CEO and Co-Founder of Apptopia, frames his company’s work. Jonathan and I worked together when Apptopia pivoted from its prior business into a mobile intelligence platform for brands. Part of the reason I wanted to have Jonathan talk to you all is because I knew that he would strip away all the easy-to-see shine and varnish from their success and get really candid about what worked…and what hasn’t…during their journey to turn a data product into a successful SaaS business. So get ready: Jonathan is going to reveal the very curvy line that Apptopia has taken to get where they are today.

In this episode, Jonathan also describes one of the core product design frameworks that Apptopia is currently using to help deliver actionable insights to their customers. For Jonathan, Apptopia’s research-centric approach changes the ways in which their customers can interact with data and is helping eliminate the lull between “the why” and “the actioning” with data.

Here are some of the key parts of the interview:

An introduction to Apptopia and how they serve brands in the world of mobile app data (00:36) The current UX gaps that Apptopia is working to fill (03:32) How Apptopia balances flexibility with ease-of-use (06:22) How Apptopia establishes the boundaries of its product when it’s just one part of a user’s overall workflow (10:06) The challenge of “low use, low trust” and getting “non-data” people to act (13:45) Developing strong conclusions and opinions and presenting them to customers (18:10) How Apptopia’s product design process has evolved when working with data, particularly at the UI level (21:30) The relationship between Apptopia’s buyer, versus the users of the product, and how they balance the two (24:45) Jonathan’s advice for hiring good data product design and management staff (29:45) How data fits into Jonathan’s own decision making as CEO of Apptopia (33:21) Jonathan’s advice for emerging data product leaders (36:30)

Quotes from Today’s Episode  

“I want to just give you some props on the work that you guys have done and seeing where it's gone from when we worked together. The word grit, I think, is the word that I most associate with you and Eli [former CEO, co-founder] from those times. It felt very genuine that you believed in your mission and you had a long-term vision for it.” - Brian T. O’Neill (@rhythmspice) (02:08)

“A research tool gives you the ability to create an input, which might be, ‘I want to see how Netflix is performing.’ And then it gives you a bunch of data. And it gives you a good user experience that allows you to look for the answer to the question that’s in your head, but you need to start with a question. You need to know how to manipulate the tool. It requires a huge amount of experience and understanding of the data consumer in order to actually get the answer to the question. For me, that feels like a miss because I think the amount of people who need and can benefit from data, and the amount of people who know how to instrument the tools to get the answers from the data—well, I think there’s a huge disconnect in those numbers. And just like when I take my car to get serviced, I expect the car mechanic knows exactly what the hell is going on in there, right? Like, our obligation as a data provider should be to help people get closer to the answer. And I think we still have some room to go in order to get there.” - Jonathan Kay (@JonathanCKay) (04:54)

“You need to present someone the what, the why, etc.—then the research component [of your data product] is valuable. And so it’s not that having a research tool isn’t valuable. It’s just, you can’t have the whole thing be that. You need to give them the what and the why first.” - Jonathan Kay (@JonathanCKay) (08:45)

“You can’t put equal resources into everything. Knowing the boundaries of your data product is important, but it’s a hard thing to know sometimes where to draw those. A leader has to ask, ‘am I getting outside of my sweet spot? Is this outside of the mission?’ Figuring out the right boundaries goes back to customer research.” - Brian T. O’Neill (@rhythmspice) (12:54)

“What would I have done differently if I was starting Apptopia today? I would have invested into the quality of the data earlier. I let the product design move me into the clouds a little bit, because sometimes you're designing a product and you're designing visuals, but we were doing it without real data. One of the biggest things that I've learned over a lot of mistakes over a long period of time, is that we've got to incorporate real data in the design process.” - Jonathan Kay (@JonathanCKay) (20:09)

“We work with one of the biggest food manufacturer distributors in the world, and they were choosing between us and our biggest competitor, and what they essentially did was [say], ‘I need to put this report together every two weeks. I used your competitor’s platform during a trial and your platform during the trial, and I was able to do it two hours faster in your platform, so I chose you—because all the other checkboxes were equal.’ At the end of the day, if we could get two hours a week back by using our tool, saving time and saving money and making better decisions, they’re all equal ROI contributors.” - Jonathan Kay on UX (@JonathanCKay) (27:23)

“In terms of our product design and management hires, we're typically looking for people who have not worked at one company for 10 years. We've actually found a couple phenomenal designers that went from running their own consulting company to wanting to join full time. That was kind of a big win because one of them had a huge breadth of experience working with a bunch of different products in a bunch of different spaces.”- Jonathan Kay (@JonathanCKay) (30:34)

“In terms of how I use data when making decisions for Apptopia, here’s an example. If you break our business down into different personas, my understanding one time was that one of our personas was more stagnant. The data, however, did not support that. And so we’re having a resource planning meeting, and I’m saying, ‘let’s pull back resources a little bit,’ but [my team is] showing me data that says my assumption on that customer segment is actually incorrect. I think entrepreneurs and passionate people need data more because we have so much conviction in our decisions—and because of that, I’m more likely to make bad decisions. Theoretically, good entrepreneurs should have good instincts, and you need to trust those, but what I’m saying is, you also need to check those. It’s okay to make sure that your instinct is correct, right? And one of the ways that I’ve gotten more mature is by forcing people to show me data to back up my decision in either direction and being comfortable being wrong. And I am wrong at least half of the time with those things!” - Jonathan Kay (@JonathanCKay) (34:09)