Summary
NPS, SUS, HEART, CSAT, CES, CLI—the list of ambiguous acronyms goes on and on. Companies are under more pressure than ever to measure and quantify their results and their interactions with customers, but finding the right metric and the right approach is a challenging process that risks leaving key factors behind as you commit to a single standard. What if there were another way to measure digital transformation and how people perceive your services? That is the focus of this presentation. You will learn how to navigate the pitfalls of standardized metrics—with their pros and cons—and how to build and implement a custom metric framework that incorporates the best aspects of Net Promoter Score, Customer Effort Score, System Usability Scale, and others into one cohesive, modern whole aimed at improving the actionability and traceability of your operations and customer services. Attendee takeaways include: how to develop a custom framework without losing benchmarking capability, how to identify gaps and needs through a framework that is not focused solely on operational numbers, and how to wade through the murky waters of digital experience quantification. Bring your best questions and leave with actionable insights you can put to use immediately.
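As a rough illustration of the kind of blending the talk describes, here is a minimal sketch, assuming NPS-, CES-, and SUS-style signals are first normalized to a common 0–100 scale and then weighted. The weights, function names, and example values are hypothetical and are not taken from the presentation.

```python
# Hypothetical sketch: normalize heterogeneous survey signals to a 0-100
# scale and blend them into a single composite experience score.
# Weights and signal names are illustrative only.

def likert_to_100(score: float, scale_max: int = 5) -> float:
    """Map a 1..scale_max Likert response onto a 0-100 scale."""
    return (score - 1) / (scale_max - 1) * 100

def composite_score(nps_0_10: float, ces_1_5: float, sus_0_100: float) -> float:
    """Blend three normalized signals with illustrative weights."""
    nps_norm = nps_0_10 / 10 * 100      # 0-10 likelihood-to-recommend rating
    ces_norm = likert_to_100(ces_1_5)   # 1-5 effort rating (higher = easier)
    weights = {"nps": 0.3, "ces": 0.3, "sus": 0.4}
    return (weights["nps"] * nps_norm
            + weights["ces"] * ces_norm
            + weights["sus"] * sus_0_100)

print(round(composite_score(nps_0_10=8, ces_1_5=4, sus_0_100=72.5), 1))  # 75.5
```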
Key Insights
• Standard metrics like NPS often mask important differences in user experience across customer segments such as new and existing customers.
• Metrics should separate attitudinal data (how users feel) from behavioral data (what users do) to avoid conflation and misinterpretation.
• A custom digital metric framework benefits from integrating qualitative sentiment analysis of user verbatims to add context beyond the numbers.
• Balancing metric complexity internally with simplified reporting for business stakeholders helps drive actionable insights without overwhelming decision makers.
• Applying established frameworks like Google HEART or NASA TLX can provide helpful reference points for measuring user experience multidimensionally.
• Metrics need to remain valid over time, allowing benchmarking across consistent, relative timeframes to detect meaningful trends.
• Measuring customer sentiment includes factors like perceived trustworthiness, visual appeal, clarity, and satisfaction along specific user journeys.
• Conversion rates alone don't tell the full story—total conversions and behavioral context must be considered when evaluating performance.
• Business leaders often prefer single-number metrics, so combining simple scores with human stories and verbatim feedback aids adoption.
• Designing metrics starts with defining clear goals, followed by specifying the measure type, signals, timeframe, and scope for each metric (see the sketch after this list).
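To make the last insight more concrete, here is a minimal sketch of what a metric definition might look like as a simple record, keeping attitudinal and behavioral measures explicitly labeled. The field names and example values are hypothetical, not drawn from the talk.

```python
# Hypothetical sketch of a metric definition record, separating
# attitudinal signals (how users feel) from behavioral signals
# (what users do). Field names and example values are illustrative.
from dataclasses import dataclass

@dataclass
class MetricSpec:
    goal: str            # the decision this metric should inform
    measure_type: str    # "attitudinal" or "behavioral"
    signals: list[str]   # raw inputs the metric is derived from
    timeframe: str       # relative window used for benchmarking
    scope: str           # journey, segment, or product area covered

task_ease = MetricSpec(
    goal="Track perceived effort on the onboarding journey",
    measure_type="attitudinal",
    signals=["5-point ease rating", "verbatim comments"],
    timeframe="rolling 30 days, compared quarter over quarter",
    scope="new customers, onboarding journey",
)
```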
Notable Quotes
"A user-centered measurement should not be one dimensional because people are complex and behavior is even more complex."
"The average NPS score might rise while some key customer groups actually have declining satisfaction."
"A metric is usually the result of a relation between two measures, not to be confused with the raw measurement itself."
"People tend to inflate negative feelings in surveys which may not reflect their actual behavior."
"We wanted the metric to be simple for users to complete, so we chose a five-point Likert scale."
"It's easy for business to focus on moving a single needle rather than juggling multiple complex factors."
"Stories and verbatims help drive the point home alongside metrics to make the results more meaningful for stakeholders."
"Google HEART is a great framework but difficult to implement fully because it relies on many signals that aren't always available."
"Metrics should be relative to a time period and benchmarked over time to retain their meaning and validity."
"Designing a metric starts with clear definitions of quantity, signals, timeframe, and scope."