Summary
NPS, SUS, HEART, CSAT, CES, CLI: the list of ambiguous acronyms goes on and on. Companies are under more pressure than ever to measure and quantify their results and their interactions with customers, but finding the right metric and the right approach is a challenging process that risks leaving key factors behind as you commit to a single standard. What if there were another way to measure digital transformation and how people see your services? That is the focus of this presentation. You will learn how to navigate the pitfalls of standardized metrics, weighing their pros and cons, and how to build and implement a custom metric framework that combines the best aspects of Net Promoter Score, Customer Effort Score, System Usability Scale, and others into one cohesive, modern whole aimed at improving the actionability and traceability of your operations and customer services. Attendee takeaways include: how to develop a custom framework without losing benchmarking capability, how to identify gaps and needs with a framework that is not focused solely on operational numbers, and how to wade through the murky waters of digital experience quantification. Bring your best questions and leave with actionable insights that you can put to use immediately.
Key Insights
• Standard metrics like NPS often mask important differences in user experience across customer segments such as new and existing customers (see the first sketch after this list).
• Metrics should separate attitudinal data (how users feel) from behavioral data (what users do) to avoid conflation and misinterpretation.
• A custom digital metric framework benefits from integrating qualitative sentiment analysis of user verbatims to add context beyond numbers.
• Balancing metric complexity internally with simplified reporting for business stakeholders helps drive actionable insights without overwhelming decision makers.
• Applying established frameworks like Google HEART or NASA TLX can provide helpful reference points for measuring user experience multidimensionally.
• Metrics need to be valid over time, allowing benchmarking across consistent, relative timeframes to detect meaningful trends.
• Measuring customer sentiment includes factors like perceived trustworthiness, visual appeal, clarity, and satisfaction along specific user journeys.
• Conversion rates alone don't tell the full story—total conversions and behavioral context must be considered when evaluating performance.
• Business leaders often prefer single-number metrics, so combining simple scores with human stories and verbatim feedback aids adoption.
• Designing metrics starts from defining clear goals, followed by specifying measure type, signals, timeframe, and scope for each metric (see the second sketch after this list).
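The first insight above is easiest to see with a small worked example. The sketch below is a hypothetical illustration (the segment names and response scores are invented, not taken from the talk) of how an aggregate NPS can look healthy while one customer group is clearly declining:

```python
# Hypothetical illustration: an overall NPS can look fine while a key segment declines.
# NPS = % promoters (scores 9-10) minus % detractors (scores 0-6) on a 0-10 scale.

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Invented survey responses, grouped by segment (not real data).
responses = {
    "new_customers":      [9, 10, 9, 8, 10, 9, 10, 9],  # larger, enthusiastic segment
    "existing_customers": [6, 7, 5, 8, 6, 7],           # smaller, declining segment
}

all_scores = [s for scores in responses.values() for s in scores]
print("Overall NPS:", nps(all_scores))          # a positive headline number
for segment, scores in responses.items():
    print(f"{segment}: NPS {nps(scores)} (n={len(scores)})")
```

Reporting the per-segment breakdown alongside the headline number, together with verbatims, is what keeps the single score from misleading stakeholders.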
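The last insight describes a design checklist: goal first, then measure type, signals, timeframe, and scope. As a rough sketch of how that checklist might be captured and turned into a reportable number, assuming a five-point Likert survey as the signal (the field names, journey, and averaging scheme below are illustrative assumptions, not the speaker's implementation):

```python
# Illustrative sketch of a metric definition following the checklist from the talk:
# goal -> measure type -> signals -> timeframe -> scope. Field names are assumptions.
from dataclasses import dataclass
from statistics import mean

@dataclass
class MetricDefinition:
    goal: str          # what the metric is supposed to tell us
    measure_type: str  # "attitudinal" (how users feel) vs "behavioral" (what users do)
    signals: list      # raw measures the metric is derived from
    timeframe: str     # the relative period it is benchmarked over
    scope: str         # which journey or segment it applies to

perceived_clarity = MetricDefinition(
    goal="Track whether the checkout journey feels clear and trustworthy",
    measure_type="attitudinal",
    signals=["5-point Likert: clarity", "5-point Likert: trust"],
    timeframe="rolling 30 days",
    scope="checkout journey, all segments",
)

# Hypothetical responses on a 1-5 Likert scale for the two signals.
clarity_scores = [4, 5, 3, 4, 4, 5]
trust_scores = [3, 4, 4, 2, 4, 3]

# One simple way to collapse the signals into a single reportable score:
# average each signal, then average the signal means (an assumption, not a standard).
score = mean([mean(clarity_scores), mean(trust_scores)])
print(f"{perceived_clarity.scope}: {score:.2f} / 5 ({perceived_clarity.timeframe})")
```

Keeping the attitudinal definition explicit and separate from behavioral signals such as conversion makes it easier to hand business stakeholders one simple number while retaining the richer context internally.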
Notable Quotes
"A user-centered measurement should not be one dimensional because people are complex and behavior is even more complex."
"The average NPS score might rise while some key customer groups actually have declining satisfaction."
"A metric is usually the result of a relation between two measures, not to be confused with the raw measurement itself."
"People tend to inflate negative feelings in surveys which may not reflect their actual behavior."
"We wanted the metric to be simple for users to complete, so we chose a five-point Likert scale."
"It's easy for business to focus on moving a single needle rather than juggling multiple complex factors."
"Stories and verbatims help drive the point home alongside metrics to make the results more meaningful for stakeholders."
"Google HEART is a great framework but difficult to implement fully because it relies on many signals that aren't always available."
"Metrics should be relative to a time period and benchmarked over time to retain their meaning and validity."
"Designing a metric starts with clear definitions of quantity, signals, timeframe, and scope."