Summary
NPS, SUS, HEART, CSAT, CES, CLI: the list of ambiguous acronyms goes on and on. Companies are under more pressure than ever to measure and quantify their results and their interactions with customers, but finding the right metric and the right approach is challenging, and committing to a single standard risks leaving key factors behind. What if there is another way to measure digital transformation and how people perceive your services? That is the focus of this presentation. You will learn how to navigate the pitfalls of standardized metrics, with their pros and cons, and how to build and implement a custom metric framework that combines the best aspects of Net Promoter Score, Customer Effort Score, System Usability Scale, and others into one cohesive, modern whole aimed at improving the actionability and traceability of your operations and customer services. Attendee takeaways include: how to develop a custom framework without losing benchmarking capability; how to identify gaps and needs that a framework focused solely on operational numbers would miss; and how to wade through the murky waters of digital experience quantification. Bring your best questions and leave with actionable insights that you can put to use immediately.
Key Insights
- Standard metrics like NPS often mask important differences in user experience across customer segments, such as new versus existing customers.
- Metrics should separate attitudinal data (how users feel) from behavioral data (what users do) to avoid conflation and misinterpretation.
- A custom digital metric framework benefits from integrating qualitative sentiment analysis of user verbatims to add context beyond numbers.
- Balancing metric complexity internally with simplified reporting for business stakeholders helps drive actionable insights without overwhelming decision makers.
- Applying established frameworks like Google HEART or NASA TLX can provide helpful reference points for measuring user experience multidimensionally.
- Metrics need to be valid over time, allowing benchmarking across consistent, relative timeframes to detect meaningful trends.
- Measuring customer sentiment includes factors like perceived trustworthiness, visual appeal, clarity, and satisfaction along specific user journeys.
- Conversion rates alone don't tell the full story; total conversions and behavioral context must be considered when evaluating performance.
- Business leaders often prefer single-number metrics, so combining simple scores with human stories and verbatim feedback aids adoption.
- Designing metrics starts from defining clear goals, followed by specifying measure type, signals, timeframe, and scope for each metric.
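The first insight, that an overall NPS can mask diverging segments, can be illustrated with a minimal sketch. The segment names and scores below are fabricated for illustration; only the NPS formula (percentage of promoters, scores 9-10, minus percentage of detractors, scores 0-6) follows the standard definition.

```python
# Minimal sketch: an overall NPS can hide a declining segment.
# Scores use the standard 0-10 scale: promoters 9-10, detractors 0-6.

def nps(scores):
    """Net Promoter Score: % promoters minus % detractors."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

# Illustrative (fabricated) survey responses split by customer segment.
responses = {
    "new":      [9, 10, 9, 10, 9, 10, 9, 8],  # enthusiastic new customers
    "existing": [6, 7, 6, 5, 9, 6],           # long-time customers slipping
}

all_scores = [s for seg in responses.values() for s in seg]
print(f"overall NPS: {nps(all_scores):+.0f}")
for segment, scores in responses.items():
    print(f"{segment:>8} NPS: {nps(scores):+.0f}")
```

Here the blended score looks healthy while the existing-customer segment is deeply negative, which is exactly why the talk recommends reporting segment-level breakdowns alongside the headline number.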
Notable Quotes
"A user-centered measurement should not be one dimensional because people are complex and behavior is even more complex."
"The average NPS score might rise while some key customer groups actually have declining satisfaction."
"A metric is usually the result of a relation between two measures, not to be confused with the raw measurement itself."
"People tend to inflate negative feelings in surveys which may not reflect their actual behavior."
"We wanted the metric to be simple for users to complete, so we chose a five-point Likert scale."
"It's easy for business to focus on moving a single needle rather than juggling multiple complex factors."
"Stories and verbatims help drive the point home alongside metrics to make the results more meaningful for stakeholders."
"Google HEART is a great framework but difficult to implement fully because it relies on many signals that aren't always available."
"Metrics should be relative to a time period and benchmarked over time to retain their meaning and validity."
"Designing a metric starts with clear definitions of quantity, signals, timeframe, and scope."