Summary
NPS, SUS, HEART, CSAT, CES, CLI: the list of ambiguous acronyms goes on and on. Companies are under more pressure than ever to measure and quantify their results and their customer interactions, but finding the right metric and the right approach is challenging, and committing to a single standard risks leaving key factors behind. What if there were another way to measure digital transformation and how people see your services? That is the focus of this presentation. You will learn how to navigate the pitfalls of standardized metrics, with their pros and cons, and how to build and implement a custom metric framework that combines the best aspects of Net Promoter Score, Customer Effort Score, System Usability Scale, and others into one cohesive, modern whole aimed at improving the actionability and traceability of your operations and customer services. Attendee takeaways include: how to develop a custom framework without losing benchmarking capability, how to identify gaps and needs that a framework focused solely on operational numbers would miss, and how to wade through the murky waters of digital experience quantification. Bring your best questions and leave with actionable insights you can put to use immediately.
Key Insights
- Standard metrics like NPS often mask important differences in user experience across customer segments, such as new versus existing customers.
- Metrics should separate attitudinal data (how users feel) from behavioral data (what users do) to avoid conflation and misinterpretation.
- A custom digital metric framework benefits from integrating qualitative sentiment analysis of user verbatims to add context beyond numbers.
- Balancing metric complexity internally with simplified reporting for business stakeholders helps drive actionable insights without overwhelming decision makers.
- Applying established frameworks like Google HEART or NASA TLX can provide helpful reference points for measuring user experience multidimensionally.
- Metrics need to remain valid over time, allowing benchmarking across consistent, relative timeframes to detect meaningful trends.
- Measuring customer sentiment includes factors like perceived trustworthiness, visual appeal, clarity, and satisfaction along specific user journeys.
- Conversion rates alone don't tell the full story; total conversions and behavioral context must be considered when evaluating performance.
- Business leaders often prefer single-number metrics, so combining simple scores with human stories and verbatim feedback aids adoption.
- Designing metrics starts from defining clear goals, followed by specifying measure type, signals, timeframe, and scope for each metric.
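The last two insights can be made concrete with a small sketch: a metric definition that starts from a goal and then pins down measure type, signals, timeframe, and scope, plus a metric computed as a relation between two measures (as the talk's quotes describe). All names and example values below are illustrative assumptions, not material from the presentation.

```python
from dataclasses import dataclass

@dataclass
class MetricSpec:
    """A metric definition: goal first, then how it is measured."""
    goal: str           # what the metric should tell us
    measure_type: str   # "attitudinal" (how users feel) or "behavioral" (what users do)
    signals: list       # raw measures that feed the metric
    timeframe: str      # e.g. a rolling window, so benchmarks stay comparable
    scope: str          # the journey or segment the metric covers

def ratio_metric(numerator: float, denominator: float) -> float:
    """A metric as a relation between two measures, e.g. conversions / sessions."""
    if denominator == 0:
        raise ValueError("denominator measure is zero")
    return numerator / denominator

# Hypothetical attitudinal metric for one journey and segment.
checkout_ease = MetricSpec(
    goal="Track perceived effort in checkout",
    measure_type="attitudinal",
    signals=["post-checkout effort rating", "verbatim comments"],
    timeframe="rolling 30 days",
    scope="checkout journey, existing customers",
)

# Behavioral companion for the same scope: conversion rate from two raw measures.
conversion_rate = ratio_metric(420, 12_000)  # conversions / sessions -> 0.035
```

Keeping the raw measures (420 conversions, 12,000 sessions) alongside the derived rate preserves the behavioral context the insights call for: a rising rate on falling traffic reads very differently from a rising rate on stable traffic.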
Notable Quotes
"A user-centered measurement should not be one dimensional because people are complex and behavior is even more complex."
"The average NPS score might rise while some key customer groups actually have declining satisfaction."
"A metric is usually the result of a relation between two measures, not to be confused with the raw measurement itself."
"People tend to inflate negative feelings in surveys which may not reflect their actual behavior."
"We wanted the metric to be simple for users to complete, so we chose a five-point Likert scale."
"It's easy for business to focus on moving a single needle rather than juggling multiple complex factors."
"Stories and verbatims help drive the point home alongside metrics to make the results more meaningful for stakeholders."
"Google HEART is a great framework but difficult to implement fully because it relies on many signals that aren't always available."
"Metrics should be relative to a time period and benchmarked over time to retain their meaning and validity."
"Designing a metric starts with clear definitions of quantity, signals, timeframe, and scope."
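One quote mentions choosing a five-point Likert scale so the metric stays simple for users to complete. A common way to roll such responses into a single benchmarkable number is a "top-two-box" share; this is a standard survey technique offered as an assumption here, not something the talk specifies.

```python
def top_two_box(responses):
    """Share of 1-5 Likert responses that are favorable (4 or 5)."""
    if not responses:
        raise ValueError("no responses")
    if any(r < 1 or r > 5 for r in responses):
        raise ValueError("responses must be on a 1-5 scale")
    favorable = sum(1 for r in responses if r >= 4)
    return favorable / len(responses)

# Example: ten hypothetical satisfaction ratings from one journey.
score = top_two_box([5, 4, 4, 3, 2, 5, 4, 1, 3, 5])  # 6 of 10 favorable -> 0.6
```

A single share like this travels well to stakeholders who want one number, while the underlying response distribution and verbatims stay available for the richer internal view the talk recommends.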