Summary
Quantitative instruments are frequently sought because (1) they can be fielded quickly to large numbers of people, and (2) when carefully sampled, their results can generalize to the population of users or customers. However, because decision-makers need results quickly, the focus often falls on speed to launch, with little depth given to instrument development or to investigating the validity evidence. In this session, I will share a framework that centers validity and is necessarily a mixed-methods approach to research. I will also share ideas on how to scale the research over time so that findings and insights can be delivered to stakeholders iteratively, while also iteratively informing one another in a qual-quant research dance that brings more trustworthy, user-centered evidence to decision-makers. Finally, I will share ideas for a course I am developing to support qualitative researchers in becoming more mixed in their approach.
Key Insights
- Validity in surveys is multi-faceted, relying on five evidence sources: test content, response processes, internal structure, relation to other variables, and consequences of testing.
- Mixed methods combining qualitative cognitive interviews and quantitative analysis enhance survey validity and build stakeholder trust.
- Breaking down survey validation efforts across multiple teams makes the process more manageable and effective.
- Iterative survey development over multiple rounds helps improve instrument quality while balancing the need for timely insights.
- Qualitative research plays a vital role even within quantitative validity frameworks by revealing respondent interpretation and cognitive processes.
- Careful stakeholder engagement and communicating rapid but incremental insights increases buy-in for rigorous validity processes.
- Survey validity is closely tied to ethical considerations, including the impact on respondents and responsible data use.
- Significant product or user base changes necessitate revisiting and revising surveys to maintain validity.
- Statistical methods like factor analysis and Rasch modeling help detect underlying constructs and response biases across subpopulations.
- It is often necessary to accept imperfect early versions of surveys, improving them progressively while acknowledging limits to change-over-time comparisons.
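To make the factor-analysis insight above concrete, here is a minimal sketch of how eigenvalues of an item correlation matrix can suggest how many underlying constructs a survey measures. It uses simulated Likert-style data (two hypothetical latent constructs, three items each) and the Kaiser "eigenvalue greater than 1" rule of thumb, a simplified stand-in for the fuller factor-analytic and Rasch workflows the talk refers to.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500  # number of simulated respondents

# Simulate two latent constructs (e.g. "ease of use" and "trust"),
# each driving three survey items. Real data would be actual responses.
f1 = rng.normal(size=n)
f2 = rng.normal(size=n)
noise = rng.normal(scale=0.5, size=(n, 6))
items = np.column_stack([
    0.9 * f1, 0.9 * f1, 0.9 * f1,   # items 1-3 load on construct 1
    0.9 * f2, 0.9 * f2, 0.9 * f2,   # items 4-6 load on construct 2
]) + noise

# Eigenvalues of the item correlation matrix; the Kaiser criterion
# (eigenvalue > 1) gives a rough count of underlying constructs.
corr = np.corrcoef(items, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)[::-1]  # descending order
n_factors = int(np.sum(eigenvalues > 1))
print(n_factors)  # recovers the 2 simulated constructs
```

In practice one would follow this kind of exploratory check with a proper factor or Rasch analysis, and, as the talk suggests, rerun it per subpopulation to look for response biases.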
Notable Quotes
"Validity is the degree to which evidence and theory support the interpretations of test scores for proposed uses."
"Qualitative research is vital for establishing validity in mixed methods because it helps us understand how respondents interpret questions."
"Surveys are products too—they need to be iteratively tested and refined."
"You don’t know what you don’t know—surveys have blind spots that qualitative techniques can help reveal."
"If you have broader research goals shared across quant and qual teams, then the overlap supports answering difficult validity questions."
"Conversations with stakeholders need to focus on delivering usable information quickly, not just on the validity process itself."
"Consequences of testing include ethical considerations about how survey responses affect user experience and product decisions."
"It’s better to partner with quantitative experts if you don’t have that expertise yourself to understand internal structure analyses."
"If the survey doesn’t work well for subpopulations, focus initially on groups where it’s reliable while you investigate others."
"Improving surveys iteratively can undermine longitudinal measures, so it’s critical to balance validity and tracking over time."