Rosenverse


A Mixed Method Approach to Validity to Help Build Trust
Friday, April 28, 2023 • QuantQual Interest Group (Rosenfeld Community)
Speakers: Chris Engledowl

Summary

Quantitative instruments are frequently sought because (1) they can be fielded quickly to large numbers of people, and (2) when carefully sampled, their results can generalize to the population of users or customers. However, because the focus is often on speed to launch (decision-makers need results quickly), little depth goes into their development, and the validity evidence is rarely investigated. In this session, I will share a framework that centers validity and is necessarily a mixed methods approach to research. I will also share ideas on how to scale the research over time, so that findings and insights can be delivered to stakeholders iteratively while also informing one another in a qual-quant research dance that brings more trustworthy, user-centered evidence to decision-makers. Finally, I will share ideas for a course I am developing to support qualitative researchers in becoming more mixed in their approach.

Key Insights

  • Validity in surveys is multi-faceted, relying on five evidence sources: test content, response processes, internal structure, relation to other variables, and consequences of testing.

  • Mixed methods combining qualitative cognitive interviews and quantitative analysis enhance survey validity and build stakeholder trust.

  • Breaking down survey validation efforts across multiple teams makes the process more manageable and effective.

  • Iterative survey development over multiple rounds helps improve instrument quality while balancing the need for timely insights.

  • Qualitative research plays a vital role even within quantitative validity frameworks by revealing respondent interpretation and cognitive processes.

  • Careful stakeholder engagement and communicating rapid but incremental insights increases buy-in for rigorous validity processes.

  • Survey validity is closely tied to ethical considerations, including the impact on respondents and responsible data use.

  • Significant product or user base changes necessitate revisiting and revising surveys to maintain validity.

  • Statistical methods like factor analysis and Rasch modeling help detect underlying constructs and response biases across subpopulations.

  • It is often necessary to accept imperfect early versions of surveys, improving them progressively while acknowledging limits to change-over-time comparisons.
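The internal-structure insight above can be illustrated with a minimal exploratory check. This sketch is not from the talk: it simulates responses to six survey items driven by two hypothetical latent constructs, then applies the Kaiser criterion (count eigenvalues of the item correlation matrix greater than 1) to estimate how many underlying constructs the items measure.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # simulated respondents

# Two hypothetical latent constructs (e.g. "ease of use" and "trust")
f1 = rng.normal(size=n)
f2 = rng.normal(size=n)
noise = lambda: rng.normal(scale=0.5, size=n)

# Six survey items: the first three load on f1, the last three on f2
items = np.column_stack([
    f1 + noise(), f1 + noise(), f1 + noise(),
    f2 + noise(), f2 + noise(), f2 + noise(),
])

# Eigenvalues of the item correlation matrix, largest first
corr = np.corrcoef(items, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]

# Kaiser criterion: retain factors with eigenvalue > 1
n_factors = int((eigvals > 1.0).sum())
print(n_factors)  # recovers the two simulated constructs
```

In practice a full analysis would follow this screening step with factor rotation and loading inspection (or a Rasch model for ordinal response scales), but the eigenvalue check shows the basic idea of detecting underlying constructs from response data.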

Notable Quotes

"Validity is the degree to which evidence and theory support the interpretations of test scores for proposed uses."

"Qualitative research is vital for establishing validity in mixed methods because it helps us understand how respondents interpret questions."

"Surveys are products too—they need to be iteratively tested and refined."

"You don’t know what you don’t know—surveys have blind spots that qualitative techniques can help reveal."

"If you have broader research goals shared across quant and qual teams, then the overlap supports answering difficult validity questions."

"Conversations with stakeholders need to focus on delivering usable information quickly, not just on the validity process itself."

"Consequences of testing include ethical considerations about how survey responses affect user experience and product decisions."

"It’s better to partner with quantitative experts if you don’t have that expertise yourself to understand internal structure analyses."

"If the survey doesn’t work well for subpopulations, focus initially on groups where it’s reliable while you investigate others."

"Improving surveys iteratively can undermine longitudinal measures, so it’s critical to balance validity and tracking over time."

