A Mixed Method Approach to Validity to Help Build Trust
Summary
Quantitative instruments are frequently sought for two reasons: 1) they can be fielded quickly to large numbers of people, and 2) when carefully sampled, their results generalize to the population of users or customers. However, because the focus is often on speed to launch, with decision-makers needing results quickly, little depth goes into their development, and their validity evidence is rarely investigated. In this session, I will share a framework that centers validity and is necessarily a mixed methods approach to research. I will also share ideas on how to scale the research over time, so that findings and insights can be delivered to stakeholders iteratively while also informing one another in a qual-quant research dance that brings more trustworthy, user-centered evidence to decision-makers. Finally, I will share ideas for a course I am developing to support qualitative researchers in becoming more mixed in their approach.
Key Insights
- Validity involves five evidence sources: content, response processes, internal structure, relations to other variables, and consequences of testing.
- Qualitative methods, especially cognitive interviews, are crucial for understanding how respondents interpret survey items, supporting validity.
- Surveys should be treated as products that need ongoing iteration and testing, not one-off tools.
- Ethical considerations extend beyond data privacy to how survey results affect user experience and product decisions.
- Mixed methods approaches leverage both qualitative insights and quantitative analyses to build a stronger validity argument.
- Breaking down survey validation work across multiple teams and 'bite-sized' efforts makes the process manageable.
- Revising surveys over time to improve validity complicates measuring change longitudinally but increases trustworthiness.
- Stakeholder buy-in improves when validity processes are communicated as phased insights offering tangible results quickly.
- Analyses such as factor analysis and Rasch modeling reveal survey internal structure and help identify item bias across subpopulations.
- It is important to revisit validity considerations continuously, especially after product or user base changes.
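To make the internal-structure point above concrete, here is a minimal, hypothetical sketch of an exploratory factor analysis on simulated survey responses. The two latent traits, the loading pattern, and all parameter values are invented for illustration; the talk does not prescribe a specific tool, and scikit-learn's `FactorAnalysis` is just one accessible option.

```python
# Hypothetical sketch: exploratory factor analysis on simulated
# Likert-style survey responses. Factor names, loadings, and sample
# sizes are assumptions for illustration, not from the talk.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents = 500

# Simulate two latent traits (e.g. "ease of use", "trust") that
# drive responses to six survey items.
latent = rng.normal(size=(n_respondents, 2))
loadings = np.array([
    [0.9, 0.0],  # items 1-3 intended to load on factor 1
    [0.8, 0.1],
    [0.7, 0.0],
    [0.0, 0.9],  # items 4-6 intended to load on factor 2
    [0.1, 0.8],
    [0.0, 0.7],
])
noise = rng.normal(scale=0.4, size=(n_respondents, 6))
responses = latent @ loadings.T + noise

# Fit a two-factor model and inspect the estimated loadings:
# rows of fa.components_.T are items, columns are factors.
fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(responses)
print(np.round(fa.components_.T, 2))
```

In practice, whether the recovered loading pattern matches the intended item-to-construct mapping is one piece of internal-structure validity evidence; comparing fits across subpopulations (or using Rasch/IRT models) is how item bias would be probed.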
Notable Quotes
"Validity is the degree to which evidence and theory support the interpretations of test scores for proposed uses."
"Qualitative research in a mixed methods setting needs to think about validity to support the bigger validity argument."
"What would happen if you do think about validity? Would it change your process or research plans?"
"Surveys are products too, so they need to be iteratively tested."
"You don’t know what you don’t know. Let’s write surveys that cover those blind spots."
"If the survey wording changes, measuring change over time becomes difficult, but improving the survey builds trust."
"Ethical considerations should go beyond typical privacy reviews and think deeply about user impact throughout the process."
"It is better to partner with someone who has quantitative expertise to interpret internal structure analyses."
"Management and stakeholders often care more about getting usable information quickly than understanding the full validity process."
"We can get quick insights in phases to keep teams engaged and slowly build a complete validity argument."