Integrating Qualitative and Quantitative Research from Discovery to Live
Summary
Over the last three years, Southampton University in the UK has undertaken a complete website redesign following an Agile process, with user research and performance analytics integral from the very start. The website serves a variety of audiences, including potential students and research collaborators. Performance analytics informed the business case and objectives, and qualitative research in Discovery uncovered the user needs that would improve the experience. During alpha, qualitative research informed the design of the early prototypes; in beta, analytics and user research integrated qual and quant into a variety of metrics around performance and the user experience. The same metrics have been carried forward and enhanced in Live to ensure continuous improvement alongside the new product roadmap. The presentation outlines this integration of qual and quant, with examples of what has been done, the metrics used, and how they inform the user experience and business objectives.
Key Insights
- The University of Southampton faced over 4 million URLs due to decentralized content, causing poor user experience and high bounce rates.
- Replacing 1,000+ web content authors with a skilled specialist team improved content quality, consistency, and sustainability.
- Integrating quantitative analytics with qualitative user research enabled a comprehensive understanding of user journeys and needs.
- Measuring over 30 distinct user journeys allowed targeted benchmarking and direct evaluation of site performance.
- Adopting user-centered design was a cultural shift from the university's previous waterfall approach.
- The beta phase attracted over 4,000 users, providing rich data to validate design choices before full launch.
- Heat maps and recordings revealed user behaviors such as "click rage," where users repeatedly clicked on non-clickable elements.
- Microsoft reaction cards helped capture user sentiment efficiently, complementing free-text feedback.
- Continuous improvement is enabled by automated data analysis using R scripts and a process to triage issues for design sprints or the backlog.
- Stakeholder involvement in observing real user sessions is crucial to overcoming resistance and building empathy for user-centered design.
Notable Quotes
"We are not collecting vanity analytics, the data is doing real work on informing design."
"85% of people were leaving after viewing only one page, showing severe bounce issues."
"The new site had a 19% dropout rate compared to 32% on the old site, showing clear improvement."
"User journeys are the bedrock for deciding which data to collect and measure."
"Some stakeholders still prefer their beliefs to data, even in academia."
"Heat maps showed users clicking on areas they thought were clickable but weren’t, resulting in click rage."
"We brought all the user journeys together with UX designers, content designers, product owners — it wasn’t just analytics and research in isolation."
"Having the qualitative and quantitative data together lets us tell a comprehensive story to stakeholders."
"We started analytics at the beginning so we could benchmark and compare old and new site performance."
"Agile can feel like a cult, but the data helps show it’s working and not just a belief system."