Summary
Over the last three years, the University of Southampton in the UK has carried out a complete website redesign following an Agile process, with user research and performance analytics integral from the very start. The website offers a variety of products aimed at prospective students and research collaborators. Performance analytics informed the business case and objectives, while qualitative research in Discovery uncovered the user needs behind the experience improvements. During Alpha, qualitative research shaped the design of the early prototypes; in Beta, analytics and user research combined qual and quant into a set of metrics covering both performance and the user experience. The same metrics have been carried forward and enhanced in Live to support continuous improvement alongside the new product roadmap. The presentation outlines this integration of qual and quant, with examples of what has been done, the metrics used, and how they inform the user experience and business objectives.
Key Insights
- The previous website had over 4 million URLs with inconsistent content quality; 85% of visitors left after viewing a single page.
- Replacing 1,000+ diverse content authors with a smaller, specialized team improved content reliability and maintainability.
- Integrating quantitative analytics (GA4, funnels) with qualitative research (heat maps, session recordings) gave a holistic view of user behavior.
- Mapping 30 detailed user journeys and linking them directly to analytics funnels was key to tracking user progress and success.
- Beta testing with 4,000+ volunteer users enabled large-scale testing and informed decisions before full launch.
- Applying a regression formula to derive System Usability Scale scores from just two questions simplified usability benchmarking.
- Heat maps revealed "click rage," where users mistakenly believed non-clickable elements were interactive, highlighting UX issues invisible in pure analytics.
- Combining research insights with analytics enabled proactive monitoring and continuous improvement rather than reactive fixes.
- Stakeholder engagement, especially involving stakeholders in observing user sessions, is crucial to overcoming resistance and building trust in data-driven design.
- The project balanced commercial pressures with academic complexity by framing success through clear KPIs focused on user needs and business goals.
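The talk does not spell out the two-question regression it uses, but a widely published mapping of this kind is the UMUX-Lite regression (Lewis, Utesch and Maher, 2013), which estimates a SUS-equivalent score from two 7-point agreement items. The sketch below assumes that formula; the coefficients 0.65 and 22.9 come from that paper, not from the talk itself.

```python
def umux_lite(item1: int, item2: int) -> float:
    """Convert two 7-point agreement items (scored 1-7) to a 0-100 UMUX-Lite score.

    Items are typically "This system's capabilities meet my requirements"
    and "This system is easy to use".
    """
    # Shift each item to 0-6, sum (max 12), and scale to 0-100.
    return ((item1 - 1) + (item2 - 1)) / 12 * 100


def sus_from_two_items(item1: int, item2: int) -> float:
    """Estimate a SUS-equivalent score from the two items.

    Assumption: uses the Lewis, Utesch & Maher (2013) regression
    SUS ~= 0.65 * UMUX-Lite + 22.9, which may differ from the
    formula the Southampton team applied.
    """
    return 0.65 * umux_lite(item1, item2) + 22.9
```

With this regression the estimate is bounded between 22.9 (both items at 1) and 87.9 (both at 7), which is one reason teams treat it as a benchmark proxy rather than a drop-in replacement for the full 10-item SUS.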
Notable Quotes
"We are not collecting vanity analytics here, this data is doing real work informing design decisions."
"Some people prefer their beliefs to data even in academia, which may surprise you."
"The website redesign is much more than a redesign; it’s a complete change in content and design strategy."
"85% of people were leaving after just looking at one page on the old site—that was a huge problem."
"Over 4,000 users volunteered to look at the beta version, giving us excellent data at scale."
"Heat maps don’t just show where people click; they show where people think something is clickable but it isn’t, which causes frustration."
"We use a magic R studio button that, with one click, processes all the survey data automatically."
"Stakeholders often have strong opinions; our job is to back decisions with solid, evidence-based data."
"The System Usability Scale is a powerful benchmarking tool, and reducing it to two questions makes it easier to collect."
"Bringing stakeholders to observe user sessions helps them see that user-centered design isn’t just a cult."
Integrating Qualitative and Quantitative Research from Discovery to Live
Mackenzie Cockram, Sara Branco Cunha, Ian Franklin
December 16, 2022