The Big Question
UX Research at Ibotta began as a small outgrowth of market research, focused exclusively on usability testing of Figma prototypes handed off by a UI-focused design team. Upcoming B2B work, plus recent investments in a D2C browser extension, presented an opportunity to move UXR at Ibotta into more generative, strategic discovery work.
The challenge was change management: how could we shift UXR away from being an on-call testing service and reposition the team as a strategic partner, while still maintaining service to, and goodwill with, Product and Design?
The Plan
Moving the UXR team toward more strategic, generative research would require mitigating the team’s bandwidth problem: a myopic focus on usability testing.
This meant multiple moving parts:
- Prioritize and schedule project commitments
- Introduce faster evaluative methodologies
- Democratize low-stakes, low-risk usability tests
Research Team Prioritization
The UXR team had fallen into a pattern of responding to research requests in an ad hoc manner: PMs with relationships with individual researchers would make a request, and the work would be done.
There was no discussion of which work should be taken on, or how to schedule or estimate it, and no oversight beyond the accountability of having said yes to individual PMs.
I introduced a prioritization schema to help us identify the most important work.
UX Risk Schema
My approach was lightweight: we would evaluate research requests against three variables, then average those scores into a single “UX Risk” score, which mapped to one of four responses.
- Project Impact: a rough, admittedly fuzzy assessment of the potential impact on the company, our consumer customers, or our business partners
- Request Uncertainty: does the request represent a well thought-out, low-risk idea… or not?
- Visibility: the company was still very founder-driven, and requests from senior leadership levels carried significant weight.
The UXR team would commit to the high-risk work, while the “day-to-day” usability testing usually represented low-to-medium-risk requests.
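The scoring mechanics were deliberately simple. Here is a minimal sketch of the logic, assuming a 1–5 scale for each variable and illustrative tier cutoffs and response labels; neither the scale nor the labels are the team’s exact rubric.

```python
# Sketch of the UX Risk scoring described above.
# The 1-5 scale, cutoffs, and response labels are illustrative assumptions.
from statistics import mean

# Hypothetical responses, from lowest to highest risk tier.
RESPONSES = [
    "Self-serve: Design runs its own unmoderated test",
    "UXR consults on a Designer-led test",
    "UXR runs a fast evaluative study",
    "UXR commits to a full research engagement",
]


def ux_risk(project_impact: int, request_uncertainty: int, visibility: int) -> str:
    """Average three 1-5 scores and map the result to one of four responses."""
    score = mean([project_impact, request_uncertainty, visibility])
    if score < 2:
        return RESPONSES[0]
    elif score < 3:
        return RESPONSES[1]
    elif score < 4:
        return RESPONSES[2]
    return RESPONSES[3]


# Example: a highly visible, loosely defined request with broad impact.
print(ux_risk(project_impact=4, request_uncertainty=5, visibility=5))
```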
Democratizing User Testing
Democratization of UX Research is a popular buzzword, but it comes with risks: sloppy research; insufficient – or outright incorrect – insights; and the belief that asking non-researchers to conduct these activities is a trivial ask.
The Solution?
- Allocate additional budget towards UserTesting.com
- Limit the scope of democratization: nothing big, nothing risky
- Enforce UXR governance
- Provide training, documentation, and practice
Initial Workshops: An Introduction to UXR
The initial series of workshops was an introduction to the history of UXR and usability testing at Ibotta – along with why and how we were evolving that work.
Testing Request Forms & Office Hours
As we trained the Product and Design orgs on usability testing methods and best practices, I introduced a common request form in Airtable for our colleagues to use.
This not only familiarized Designers with the questions needed to build a full usability test plan, but also expedited turnaround time for Researchers.
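To make that concrete, here is a sketch of the kind of information the form gathered, expressed as a simple data structure; the field names and example values are illustrative, not the actual Airtable schema.

```python
# Illustrative shape of a usability testing request; not the real Airtable fields.
from dataclasses import dataclass, field


@dataclass
class UsabilityTestRequest:
    requester: str            # Designer or PM making the request
    research_question: str    # the decision the test should inform
    prototype_link: str       # Figma prototype or live flow under test
    target_audience: str      # e.g. "current savers", "new installs"
    key_tasks: list[str] = field(default_factory=list)  # tasks participants attempt
    success_criteria: str = ""  # how we'll judge the design's performance
    deadline: str = ""          # when findings are needed


# Hypothetical example: testing an offer-activation flow in the extension.
request = UsabilityTestRequest(
    requester="designer@example.com",
    research_question="Can first-time users activate an offer from the extension?",
    prototype_link="https://www.figma.com/file/...",
    target_audience="new installs",
    key_tasks=["Install the extension", "Activate an offer", "Check savings"],
)
```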
The request form was paired with new bi-weekly testing office hours hosted by the UXR team, giving researchers a venue to keep supporting testing requests while protecting their calendars for more impactful research work.
Thematic Analysis Training
With designers and PMs now introduced to the pivot in how usability testing would be managed, I scheduled a series of workshops to train the team on thematic analysis.
Identify User Test Partner Designers – and Train Them
With the increased budget supporting additional seats in UserTesting.com’s platform, we could begin to onboard the designated Usability Testing SMEs from Design. This process involved the following activities and artifacts:
- Usability Testing training (defining types of tests, explaining research plans, breaking down the component pieces of an unmoderated usability test)
- Creating templates and reorganizing workspaces within UserTesting.com itself
- Creating testing / participant trackers within Airtable
- Writing best practices and how-to documentation
Outcomes & Key Findings
- As designers became more comfortable and experienced with heuristic reviews and usability testing, the UXR team regained bandwidth, which could be spent on more strategic discovery work.
- The training wheels of “you watch me, we do, I watch you” helped Designers – as did old-school how-to documentation.
- This effort also upskilled the Design and Research teams with new tools and activities, particularly the more junior, UI-focused designers whose experience was generally limited to Sketch and/or Figma mockups.