Overview
Solution and Impact
Users did not trust the information in the original onboarding, so we redesigned the flow to be less text-heavy and technical. The redesign increased trust in the app by 17% in later usability testing.
Problem
The PathCheck Foundation had developed a digital contact tracing app and was working toward securing contracts with state and local health departments. The app had been built without UX research and with minimal UX design work.
To win contracts, they needed to raise their design quality to match the competition, so they decided to start by testing the initial experience: the onboarding.
Methods
Remote Moderated Usability Testing, Onboarding Redesign
Background
I joined the PathCheck Foundation, a nonprofit open-source digital pandemic response organization, in July 2020 as the first member of the Product team. We were a small, cross-functional, distributed team consisting of a UX Researcher (me), 2 UX Designers, a Product Manager, and a UX Research Ops intern. PathCheck was working toward securing contracts with jurisdictions for its digital contact tracing app and catching up to the competition.
Previously, UX design had been a supporting function to Engineering, and UX research was almost nonexistent. The organization had conducted a single in-house foundational UX research study months earlier, in April, and an outside researcher had run an independent usability study of the app that went almost unnoticed. As a result, app design decisions were driven by engineering, assumptions, and best practices rather than by data.
Objective
Our mission was to drive adoption by improving the onboarding experience in a 2-week Agile sprint. We hypothesized that presenting the end-user license agreement (EULA) before the value proposition was reducing willingness to use the app. We also knew, from a single prior study by an outside researcher, that many users did not understand the app, which reduced trust.
Lower adoption would make the pandemic response in partner jurisdictions less effective, so a high adoption rate was vital for our brand and for our customers, the departments of health. By improving the onboarding, and thus adoption, we could show jurisdictions that we could build a competitive app and help them slow the spread of COVID.
Team
Product Manager, 2 UX Designers, UX Researcher (me), UX Research Ops Intern
My Role
Design and lead usability research, including moderated test sessions, data analysis, and recommendations
Work
We conducted a remote, moderated usability study of prototyped designs in Figma over Google Meet. We wanted feedback on something as close to the actual experience in context as possible, so that we could quickly validate our design hypotheses and iterate.
We planned the study, screened participants, and recruited a diverse panel of 4 potential users through Respondent.io. Participants explored the app via the onboarding, including both the app overview and app permissions, and were asked to explain how the technology worked.
FIGURE 2. SCREENSHOTS OF ONBOARDING BEFORE REDESIGN - TEXT-HEAVY.
We found that users were less concerned with EULA placement than with understanding how the technology protects privacy, and with other trust signals (e.g., brand, app store rating). Users struggled to understand how the technology worked, which undermined trust and decreased their willingness to use the app.
Barriers and Constraints
We had only the 2 weeks of the sprint for the end-to-end research process, so that design work could be done in the next sprint.
As a nonprofit in a startup phase, we had a limited budget for participant incentives, enough for only 4 participants.
We did not have any research operations, so we chose tools, set up templates and workflows, and built out the operations we needed while doing the work.
Participants had difficulty loading the prototype on their devices and sharing their screens, which was our original protocol, so we had to pivot.
Impact
Based on our findings, we redesigned the onboarding to be more visual and friendly, and to better explain the technology.
The redesigned onboarding increased trust in the app by 17% in subsequent usability studies.
The app with the new onboarding is now live in Guam, Minnesota, and Hawaii, and on track to be launched in 1 more state and 3 countries in the coming months.
FIGURE 3. SCREENSHOTS OF REDESIGNED ONBOARDING
Reflection
I led the team in designing, running, and analyzing a lightweight qualitative usability study in 2 weeks. The study revealed key insights that improved the onboarding design and have made it into the publicly released app.
I originally planned to have participants share their screens while testing the prototype on their own computers, but the first participant struggled and ultimately failed to do so. I quickly pivoted to sharing the prototype from my own screen and having participants tell me where they wanted to click, scroll, and so on, acting as their “human mouse”. By doing so, I was able to collect usable, valuable data from all participants despite the technical challenges.
If I were to run the study again, I would plan the analysis and synthesis further in advance. Specifically, I would summarize key issues and findings at the end of each session so that designers could begin solving problems as soon as they were identified. I would choose a platform, like Zoom, that allows participants to remotely control my screen. I would also consider including more participants, time and budget permitting.