Picture of a woman conducting a virtual usability testing session with a user.

Process

Moving from impromptu to agile

The first step in creating a viable usability testing framework was identifying the right people and communicating to them why usability testing was necessary. As our tooling became the company standard, it became apparent that our habit of pushing out features without revisiting them before they hit production was not going to be viable in the long term. I communicated to my manager and higher-ups that we needed to implement some form of usability testing for our features moving forward, both to ensure that users could walk through workflows without pain points and to confirm that we were building these features in the most sensible way. I also pointed out that the number of messages in our Slack community channels would decrease over time, since confusing workflows would be identified and corrected during our usability sessions, preventing those complaints from arising in the first place.

Igniting the conversations

Next, I focused on putting a usability testing framework in place and discussing how the broader team would execute it for our features. Since our team worked on UI components and experiences, we concluded that we needed to have users walk through the features, ask any necessary questions about the workflows, and probe for their thoughts when they ran into roadblocks. I set up discussions within our team to form a general framework for how we would approach the testing sessions, what kinds of questions we would ask, and what type of feedback we wanted to hear from users.

Executing user sessions

Once this process was finalized, the team put it to use the next time we were ready to release a new feature. I asked for volunteers in our community channels and set up one-on-one (or sometimes two-on-one, with a notetaker attending) meetings that allowed us to walk through our new features with users and ask what did or didn't make sense as they performed certain actions. During these sessions, I encouraged users to think out loud as they worked and asked follow-up questions whenever we needed more information about something they said. The result of creating this process and running these usability sessions was an ongoing dialogue between us and our users, ensuring that the feedback we captured could be properly translated into feature improvements.

Generalizing the process

Once all of the user sessions were finished, I compiled the notes from every session into a single document where everything was laid out. We then held an internal team meeting to go through the notes from each user, group related pieces of feedback together, and prioritize any gaps between user expectation and experience, as well as any confusing workflows. From there, I prioritized the issues that were blocking our release to General Availability and worked on fixes for those blockers. Once everything was completed, we released the feature to the general public, keeping an eye on the community channels where feedback might appear.

Impact

The impact of the new changes was felt from the get-go. The features that went through the usability testing process received more acclaim from our external users as well as from our wider team internally.

There were fewer, if any, complaints about the newer features, and anything reported to us as a problem was immediately prioritized rather than tossed into the backlog. On top of this, after the first few rounds of sessions, it became much easier to communicate the importance of usability testing to others.

“For the developers that helped develop the new onboarding feature, kudos. It has been a pleasant change.”

- Developer after changes were implemented

Reflections

Constraints

The biggest issue throughout this process was the scale at which these usability sessions could happen. An hour is a long time for anybody, and over time it became harder to gather users for sessions, especially since we offered no compensation for their time. It was therefore difficult to recruit a wide variety of users to walk through each feature, and we had to rely on a fairly homogeneous pool of people. Fortunately, since our product catered only to developers at Adobe, this was not a major issue. Additionally, because these sessions ran toward the end of feature development, it was much harder to go back and revise fundamental user workflows; those issues could have been identified and addressed earlier if we had scoped each session to a smaller section of the feature.

Conclusion

It's safe to say that the usability testing sessions did a great deal of good for our team and beyond: other teams, especially those that didn't work on anything UI-related, started devising their own usability testing processes to use with their users. I have since collaborated with others across the broader team on a generic usability testing framework that can be applied by any team that needs usability testing in their workflow.