Education is one of the most important industries for usability: more educational tools are developed every year, and students of all abilities are required to use them. Because of this, W. W. Norton has been updating its ebook reader to be more accessible. The design team had also been adding new features that let professors assign interactive content, such as multiple-choice questions, to their students. I got to work with the team to create designs for the multiple-choice frames in the Assignable Ebook tool.
Requirements
We were given a list of user stories with acceptance criteria defined by our stakeholders and product owners.
From previous research, we knew we would need to design for users with screen magnifiers, so we added more stories related to accessibility. With guidance from our UX copywriter and accessibility consultant, we made sure the product's language and accessibility were prioritized from the very beginning of the design process.
Competitor Analysis
To see how other learning tools handle multiple-choice questions, I conducted a competitive analysis of sites like Canvas, Duolingo, Pluralsight, and Khan Academy. I noted features that would meet the criteria for our user stories, such as how answer options were labeled, how question feedback was given, and accessibility features like issue reporting. Of these tools, Khan Academy came closest to covering the features we needed.
Initial Designs
Some of the features we wanted for the multiple-choice frames were labeled answer choices, a displayed due date, and a report issue button. The question frame also needed to display some type of feedback inside it.
I made two options. The first displayed the feedback at the bottom of the frame. The second used a 2 x 2 layout where feedback was displayed next to the user's answer.
My two teammates and I shared our designs and gave each other feedback:
- The second concept wouldn't work because we had to account for images as answer choices.
- Questions would only need either answer-specific feedback or general question feedback, never both.
- It would be best to avoid icon-only feedback, like I had in the first concept, and to pair icons with text labels instead.
- The inline feedback would cause accessibility issues because of where it appears in the focus order for magnifier users. It would also take up screen real estate, which was already limited.
With these details in mind, we iterated on our designs one more time.
Iteration
For my new design, I kept a vertical layout. After the student submitted their answer, the correct response would be labeled for clarity, and the feedback would be displayed only at the bottom of the frame.
Final Design
I worked with the other designers to create the final design. We decided to go with a feedback modal instead of showing the feedback in one spot of the frame. A modal is more accessible for screen readers because it presents a new screen for them to detect. It also displays the feedback more clearly, since it takes up the entire frame, and it prevents errors by blocking interaction with other elements.
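To make the accessibility reasoning concrete, here is a minimal TypeScript/DOM sketch of the general dialog pattern a feedback modal like this relies on. It is an illustration only, not Norton's implementation, and the element names and labels are hypothetical: the dialog is announced as its own context, focus moves into it when it opens, and focus returns to the triggering control when it closes.

```ts
// Hypothetical sketch of an accessible feedback modal (not Norton's code).
// The dialog is exposed to assistive tech as its own "screen", focus moves
// into it on open, and returns to the triggering element on close.
function openFeedbackModal(feedbackText: string): void {
  const trigger = document.activeElement as HTMLElement | null;

  const modal = document.createElement("div");
  modal.setAttribute("role", "dialog");      // announced as a dialog by screen readers
  modal.setAttribute("aria-modal", "true");  // content behind it is treated as inert
  modal.setAttribute("aria-label", "Answer feedback");
  modal.tabIndex = -1;                       // make the container programmatically focusable

  const message = document.createElement("p");
  message.textContent = feedbackText;

  const closeButton = document.createElement("button");
  closeButton.textContent = "Close";
  closeButton.addEventListener("click", () => {
    modal.remove();
    trigger?.focus();                        // return focus so keyboard users keep their place
  });

  modal.append(message, closeButton);
  document.body.append(modal);
  modal.focus();                             // move focus into the new "screen"
}
```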
This is what the frame and feedback would look like integrated into the ebook. I presented the final designs to our developers and product owners, who signed off on their feasibility and on how they fulfilled the user stories.
User Research
With time constraints on getting the rest of the Assignable Ebook's features ready for the MVP, we knew we wanted to test all of our current designs, not just the multiple-choice questions, with 6 students.
We gave ourselves about two weeks for recruiting and one week to conduct all six usability tests.
Recruiting Survey
I wrote a recruiting survey that was shown to all student accounts the first time they loaded their Norton Ebook. The survey ran for three days and collected a total of 5,164 responses.
To find appropriate participants for the study, I asked screening questions such as whether they had assignments tied to their current ebook. I also included an optional open-ended question to gauge how descriptive they would be in a usability test.
Usability Tests
I started by drafting the usability testing script and thinking through the tasks users should complete. Among the things we wanted to test was the language within the interactive frame. We also wanted to test the completion state of the multiple-choice frame to see what would help students study when returning to a completed question.
We conducted 6 one-hour sessions. I analyzed the recordings afterward to create a heatmap of the tasks users passed and failed.
Results
Before conducting the usability tests, I had some assumptions based on my experience as a student using other LMS products. I myself found some labels confusing, like the LMS badge indicator, but I thought users would be able to understand the instructions and feedback for the multiple-choice frames.
- All 6 users passed the multiple-choice question tasks, though they struggled a bit navigating to them in the ebook.
- The assignment overview needed clearer icons.
- The language for the confirmation modal and the LMS badge specification was confusing.
- The elements in the assignment overview needed reorganizing so it would be easier to tell when all questions and activities were complete.
I was surprised that users wanted different icons to help them understand the navigation. This was because the icons had no labels when the side navigation was closed, which goes back to what I learned earlier about always pairing icons with labels.
Overall, the usability tests gave us a lot of good feedback and validation. All 6 users passed about 95% of the tasks, but there were still improvements we could make.
Conclusion
The three months were very busy for the design team. I got to practice my ideation, wireframing, and prototyping skills by creating new design concepts, and I learned how to write a UI specification for the first time.
Writing the usability testing script was an extensive task, but by looking back at all our requirements and working on it together, we were able to finish it in a short amount of time. Our study participants were also very thorough during the usability tests.
Working to improve educational products for students was very fulfilling. As a recent student, I think I was able to bring a different perspective to the team while also relating to the users. I also learned a lot from each team member and their specializations, especially with accessibility.