Check Progress feature called a "lifesaver" by instructors
And earns an 83% CSAT rating
This video illustrates the student experience of checking progress during lab work.
Of the various features that made up TestOut’s LabSim learning platform, the hardware simulation labs were the company’s flagship offering and leading driver of sales. Instructors loved them for their instructional value, but students hated the pass/fail nature of the labs.
Through user research and an iterative design process, we arrived at a solution that allowed students to check their progress before submitting their work for a score. The “return to lab” feature earned an 83% CSAT rating a month after release. Instructors called it a “game changer” and a “lifesaver.” The feature resolved a significant customer pain point and strengthened our primary revenue driver, supporting the company’s growth.
Shortly after I started working at TestOut, company leadership established the Course Improvement Council (CIC) as a governing body overseeing course-related features. As the sole UX/product designer at the company, I participated in CIC meetings, helping identify and advance course improvement priorities through workshop facilitation, UX research, and design.
As the CIC got underway, several ideas surfaced as potential features to pursue. To help us make informed decisions, I proposed a process for validating feature ideas through customer research. The council approved the feature research process, and the first concept we tested was the “return to lab” feature.
Customer research validated and helped clarify the pain point: students hated the pass/fail nature of TestOut’s hardware simulation labs.
For context, hardware simulation labs consisted of a series of tasks. If a student missed even a single task, they’d receive a failing grade. But students didn’t have a way to verify task completion without submitting their work for a score. At that point, if they learned they’d failed, they’d have to redo the lab for a chance at a passing grade.
Anecdotal evidence of student complaints first brought this issue to the attention of the CIC, but, per our feature research process, the problem required further validation before we would address it through changes to the user experience.
TestOut’s student protection policy prohibited employees from contacting students directly. So, to validate the pain point, I surveyed instructors, asking them to represent their students’ views. Over 50% of the instructors responded, reinforcing our anecdotal evidence that students resented the pass/fail nature of the labs. Respondents rated this pain point at an average of 3.42 out of 5, confirming it as a significant source of student frustration.
Students resented having to start a lab over after missing even a single task and receiving a failing grade. Survey respondents rated this pain point at an average of 3.42 out of 5.
Through an iterative design process, I landed on a solution that allowed students to check their progress before submitting their work for a score, addressing the frustration students had previously experienced with the hardware simulation labs.
1st Round
My first solution exposed an incorrect assumption about the problem space: students weren’t inadvertently skipping tasks.
Collaborating with Engineering to confirm it was feasible to implement, I designed an MVP solution: a checklist that allowed students to mark tasks done as they went through the lab. However, when I showed this to instructors, they told me the solution missed the mark. Students weren’t overlooking tasks in the list of instructions. In most cases, they thought they had completed all of the tasks but had done some of them incorrectly.
My first solution offered students a checklist so they wouldn’t overlook tasks. I learned that it didn’t meet students’ needs.
2nd Round
With a better understanding of the problem, I designed a solution that allowed students to check their progress while still working in the lab.
I created a modified version of TestOut’s score summary dialog, which students were used to seeing after submitting their work for a score. Whereas the score dialog included explanations of how to complete each task in the lab, the “return to lab” version, which students could open at any time during the lab, omitted the explanations, showing only which tasks had been completed and which hadn’t.
In contrast to my first attempt, this solution addressed the root of the problem. Students could verify their progress at any point during their lab work and return to the lab as often as they wanted before submitting their work for a score. Like my first design, this solution was feasible for Engineering to implement quickly, since it reused existing architecture.
The final version addressed the pain point by letting students check their progress during the lab and keep working until they were satisfied.
The CIC research process validated the pain students were experiencing in TestOut’s hardware simulation labs and, through iterative design, led to a solution that resolved their frustrations. The feature earned an 83% CSAT rating and strengthened the company’s leading driver of sales.
A month after the “return to lab” feature was released, we surveyed instructors: 83% reported a positive reaction from students, 17% reported a neutral reaction, and none reported a negative reaction. One instructor said, “Being able to see how far they are in the process, without having to submit and start over if it’s not as complete as they believe, is a LIFESAVER.”
The ability for students to check their progress during a lab received a very positive CSAT response.
This experience both validated the research process I created and delivered a feature that improved the user experience while strengthening a key revenue generator for TestOut.
When students are satisfied with their work, they can submit it for a final score.


