First-time use research: onboarding students to BrainPOP Science

[Image: Student with brown hair on a school laptop using BrainPOP Science for the first time.]

Project overview

  • My role: Lead UX Researcher

  • Team: Science, UXR, Design

  • Project Context: Our team developed a new product, BrainPOP Science (BPS), designed for middle school students and teachers. Our business goal was to increase usage frequency. Before a broader launch, we needed to understand the first-time use (FTU) experience to identify and address potential friction points.

  • Methodology: Fieldwork / Classroom Observation

  • Participants: One teacher and her class of middle school students using BPS for the very first time.

[Image: BrainPOP Science homepage]

Research question

The success of a new product depends heavily on its first impression. If the onboarding process is difficult or frustrating, users are unlikely to return. The primary research question was: How do non-BrainPOP students use BrainPOP Science for the first time?

The goal was to observe students and a teacher in a live classroom setting to pinpoint pain points, identify moments of delight, and find opportunities to improve the initial experience. I focused on two key user actions:

  • Account setup and first assignment creation.

  • Student navigation and interaction with a new scientific investigation.

My role and process

As the lead UX Researcher on this project, I was responsible for the entire research lifecycle.

  1. Planning: I partnered with our Science and Design teams to design a fieldwork study that would give us an authentic look into the classroom. We set clear research questions and established a plan for observing the teacher-led and individual student experiences.

  2. Execution: I conducted a live, in-person observation of a classroom during their first BPS session. I documented the teacher’s instructions, student interactions, and the time spent on key tasks. This provided rich, qualitative data that our previous analytics couldn't capture.

  3. Analysis: Using tools like Dovetail and Condens, I synthesized the raw observation footage and transcripts. I identified key themes, isolated pain points, and translated my findings into four actionable takeaways that would directly inform the product roadmap.

[Image: Science teacher sits at her desk looking at her laptop while students work on BrainPOP Science for the first time.]

Key findings

My research uncovered four critical takeaways about the student and teacher first-time use experience.

  1. Onboarding was simple but slow. It took the class 18 minutes to set up accounts and receive their first assignment. While the teacher successfully created the assignment in just 4 minutes, students spent a significant amount of time setting up accounts and waiting for their classmates.

  2. Students were highly engaged by the simulation. Students used words like "fun" and "relaxing" to describe their experience with the interactive simulation. However, this high engagement caused some students to lose track of time and stall their progress through the investigation, which created friction for the teacher.

  3. Vocabulary alignment was a key point of delight for the teacher. The teacher was pleased to find that the vocabulary used in one of our videos was a perfect match for a recent lesson in her class. This demonstrated a strong opportunity to integrate our product with existing classroom curricula.

  4. Teachers value high-quality student observations. The teacher emphasized the importance of making "good observations" nine separate times, using phrases like "helpful notes," "facts," and "evidence." She needed students to understand how their observations would be used later to answer more complex scientific questions.

[Image: Student with brown hair on a school laptop using BrainPOP Science for the first time.]

🎯 Recommendations and impact

Based on these findings, I provided the team with actionable recommendations that would improve the initial user experience and support our goal of increasing usage frequency.

  • Recommendation 1 (Onboarding): To reduce downtime and manage classroom chatter, I recommended creating a more engaging "congratulations" page. This could include mini-games, fun science facts, or puzzles for students to play with while the teacher helps their classmates catch up.

  • Recommendation 2 (Engagement): To help students balance exploration with progression, I recommended including a light, subtle in-product message that alerts them to the time they've spent on a single area of the simulation. This would gently guide them toward the next step without interrupting their engagement.

  • Recommendation 3 (Vocabulary): To empower students to answer questions more confidently, I recommended including key vocabulary directly within the simulations. This would reinforce learning and reduce a point of friction for both the teacher and students.

  • Recommendation 4 (Observations): To better support the teacher’s instructional goals, I recommended including small reminders or video tips early in the investigation. These nudges would explain how observations function as "evidence" and how they will be used later in the scientific process.

These recommendations helped the team prioritize key improvements to reduce friction in onboarding and better align the product with real-world classroom needs. My research provided a clear path toward ensuring our product's first impression was a strong, positive, and educational one.
