Disagreeing on the timeline
Content was being designed quickly by several people, and we needed a shared understanding of the product’s identity to stay consistent.
Some teammates felt we didn’t have time to pause for a voice and tone workshop. I made the case that the workshop would help us work more efficiently and confidently, accelerating content creation while keeping the user experience cohesive.
My thought process:
With so many different products, users were often confused about where they were. Adding simple product differentiators helped them navigate more easily. In this copy, I included the grade level and the value proposition of the new product.
My thought process:
A product design partner and I embedded “guided interactions” into the product where research showed users needed them most, reducing cognitive load.
We held design critiques with other designers and brought our designs to the engineering and product teams for three rounds of review.
My thought process:
I advocated for this toggle switch to give users more control over their experience. AI grading was new and not yet fully trusted. Not everyone agreed, but we leaned on the research and listened to what users wanted: autonomy. “Learn More” links to the help article I wrote to help users better understand the AI tool.
My thought process:
We wanted users to know that their action took effect. I wrote a success message that lets them know a change has been made and helps them better understand the AI tool.
Content design process:
First, I met with the product team (researchers, designers, product manager, engineers, and marketing) to review UX research sessions and identify the places where users needed support.
We agreed the copy should be short and simple, not distracting. Users should feel comfortable and supported, not overwhelmed.
Next, I created three copy options and brought them to a design critique, then led pair writing sessions to discuss and iterate.
Then, I presented copy designs to the product team for questions and feedback.
Finally, I incorporated that feedback into the final copy designs.
My thought process:
Users want to feel like partners in this process. This report box gives them the opportunity to help shape the AI tool and flag any hallucinations or areas for improvement.
The language is conversational and simple.
My thought process:
I worked closely with the product manager and lead engineer to develop this help article. I wanted to break down the AI tool and LLMs in simple terms so users could understand them without feeling lost.
Zero-to-one AI projects: AI-assisted grading and onboarding
BrainPOP Assisted Grading for CER, powered by AI. Highlights from the AI project Sarah Mondestin worked on with the BrainPOP Science team.
Part 1: The situation
→ Business goal: Increase usage of the new Science product and the number of activities teachers assign to students.
At BrainPOP, I worked as a senior content designer and user researcher with the Science product team, conducting extensive user research with teachers and students in the classroom. While teachers genuinely loved our new science investigations and their impact on student learning, a critical pain point emerged: grading was tedious and time-consuming. My direct observations in classrooms confirmed two major issues:
The overall product onboarding for the science platform was confusing.
The grading process was a burden.
We recognized an opportunity to leverage AI to alleviate this grading pain and help meet our business goal of increasing the number of activities assigned to students by teachers. However, AI-assisted tools were relatively new concepts for many teachers at the time. In user research, teachers expressed apprehension and a need for significant guidance and reassurance to trust AI tools.
I collaborated in an Agile environment with learning designers, product designers, user researchers, engineers, and product and project managers.
My primary tasks were to:
Create voice and tone guidelines as a “North Star” for content.
Simplify the overall onboarding experience for new teachers and users.
Design free trial content to encourage teachers to learn about and use the new product and tools.
Design the content support for a new AI-powered grading tool that uses a large language model (LLM) to generate grading feedback comments and scores, making grading easier and faster for teachers. This meant ensuring teachers felt comfortable and in control when using AI to assist with their grading.
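For illustration only, here is a minimal Python sketch of the kind of structured suggestion such a tool could hand back for a teacher to review. The field names, 0-4 scale, and teacher-facing copy are assumptions for this example, not BrainPOP’s actual schema or implementation.

from dataclasses import dataclass

@dataclass
class GradingSuggestion:
    """Illustrative shape of one AI-generated grading suggestion (assumed, not the real schema)."""
    criterion: str          # rubric criterion being scored, e.g. "evidence"
    score: int              # suggested score on an assumed 0-4 scale
    feedback_comment: str   # draft comment the teacher can edit before sharing

def format_for_teacher(suggestion: GradingSuggestion) -> str:
    """Frame the AI output as a suggestion, keeping the teacher in control."""
    return (
        f"Suggested score for {suggestion.criterion}: {suggestion.score}/4\n"
        f"Suggested feedback: {suggestion.feedback_comment}\n"
        "Review or edit this before sharing it with your student."
    )

Framing every output as an editable suggestion mirrors the autonomy teachers asked for in research.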
Part 2: The task
Instead of simply focusing on a new AI tool, I cataloged every piece of data and knowledge required (in-product) to achieve the business goal of getting teachers to use the product more and assign more activities. This included:
Ease of finding the product
Onboarding for the product and new features
Detailed competitor research
How the product aligned with schools’ science standards
How the product fit into a teacher’s existing lessons and grading
Part 3: The action
What I did
My primary role on the AI-powered grading tool was to be the voice of the user and the advocate for content clarity. The product was a zero-to-one initiative using an LLM to provide feedback to teachers and students.
My main challenge was to make sure users understood how it worked and felt they could trust its output. I developed a content strategy that included a voice and tone chart, in-product copy guides, help articles, and a rubric for prompt engineering.
What I did
Developed a voice and tone chart to guide content. To create content-first designs for the new AI features, I led voice and tone workshops with team members and stakeholders so we aligned on our product personality.
What I did
Created all the in-product copy that explained the tool's purpose and guided educators through the workflow, including microcopy for error states and AI-assisted grading toggles. This created a type of “guided interaction” throughout the product experience, rather than a front-loaded onboarding that carried too heavy a cognitive load.
What I did
Prompt engineering: A key part of this project was collaborating with learning designers to translate our pedagogical standards into the grading rubrics we fed to the LLM, ensuring its feedback was consistent and accurate.
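As a rough illustration of that rubric-to-prompt work, the Python sketch below shows how criteria like these could be assembled into grading instructions for an LLM. The criteria wording, function name, and 0-4 scale are hypothetical, not the team’s actual rubrics or prompts.

# Hypothetical rubric criteria, paraphrased for illustration; the real rubrics
# came from learning designers and our pedagogical standards.
RUBRIC = {
    "claim": "States a clear, accurate claim that answers the investigation question.",
    "evidence": "Cites specific, relevant data from the investigation to support the claim.",
    "reasoning": "Explains how the evidence supports the claim using scientific principles.",
}

def build_grading_prompt(student_response: str, rubric: dict[str, str]) -> str:
    """Assemble rubric criteria and grading instructions into one prompt (simplified sketch)."""
    criteria = "\n".join(f"- {name}: {text}" for name, text in rubric.items())
    return (
        "You are assisting a science teacher with grading.\n"
        "Score the student's response from 0 to 4 for each criterion below, and "
        "write one short, encouraging feedback comment per criterion.\n\n"
        f"Rubric criteria:\n{criteria}\n\n"
        f"Student response:\n{student_response}\n"
    )

Keeping the rubric language and the feedback instructions in a single prompt is one way to keep the LLM's output consistent with both the pedagogy and the voice and tone guidelines.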
What I did
Developed a detailed help article that demystified the AI's functionality. Users trusted us, but not the new tool yet, so I worked with engineers and learning designers to explain AI-assisted grading and LLMs in plain terms users could easily understand.
What I did
Included product differentiators and simplified the onboarding experience for users. We were preparing users to do something brand new and interact with tools they weren’t familiar with. Smooth and simple onboarding was important.
Part 4: The result
These UX content design efforts directly addressed the problems identified in our research:
Increased teacher comfort and adoption of AI: Teachers felt significantly more comfortable using the AI tool to assist with grading, leading to its successful adoption. It genuinely made their grading process easier and more efficient, reducing a major pain point.
Reduced cognitive load and user frustration: The simplified onboarding and clear, concise language throughout the product reduced cognitive load and minimized user frustration, making the entire BrainPOP Science platform easier and more enjoyable to use.
Decreased support inquiries: As a direct result of clearer onboarding and comprehensive, plain-language explanations for the AI tool, we observed a noticeable reduction in support inquiries related to platform usage and AI feature understanding, saving the company valuable time and resources.
Increased number of activities assigned to students: Teachers felt more comfortable using the AI-Assisted Grading tool, and the onboarding was more intuitive. This led to more activities being assigned to students and an increase in product usage.
UX content design strategically bridged the gap between complex technology (AI/LLMs) and real-world user needs (teachers’ grading), leading to greater efficiency, ease of use, and overall user satisfaction.