
AI-Assisted Grading

Usability Testing

@BrainPOP

Project Objective

Gauge teachers’ first impressions of an AI-assisted grading product, including attitudes about and understanding of the offering.

Methodology

Evaluative - Qualitative

Usability Testing

A/B Testing

User Interviews


Study Overview

After the development of a proprietary AI “assisted grader” to help teachers score submissions of one of BrainPOP Science’s core assignment types, we needed to test the overall design as well as compare two prototypes under consideration. The primary difference between the prototypes was the degree to which the AI’s rationale would be shown to teachers, though we were also interested in assessing teachers’ attitudes about AI use, their understanding of what the tool was doing, their comprehension of how to use it, and their thoughts on how well it would help them.


I worked primarily with the UX designer who created the prototypes, and we drafted a set of research questions addressing the above concerns, along with a script for the usability testing, interviews, and A/B testing. Eight teachers participated in the study, and the designer and I collaborated to synthesize, interpret, and report the insights.

Findings

  • Teachers were confident in their use of the tool, easily understood the Assisted Grader content, and believed the tool would make grading easier and faster. One teacher even said they would create more assignments if they had access to this tool.
  • One of the prototypes, the one with less AI explanation, was slightly preferred over the other, though many said they’d prefer to have the explanation at first with the option to turn it off.
  • Unexpectedly, teachers assumed students would see the AI rationale, and many teachers even wanted this.
  • The biggest problem: a few teachers misunderstood a re-grading mechanic, alerting us to a crucial need to clarify the functionality of the tool.

Insight excerpt from final report

Overview from final report


Unit Restructure

Usability Testing

@BrainPOP

Project Objective

Determine whether the UI changes made to the organization of the BrainPOP Science product make it easy for teachers to find what they need.

Methodology

Evaluative - Qualitative

Usability Tests

A/B Tests

Study Overview

After extensive teacher feedback on the initial designs of the BrainPOP Science product and competitive research comparing it against similar products, it was clear users wanted a different method of content organization. Primarily, our units were grouped into categories that were needlessly large and difficult to navigate between.


After others on my team completed initial concept testing to determine the primary direction to be taken, I embarked on this project with one of the UX designers on my team to further refine the changes. Here, we tested three different prototypes she created that slightly varied in their approaches to this content restructure, including changes to the type of information presented at the top of the page and some of the thumbnail choices for certain resources. We recruited eight teachers, conducted usability tests and A/B tests with the three prototypes, and worked together to synthesize the findings.


Findings

  • Teachers appreciated the overall restructure, mirroring the results of the initial concept testing.
  • Some of the information under consideration for inclusion at the top of the homepage was not seen as useful by teachers.
  • There were no breadcrumbs to get back to the unit selection page after a certain number of clicks.
  • Some of the thumbnails for resources were seen as too “adlike” by teachers, who preferred simpler and more consistent icons.
  • A lack of contrast between clickable items and the background was noted by some teachers.

Alignment to Core

Needfinding Interviews

@BrainPOP

Project Objective

Position BrainPOP Science as a valuable standalone product by determining how teachers are aligning supplemental science products with their core science curricula, where the greatest gaps in these core curricula are, and what factors go into decisions to purchase supplemental science products.

Methodology

Generative - Qualitative

User Interviews

  • Jobs-to-be-Done
  • Retrospective behavior analysis
  • Think aloud

Research questions from the project proposal

Study Overview

BrainPOP Science, as a science supplemental product, was created to assist teachers in science instruction alongside the “core curricula” products they are using, often as a mandate from their school or district. Critical to positioning BrainPOP Science as a point solution was determining where teachers’ core curricula were falling short, what jobs teachers were trying to get done outside of their core curricula, and what factors were typically being considered by administrators making purchasing decisions about such products. I developed two in-depth interview scripts for teachers and administrators and recruited eight people in each role to participate.


For teachers, I employed a Jobs-to-be-Done approach throughout, tailoring my questions to focus on the goals teachers were trying to accomplish in their particular circumstances when deciding to use a science supplemental product. I also engaged them in a think aloud, asking them to pull up one of their science supplemental products and walk me through their process of selecting resources to use in their classroom.


For administrators, in addition to changing the nature of some of the general interview questions, I employed a retrospective behavioral analysis, in which I asked them to walk me through all of the steps they took during the purchase of their last science supplement, starting from the very first moment they realized they needed a product like it.


I synthesized the data with thematic coding and determined the most common core product issues, supplement use cases, and purchase decisions to share out with our team.

Excerpt from the teacher interview guide

Findings

  • I revealed two clear areas in which teachers were frustrated with their core curricula products and where science supplements such as BrainPOP Science could shine.
  • I uncovered the three primary considerations administrators make when purchasing science supplemental products.
  • I also determined ten jobs that teachers were using supplements to accomplish; while five of them were already known from previous research by our team, the others were less common findings, and a couple even came as surprises.
  • Two emergent findings also revealed potential areas for future study.

Insight excerpt from final report

Opportunities and recommendations section from final report


Teacher Usage Data

Exploratory Quantitative Analysis

@BrainPOP

Project Objective

Investigate potential factors leading to increased BrainPOP Science usage and identify usage patterns that may be predictive of future purchase decisions.

Methodology

Evaluative - Quantitative

Exploratory statistical analyses

  • Logistic regressions
  • T-tests
  • Descriptive statistics


Research questions from the project proposal

Study Overview

This study was conducted as part of a larger ongoing effort to understand how teachers are implementing BrainPOP Science in the classroom. Previous analyses suggested that users may be segmented based on their assignment rates of specific types of learning resources, so I investigated teacher usage data from the Spring and Fall semesters of 2023 to see whether assignment rates of these resources were related to purchase decisions, and I examined whether our training programs were influencing how teachers were using the product.


After formulating discrete and specific research questions and defining the variables to be used in the analyses, I combined multiple data sources in R and cleaned them to ensure that we were only using the most accurate and relevant information (e.g. filtering out users with pre-existing future subscriptions).
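
The merging and filtering step might look like the following minimal pandas sketch. The actual work described above was done in R; the table, column names, and cutoff date here are hypothetical stand-ins used only to illustrate the idea.

    import pandas as pd

    # Hypothetical stand-ins for the real exports: per-teacher usage and subscription records.
    usage = pd.DataFrame({
        "teacher_id": [1, 2, 3],
        "assign_rate": [0.40, 0.10, 0.75],   # share of assignments using a given resource type
    })
    subs = pd.DataFrame({
        "teacher_id": [1, 2, 3],
        "sub_end": ["2023-06-30", "2024-06-30", "2023-12-31"],
        "purchased_next": [1, 0, 1],          # 1 = subscribed the following semester
    })

    # Join the sources on a shared key, then drop teachers whose subscription already
    # extended into the next semester, since usage could not have driven that decision.
    df = usage.merge(subs, on="teacher_id", how="inner")
    df = df[pd.to_datetime(df["sub_end"]) < pd.Timestamp("2024-01-01")]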


For one set of analyses, I ran sets of logistic regressions at the teacher, school, and district levels to examine the effects of different resource assignment rates on the likelihood of future purchases. For the other set, I used Welch’s t-tests to determine if there were significant differences in these assignment rates by type depending on a segmentation parameter we thought would cause differences.
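
As a rough illustration of these two kinds of tests (again in Python rather than the R used in the study, with synthetic data and hypothetical variable names), a logistic regression of the purchase flag on an assignment rate and a Welch’s t-test across a binary segment could be run as follows:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy.stats import ttest_ind

    # Synthetic teacher-level table standing in for the real usage data.
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "assign_rate": rng.uniform(0, 1, 200),       # assignment rate of a resource type
        "purchased_next": rng.integers(0, 2, 200),   # 1 = subscribed the following semester
        "segment": rng.integers(0, 2, 200),          # hypothetical segmentation parameter
    })

    # Logistic regression: does the assignment rate predict a next-semester purchase?
    logit_fit = smf.logit("purchased_next ~ assign_rate", data=df).fit()
    print(logit_fit.summary())

    # Welch's t-test: do the two segments differ in assignment rate (unequal variances allowed)?
    group_a = df.loc[df["segment"] == 1, "assign_rate"]
    group_b = df.loc[df["segment"] == 0, "assign_rate"]
    print(ttest_ind(group_a, group_b, equal_var=False))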



Findings

  • The assignment rate of one resource type was found to positively predict the likelihood that the teacher would have a subscription the next semester.
  • Other resource types also predicted later subscriptions to a lesser degree, but at higher levels of analysis (i.e., district rather than teacher), the resource type from the first finding accounted for most of the predictive power.
  • T-tests revealed that teachers in schools and districts on one side of a key segmentation parameter assigned significantly more of both major resource types and also showed a significantly greater gap between the two resources’ assignment rates.

(Available on request)

Visualizations of logistic regression on each resource type

(Available on request)

Insight excerpt from final report

(Available on request)

Visualization of logistic regression comparison


Scaffolding Improvement

Usability Testing

@BrainPOP

Project Objective

Ensure that the existing product works in the context of a new organization-wide player, and confirm that students understand the new choice architecture.

Methodology

Evaluative - Qualitative

Usability Testing

Study Overview

In order to align BrainPOP Science with a new player that would be instituted across all of BrainPOP’s products in 2024, design changes were required, including within the choice architecture of some of the more complex assignment types. Initial testing was done on the early concepts for the redesign, though usability issues were still apparent: students were not fully understanding the actions they needed to take in the assignment. Some modifications were made to the design, employing a more scaffolded approach to student tutorialization, and here these redesigns were tested to ensure that they addressed the previously found issues.


I advised the product designer for this part of BrainPOP Science (who developed the design and conducted the usability tests) on the research plan, and I conducted the synthesis, analysis, and reporting out of the results. Eight teachers and eight students were recruited to participate and give feedback on their use of the product. Key research questions included whether students were interacting with the tool in a way that was amenable to assessment, how teachers reacted to some of the changed choice architecture, and what outputs teachers expected to see from the design changes.

Findings

  • Students engaged in the correct behavior far more often with the revised, scaffolded approach to tutorialization, and teachers also found value in the new approach.
  • Teachers appreciated some of the revised framing of the content, though they were ambivalent about some of the content positioning changes.
  • Some coloration choices were unintuitive to teachers.

Insight summary from final report

Insight excerpt from final report