This work is owned by BrainPOP; I contributed to it as part of a team while working with BrainPOP.
AI-Assisted Grading
Usability Testing
@BrainPOP
Project Objective
Gauge teachers’ first impressions of an AI-assisted grading product, including attitudes about and understanding of the offering.
Methodology
Evaluative - Qualitative
Usability Testing
A/B Testing
User Interviews
Study Overview
After the development of a proprietary AI “assisted grader” to help teachers score submissions for one of BrainPOP Science’s core assignment types, we needed to test the overall design and compare two prototypes under consideration. The primary difference between the prototypes was the degree to which the AI’s rationale would be shown to teachers, though we were also interested in assessing teachers’ attitudes about AI use, their understanding of what the tool was doing, their comprehension of how to use it, and their thoughts on how well it would help them.
I worked primarily with the UX designer who created the prototypes, and together we drafted a set of research questions addressing the above concerns, along with a script for the usability testing, interviews, and A/B testing. Eight teachers participated in the study, and the designer and I collaborated to synthesize, interpret, and report the insights.
Findings
Insight excerpt from final report
Overview from final report
Unit Restructure
Usability Testing
@BrainPOP
Project Objective
Determine whether the UI changes made to the organization of the BrainPOP Science product make it easy for teachers to find what they need.
Methodology
Evaluative - Qualitative
Usability Testing
A/B Testing
Study Overview
After extensive teacher feedback on the initial designs of the BrainPOP Science product, along with competitive research comparing it against similar products, it was clear that users wanted a different method of content organization. Primarily, our units were grouped into categories that were needlessly large and difficult to navigate between.
After others on my team completed initial concept testing to determine the primary direction to take, I embarked on this project with one of the UX designers on my team to further refine the changes. Here, we tested three prototypes she created that varied slightly in their approaches to the content restructure, including changes to the type of information presented at the top of the page and some of the thumbnail choices for certain resources. We recruited eight teachers, conducted usability tests and A/B tests with the three prototypes, and worked together to synthesize the findings.
Findings
Alignment to Core
Needfinding Interviews
@BrainPOP
Project Objective
Position BrainPOP Science as a valuable standalone product by determining how teachers align supplemental science products with their core science curricula, where the greatest gaps in those core curricula lie, and what factors go into purchasing decisions for supplemental science products.
Methodology
Generative - Qualitative
User Interviews
Research questions from the project proposal
Study Overview
BrainPOP Science, as a science supplemental product, was created to assist teachers in science instruction alongside the “core curricula” products they are using, often as a mandate from their school or district. Critical to positioning BrainPOP Science as a point solution was determining where teachers’ core curricula were falling short, what jobs teachers were trying to get done outside of their core curricula, and what factors were typically considered by administrators making purchasing decisions about such products. I developed two in-depth interview scripts, one for teachers and one for administrators, and recruited eight people in each role to participate.
For teachers, I employed a Jobs-to-be-Done approach throughout, tailoring my questions to the specific goals teachers were trying to accomplish in their particular circumstances when deciding to use a supplemental science product. I also engaged them in a think-aloud exercise, asking them to pull up one of their supplemental science products and walk me through their process of selecting resources to use in their classroom.
For administrators, in addition to changing the nature of some of the general interview questions, I employed a retrospective behavioral analysis, asking them to walk me through all of the steps they took during the purchase of their last science supplement, starting from the first moment they realized they needed such a product.
I synthesized the data with thematic coding and identified the most common core product issues, supplement use cases, and purchase-decision factors to share with our team.
Excerpt from the teacher interview guide
Findings
Insight excerpt from final report
Opportunities and recommendations section from final report
Teacher Usage Data
Exploratory Quantitative Analysis
@BrainPOP
Project Objective
Investigate potential factors leading to increased BrainPOP Science usage and identify usage patterns that may be predictive of future purchase decisions.
Methodology
Evaluative - Quantitative
Exploratory statistical analyses
Research questions from the project proposal
Study Overview
This study was conducted as part of a larger ongoing effort to understand how teachers implement BrainPOP Science in the classroom. Previous analyses suggested that users may be segmented based on their assignment rates of specific types of learning resources, so I investigated teacher usage data from the spring and fall semesters of 2023 to see whether assignment rates of these resources were related to purchase decisions, and I examined whether our training programs were influencing how teachers used the product.
After formulating discrete, specific research questions and defining the variables to be used in the analyses, I combined multiple data sources in R and cleaned them to ensure we were using only the most accurate and relevant information (e.g., filtering out users with pre-existing future subscriptions).
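As a rough illustration of that cleaning step, a minimal sketch in R with dplyr might look like the following; the file names, column names, and filter condition here are hypothetical stand-ins, not the actual production schema.

```r
library(dplyr)
library(readr)

# Hypothetical inputs: per-teacher assignment rates and subscription records.
usage <- read_csv("usage_2023.csv")     # teacher_id, quiz_rate, video_rate, ...
subs  <- read_csv("subscriptions.csv")  # teacher_id, purchased, had_future_sub

analysis_set <- usage %>%
  inner_join(subs, by = "teacher_id") %>%
  # Drop teachers whose renewal was already locked in, so the purchase
  # outcome reflects a genuine decision rather than an existing contract.
  filter(!had_future_sub)
```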
For one set of analyses, I ran logistic regressions at the teacher, school, and district levels to examine the effects of different resource assignment rates on the likelihood of future purchases. For the other set, I used Welch’s t-tests to determine whether there were significant differences in these assignment rates by resource type depending on a segmentation parameter we expected to drive differences.
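The two analyses could be sketched roughly as below, again with hypothetical names: purchased as a binary outcome, per-resource assignment rates as predictors, and a two-level segment factor standing in for the segmentation parameter. This is an illustration of the techniques, not the actual study code.

```r
# Logistic regression: do assignment rates predict a future purchase?
fit <- glm(purchased ~ quiz_rate + video_rate + simulation_rate,
           data = analysis_set, family = binomial)
summary(fit)
exp(coef(fit))  # odds ratios, easier to interpret than log-odds

# Welch's t-test: does a resource's assignment rate differ by segment?
# (t.test uses var.equal = FALSE by default, i.e., the Welch variant.)
t.test(quiz_rate ~ segment, data = analysis_set)
```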
Findings
(Available on request)
Visualizations of logistic regression on each resource type
(Available on request)
Insight excerpt from final report
(Available on request)
Visualization of logistic regression comparison
Scaffolding Improvement
Usability Testing
@BrainPOP
Project Objective
Ensure that the existing product works in the context of a new organization-wide player, and confirm that students understand the new choice architecture.
Methodology
Evaluative - Qualitative
Usability Testing
Study Overview
To align BrainPOP Science with a new player that would be instituted across all of BrainPOP’s products in 2024, design changes were required, including within the choice architecture of some of the more complex assignment types. Initial testing was done on the early concepts for the redesign, but usability issues remained: students were not fully understanding the actions they needed to take in the assignment. Modifications were made to the design, employing a more scaffolded approach to student tutorialization, and here these redesigns were tested to ensure that they addressed the previously found issues.
I advised the product designer for this part of BrainPOP Science (who developed the design and conducted the usability tests) on the research plan, and I conducted the synthesis, analysis, and reporting of the results. Eight teachers and eight students were recruited to participate and give feedback on their use of the product. Key research questions included whether students were interacting with the tool in a way that was amenable to assessment, how teachers reacted to the changed choice architecture, and what outputs teachers expected to see from the design changes.
Findings
Insight summary from final report
Insight excerpt from final report