StepGuide
Role: Product Designer
Team: 1 PD, 1 PM, 2 engineers
Timeline: Dec '23 – Mar '24
Overview
What is VoyageMath?
VoyageMath is a classroom-first, gamified math practice platform for middle schoolers. Students climb a mountain (their "voyage") by solving problems, unlocking rewards, and building confidence along the way.
What is StepGuide?
StepGuide is VoyageMath's breakthrough learning feature. When students get stuck, it breaks down complex problems into digestible, interactive steps—scaffolding their path from confusion to understanding.
Context: Why We Built This
📉 Learning Crisis
Post-COVID, middle schoolers fell 2+ years behind in math. The achievement gap widened dramatically.
😰 Overwhelmed Teachers
A 1:30 teacher-to-student ratio made personalized support impossible. Teachers couldn't reach struggling students.
🚫 Tool Gap
Existing practice tools were too basic (flashcards), too rigid (one-size-fits-all), or offered no helpful feedback.
🎯 Our Opportunity
VoyageMath's gamified platform had engagement. But students needed help solving problems, not just motivation to try.
Problem Statement
Understanding the Landscape
Before building anything, we playtested leading math practice tools with 20 students, interviewed a few teachers, and analyzed Wayground's existing usage data to understand how students behave in math practice.
What We Found
65% of students never clicked "Show Solution" because they saw asking for help as failure. IXL deducts points for using hints, and DeltaMath shows every step at once.
"I feel dumb seeing all those steps." Full solutions overwhelmed students and confirmed their fear that the problem was too hard, so they skimmed and moved on.
80% guessed randomly because it was faster than reading long explanations. Learning turned into a guessing game instead of real understanding.
Teachers needed tools that reduce "I don't get it" interruptions and clearly show student progress during independent work.
Based on these insights, we framed our challenge:
For Students
How might we reduce fear and build confidence through patient, step-by-step guidance?
For Business
How might we create a high-frequency practice tool that teachers trust and schools adopt at scale?
Approach
Traditional practice tools took two equally bad paths:
Get it wrong → get punished for trying
Get it wrong → see the entire text-heavy solution at once → get overwhelmed → give up
What we needed to build instead:
Get it wrong → break the problem into digestible steps, one at a time → nail the next one
Our approach was simple: mimic how great teachers teach.
1. Break the problem into manageable pieces
2. Show one step at a time, not everything at once
3. Build on what the student already understood
4. Celebrate small wins along the way
Journey
Six iterations. Each one bringing us closer.
v1
CHAT STYLE
Version 1 started with good intentions: AI-tutor-style guidance, helpful hints, encouraging feedback. Students had other ideas.
What we built
Chat-style interface inspired by ChatGPT
Problems broken into 3-4 steps
Hints + MCQs + encouraging feedback
What we learned
Students skipped long context entirely
Hints went unread
Validations disrupted flow
v2
REFINING BASICS
What we changed
Shortened context to one sentence
Highlighted hints visibly
Made step names sticky
What we learned
Students only read hints, skipped main question
Step names completely ignored
v3
STEP FLOW PANE
What we changed
Added left pane showing step flow
Answers carried forward between steps
Full step sequence visible
What we learned
Students ignored the side pane; anything outside the main focus was overlooked
Screen felt noisy and cluttered
v4
CLUTTER REDUCTION
What we changed
Collapsed completed steps
Added visual examples
Cleaner interface
What we learned
Collapsing worked! Students focused better
Examples boosted accuracy
But some students mistook examples for actual questions
v5
BREAKTHROUGH
What we changed
Dropped chat UI completely
Removed all filler content
Clean pen-and-paper style flow
Visual scaffolding
What we learned
Students FINALLY understood step connections
Could explain full flow after 3-4 problems
This was the breakthrough moment
Just needed one final refinement
v6
THE FINAL VERSION
We removed the example box. That was it: the only change from v5.
In testing, students didn't say "this looks like a lot of work" anymore. They said "wait, that's it?" They stopped skipping. They stopped staring at the screen, paralyzed.
Version 6 didn't feel like homework. It felt like solving a problem with a friend beside you.
What we changed
Removed example box
Kept everything else
What we learned
Universal preference in testing
Intuitive and non-threatening
With the interaction design working, we focused on visual polish. We iterated until StepGuide felt less like software and more like working through problems on paper.
Building the System
We had the design. Now we needed to scale it from 5 skills to 500+, working with content designers and subject matter experts.
We established guiding principles, standardized the anatomy of each step, and created guidelines for choosing interactive components.
Guiding Principles
🧱 Scaffolding
Break problems into foundational chunks. Build from simple to complex.
🔗 Continuity
Each step flows into the next. Students see clear connections.
✏️ Replicability
Students can recreate the solution on paper after completing the StepGuide.
📐 Clarity
Each step is a smaller question. Active thinking, not passive reading.
Anatomy of a Step
Every StepGuide follows the same structure, making it predictable and learnable.
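The step template itself lived in our content guidelines. As a rough illustration only, here is a minimal sketch of that anatomy in TypeScript, with field names inferred from the elements described in this case study (sticky step names, one-line prompts, highlighted hints, an interactive component, encouraging feedback), not taken from the real schema:

```typescript
// Hypothetical sketch of a step's anatomy; field names are assumptions
// inferred from this case study, not the actual VoyageMath data model.
interface Step {
  name: string;          // short, sticky step label shown while solving
  prompt: string;        // one-sentence question: active thinking, not reading
  hint?: string;         // optional hint, visibly highlighted
  componentId: string;   // which interactive component renders this step
  encouragement: string; // brief positive feedback when the step is solved
}

interface StepGuide {
  problemId: string;
  steps: Step[];         // 3-4 steps, building from simple to complex
}
```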
Interactive Components
Starting with Fill in the Blanks (FIB) and Multiple Choice (MCQ), I established the base interactive components.
The team evolved these into specialized interactions for complex math concepts: factor trees, long division, grids, graphing, number lines, and more.
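One way such a library stays manageable at 500+ skills is to type each interaction explicitly. A sketch under assumptions (the component names come from the list above; every field is illustrative, not the real API):

```typescript
// Hypothetical discriminated union over the interaction types named above.
// All fields are illustrative assumptions, not VoyageMath's actual API.
type InteractiveComponent =
  | { kind: "fib"; template: string; answers: string[] }     // fill in the blanks
  | { kind: "mcq"; choices: string[]; correctIndex: number } // multiple choice
  | { kind: "factorTree"; root: number }
  | { kind: "longDivision"; dividend: number; divisor: number }
  | { kind: "grid"; rows: number; cols: number }
  | { kind: "graphing"; targetPoints: [number, number][] }
  | { kind: "numberLine"; min: number; max: number; target: number };
```

A renderer can then switch exhaustively on `kind`, so each new specialized interaction becomes an isolated, type-checked addition rather than a change to the whole flow.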
Behavioral Insight
After launch, we observed three distinct behavioral patterns:
- 50% Skippers: never used StepGuide; saw it as an interruption.
- 30% Burnt: did at least one StepGuide, then skipped the rest.
- 20% Religious: used it consistently. These students were rare.
The Skippers were the problem: half of all students never gave StepGuide a chance.
Why skip? Students did the math: guessing was faster, easier, and could earn more reward than learning. The table below breaks down the trade-off, and a back-of-the-envelope sketch after it makes the incentive gap concrete.
| Aspect | Retrying the Incorrect Question (Using StepGuide) | Moving to the Next Question (Skipping) |
| --- | --- | --- |
| Effort required | Higher (several steps to correct the mistake) | Lower |
| Mastery score impact | None (current setting) | Possible gain if the guess is correct |
| Step rewards | Half the steps of a correct initial answer | Full steps if the answer is correct |
| Learning potential | Higher (detailed understanding of the mistake) | Lower (missed learning opportunity) |
| Question limit awareness | Not apparent (students thought questions were unlimited) | Skipping felt free |
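To see why skipping was rational, here is the back-of-the-envelope version. The reward ratios (half vs. full step rewards) come from the table; the timings and point value are purely illustrative assumptions:

```typescript
// Hypothetical incentive math. Reward ratios follow the table above;
// timings and the point value are invented for illustration.
const fullReward = 10;                   // step rewards for a correct first try
const stepGuideReward = fullReward / 2;  // StepGuide pays half the steps
const guessChance = 0.25;                // 4-option question, uniform guessing

const stepGuideMinutes = 3;              // several scaffolded steps (assumed)
const guessMinutes = 0.25;               // ~15 seconds to tap an answer (assumed)

const stepGuideRate = stepGuideReward / stepGuideMinutes;    // ≈ 1.7 points/min
const guessRate = (guessChance * fullReward) / guessMinutes; // = 10 points/min

console.log({ stepGuideRate, guessRate });
```

Under these assumptions, guessing earns rewards roughly six times faster, and with questions perceived as unlimited there was no cost to moving on. The Skippers weren't lazy; they were optimizing.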
The Fix
We repositioned StepGuide from punishment to opportunity:
Before: mountain hidden, no visible rewards, felt like a consequence → 50% skip rate.
After: mini mountain visible, bonus steps shown, framed as a "reattempt" → 20% drop in skip rate.
Impact
500k+ StepGuides completed
30% accuracy boost after 2–3 StepGuide attempts
Stats are cool. Student reactions are better.
What Stuck With Me
Great design isn't just about craft. It's about understanding what drives behavior. Motivation matters more than interface perfection. Failure reveals insights research can't. And scaling to 500+ skills requires thinking in systems, not just screens.
Most importantly: question your assumptions early and often. Every version I built was based on what I thought students needed. Only by watching them use (or skip) it did I discover what they actually needed.