



Background: What is the use case/scenario?
I'm building a learning app for my daughter based on the Singapore Primary School syllabus (P3 and P4). The goal is to make practice engaging and gamified, while staying grounded in real MOE-aligned content. The app is built around three roles:
- Child: practice, progression, and rewards
- Parent: progress visibility, weekly AI reports, and assigning practice
- Admin: full content management (subjects, topics, questions, lesson units, exam papers)
My first prompt set the entire direction — I started with the actual MOE syllabus structure, so the app never drifted into a generic quiz tool.
What's actually implemented in the app
A Gamified Child Learning Experience (The Core Focus):
- Four Core Subjects: Fully structured learning paths for English, Math, Science, and Chinese
- Learn Before You Practice: A Lesson Player that walks students through concept slides before practice begins — including checkpoint questions mid-lesson
- Subject-Specific Workspaces: A Math Bar Model builder and Drawing Pad for working out problems; a Science Fair-Test Builder and Observation Template for structured science responses; an Oral Recorder for Chinese/English oral practice
- Smart Review Queue: Built on the SM-2 spaced repetition algorithm; the app tracks which questions are due for review and surfaces them daily so nothing is forgotten (a minimal sketch of the SM-2 update follows this list)
- Timed Challenge Mode: A 10-question timed challenge with score multipliers for an extra engagement layer
- Gamification: XP, Coins, Streaks, and 9 achievement Badges tied to real performance milestones. Coins can be spent in the Rewards Shop to unlock pets and UI themes that change the look of the child's portal. Avatars are customisable via DiceBear
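For context, here's a minimal sketch of the SM-2 update that drives the review queue. The card shape and field names (easeFactor, intervalDays, repetitions, dueAt) are illustrative assumptions, not the app's actual schema:

```ts
// Minimal SM-2 update for spacing reviews.
// Field names here are placeholders, not the app's real schema.
interface ReviewCard {
  easeFactor: number;   // starts at 2.5
  intervalDays: number; // days until the next review
  repetitions: number;  // consecutive successful reviews
  dueAt: Date;
}

// quality: 0 (blackout) .. 5 (perfect recall)
function sm2(card: ReviewCard, quality: number, now = new Date()): ReviewCard {
  let { easeFactor, intervalDays, repetitions } = card;

  if (quality >= 3) {
    // Correct response: grow the interval.
    if (repetitions === 0) intervalDays = 1;
    else if (repetitions === 1) intervalDays = 6;
    else intervalDays = Math.round(intervalDays * easeFactor);
    repetitions += 1;
  } else {
    // Failed recall: reset and resurface tomorrow.
    repetitions = 0;
    intervalDays = 1;
  }

  // Adjust the ease factor; SM-2 floors it at 1.3.
  easeFactor = Math.max(
    1.3,
    easeFactor + (0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02)),
  );

  const dueAt = new Date(now.getTime() + intervalDays * 24 * 60 * 60 * 1000);
  return { easeFactor, intervalDays, repetitions, dueAt };
}
```

With this shape, the daily queue reduces to a query for cards whose dueAt is on or before today.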
A Functional Parent Portal:
- A Mastery Map that breaks down performance by subject → topic → subskill based on real attempt data (see the rollup sketch after this list)
- Parent Assignment Engine: Parents can assign specific topics to their child; assigned topics show up as priority cards on the child's dashboard
- Weekly AI Reports: Auto-generated every Sunday, summarising the week's performance, strongest/weakest areas, and 3 recommended actions for the parent
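A sketch of the kind of rollup behind the Mastery Map, assuming a flat table of attempt rows; the types are hypothetical stand-ins for the real schema:

```ts
// Rolls raw attempt rows up into the subject → topic → subskill tree
// shown on the Mastery Map. Shapes are assumptions, not the real schema.
interface Attempt {
  subject: string;
  topic: string;
  subskill: string;
  correct: boolean;
}

type MasteryNode = { attempts: number; correct: number; accuracy: number };

function rollUp(attempts: Attempt[]): Map<string, MasteryNode> {
  const nodes = new Map<string, MasteryNode>();
  for (const a of attempts) {
    // Credit every level of the hierarchy so parents can drill down.
    const keys = [
      a.subject,
      `${a.subject}/${a.topic}`,
      `${a.subject}/${a.topic}/${a.subskill}`,
    ];
    for (const key of keys) {
      const n = nodes.get(key) ?? { attempts: 0, correct: 0, accuracy: 0 };
      n.attempts += 1;
      if (a.correct) n.correct += 1;
      n.accuracy = n.correct / n.attempts;
      nodes.set(key, n);
    }
  }
  return nodes;
}
```

The same aggregation could plausibly feed the weekly report prompt, since strongest/weakest areas fall out of the per-node accuracy.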
A Working Admin Portal (CMS):
- Question Bank with filters by level, subject, and topic — with JSON preview of prompt and answer data
- AI Content Seeder (`npm run seed:content`) that can generate hundreds of questions on demand using any OpenAI-compatible model
- Exam Paper Parser: Upload real school exam PDFs; the AI extracts and structures the questions into the database
- Topic and Lesson Builder to manage the curriculum structure and create lesson slides
- Configurable AI Models: Multiple AI providers can be registered and switched via the admin settings panel (a sketch of the DB-driven client follows this list)
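To illustrate how a DB-driven, OpenAI-compatible setup like this can work, here's a sketch using the official openai npm package's baseURL override. The AiModelConfig shape and the seeding prompt are assumptions for illustration, not the app's actual code:

```ts
import OpenAI from "openai";

// Hypothetical shape of a row in the admin-managed model config table.
interface AiModelConfig {
  baseUrl: string; // any OpenAI-compatible endpoint
  apiKey: string;
  model: string;
}

// Build a client from whatever model the admin marked active,
// instead of hardcoding a provider.
function clientFor(cfg: AiModelConfig): OpenAI {
  return new OpenAI({ baseURL: cfg.baseUrl, apiKey: cfg.apiKey });
}

// Ask the active model for practice questions on a topic.
// The requested JSON shape is illustrative.
async function seedQuestions(cfg: AiModelConfig, topic: string, count: number) {
  const res = await clientFor(cfg).chat.completions.create({
    model: cfg.model,
    messages: [
      {
        role: "user",
        content:
          `Generate ${count} P3/P4 practice questions for "${topic}" ` +
          `as a JSON array of {prompt, options, answer, explanation}.`,
      },
    ],
  });
  return JSON.parse(res.choices[0].message.content ?? "[]");
}
```

Swapping providers then only means changing a row in the config table, which matches the "DB-driven, not hardcoded" note under the technical highlights.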
Technical highlights:
- Built as a PWA: installable on iPad or phone, with an offline fallback (a minimal manifest sketch follows this list)
- Stack: Next.js 15 (App Router), TypeScript, Tailwind CSS, Supabase (Auth, DB, Storage), OpenAI-compatible AI (DB-driven, not hardcoded)
- 400+ questions seeded across 40 topics
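For the PWA piece, a minimal sketch of what the install metadata could look like using Next.js App Router's app/manifest.ts convention; names, colours, and icon paths are placeholders:

```ts
// app/manifest.ts — the App Router serves this as /manifest.webmanifest.
// All values below are placeholders, not the app's real branding.
import type { MetadataRoute } from "next";

export default function manifest(): MetadataRoute.Manifest {
  return {
    name: "Learning App",
    short_name: "Learn",
    start_url: "/",
    display: "standalone", // installable, full-screen on iPad/phone
    background_color: "#ffffff",
    theme_color: "#ffffff",
    icons: [
      { src: "/icon-192.png", sizes: "192x192", type: "image/png" },
      { src: "/icon-512.png", sizes: "512x512", type: "image/png" },
    ],
  };
}
```

Note the manifest only covers installability; the offline fallback mentioned above would additionally need a service worker, which Next.js does not ship by default.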
The Process: How did you use SOLO?
SOLO is powerful when you guide it clearly. Detailed instructions up front, especially in the first prompt, let it carry a coherent plan from start to finish. For large projects, breaking work into versioned phases is essential; otherwise context gets lost between stages.
I used SOLO to build the core learning loops, wire up AI grading and spaced repetition, seed real content, and tackle the unglamorous but essential work of making the admin and parent portals genuinely functional.
Caveats / Feedback:
- Sometimes SOLO appears to reference an older commit rather than the latest code, which requires re-prompting
- It would be very helpful if a future version supported persistent custom instructions (e.g. git commit identity, architectural constraints) so you don't have to repeat them every session
- Overall, I'm impressed by the breadth of what it accomplished — features I expected to take weeks were shipped in days