A special education coordinator in a mid-size Florida school district sent me an email that was three paragraphs of frustration. Her teachers were drowning in paperwork. Not lesson planning. Not differentiated instruction. Paperwork.
Specifically, IEP progress monitoring documentation. Every student with an Individualized Education Program has goals. Every goal has measurable benchmarks. Every benchmark requires regular progress notes. Every progress note must reference specific observations, data points, and the instructional strategies used.
For a special education teacher with 18 students, each with 4 to 6 IEP goals, that is roughly 90 progress notes per reporting period. Written individually. Reviewed by the coordinator. Filed in compliance folders. Each one takes 4 to 8 minutes when done properly.
The math: 90 notes times 6 minutes average equals 9 hours per reporting period per teacher. With monthly reporting requirements and 12 special education teachers in the district, that is 108 hours of documentation per month.
What the tool does
The tool does not write IEP progress notes. This is a critical distinction. The tool organizes the data that teachers collect so the progress note writes itself.
Here is the workflow before the tool: A teacher observes a student practicing a skill. She makes a mental note or scribbles on a sticky note. At the end of the week, she sits down and tries to remember what she observed, translates it into IEP-compliant language, connects it to the specific goal and benchmark, and writes a progress note.
Here is the workflow with the tool: The teacher opens the app on her phone during or immediately after the observation. She selects the student, selects the goal, and speaks or types a quick note: "Marcus read 47 words per minute today on the grade-level passage. He self-corrected twice. Yesterday was 42 words." That takes 20 seconds.
At the end of the reporting period, the tool compiles all observations for each student and goal into a formatted progress note. It includes the data trend (improving, maintaining, regressing), references the specific observations with dates, and uses the IEP-compliant language framework the district requires.
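The trend classification is the mechanical part of that compilation. A minimal sketch of how it might work, assuming observations reduce to a chronological list of measured values and a tolerance band around "maintaining" (the function name and 5% threshold are my assumptions, not the tool's actual logic):

```python
def classify_trend(values, tolerance=0.05):
    """Classify chronological observed values as improving,
    maintaining, or regressing, comparing the most recent
    observation against the first."""
    if len(values) < 2:
        return "insufficient data"
    first, last = values[0], values[-1]
    change = (last - first) / first
    if change > tolerance:
        return "improving"
    if change < -tolerance:
        return "regressing"
    return "maintaining"

# Marcus's words-per-minute readings from the example above
readings = [42, 47]
print(classify_trend(readings))  # improving
```

A tolerance band matters here: a one-word-per-minute wobble should read as "maintaining," not as a trend reversal.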
The teacher reviews the compiled note, edits anything that needs context the tool does not have, and approves it. Review and edit takes 2 minutes instead of 6 minutes of writing from scratch.
Why the data entry point matters
The biggest insight from this project was not the tool itself. It was the shift from retrospective documentation to real-time data capture.
When teachers document from memory at the end of the week, they lose specificity. "Marcus is making progress in reading fluency" is what you write when you cannot remember Tuesday's numbers. "Marcus increased from 42 to 47 words per minute between Tuesday and Thursday, self-correcting twice on the Thursday passage" is what you write when the data was captured in the moment.
The second note is more useful for instruction, more compliant with IEP requirements, and more defensible if the district ever faces a due process complaint. And it was easier to produce because the teacher spent 20 seconds capturing data instead of 6 minutes reconstructing it.
The compliance layer
IEP documentation has legal requirements under IDEA (Individuals with Disabilities Education Act). Progress notes must be factual, data-based, and connected to the student's specific goals. They must be produced on schedule. They must be available to parents on request.
The tool enforces these requirements structurally. It will not generate a progress note without at least two data points per goal per reporting period. If a goal has no observations logged, the tool flags it as "insufficient data" and alerts the teacher and coordinator before the reporting deadline.
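Structurally, that rule is a simple pre-deadline check over the logged observations. A hedged sketch, assuming observations are grouped by goal ID; the function name and data shape are illustrative, not the tool's actual interface:

```python
def flag_insufficient_goals(observations_by_goal, minimum=2):
    """Return goal IDs that lack the minimum number of logged
    observations for the reporting period, so the teacher and
    coordinator can be alerted before the deadline."""
    return [
        goal_id
        for goal_id, obs in observations_by_goal.items()
        if len(obs) < minimum
    ]

logged = {
    "reading-fluency": [42, 47, 45],
    "math-computation": [12],   # only one data point this period
}
print(flag_insufficient_goals(logged))  # ['math-computation']
```

The point of running this before the deadline, rather than at note-generation time, is that a flagged goal is still fixable: the teacher has days to log the missing observation instead of discovering the gap in a compliance review.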
This is the governance piece that most education technology misses. The tool does not just make documentation easier — it makes compliance failures visible before they become violations.
The CLAUDE.md configuration for this project includes rules that matter: never fabricate observation data, never infer student performance that was not directly observed, never use language that implies a diagnosis or prognosis. The tool reports what was observed. The educator interprets what it means.
What the teachers said
Resistance was lower than expected because we involved three teachers in the build process. They reviewed the output at each stage and told us when the language did not match how they actually document.
One teacher pointed out that the tool's initial output used the phrase "the student demonstrated" in every note, which sounds robotic and repetitive in a file with 6 goals. We adjusted the template to vary the language naturally. A small thing, but it made the difference between output that teachers accepted and output they would rewrite entirely.
After the first reporting period, all 12 teachers were using the tool. Documentation time dropped from 9 hours to approximately 3.5 hours per teacher per reporting period. More importantly, the quality of the documentation improved because it was based on real-time observations instead of end-of-week memories.
The cost and timeline
Four days of build time. The tool runs on the district's existing infrastructure. No per-student licensing fees. No vendor contract. The district owns the code and can modify it as requirements change.
The coordinator estimates the tool saves approximately 65 hours of teacher time per month across the district. That is time that goes back to instruction, parent communication, and the collaborative planning that actually improves student outcomes.
What this means for your district
If your special education teachers are spending hours on documentation that could be captured in seconds, you have the same opportunity. The requirements are straightforward: a way to capture observations in real time, a rules engine that enforces your documentation standards, and a compilation process that turns observations into compliant progress notes.
The build is measured in days. The training is measured in hours. The time savings start in the first reporting period.