Lessons that compile.
Math that checks out.
A single Rust binary that turns a topic prompt into a typeset, dimensionally consistent .pdf lesson — Typst for rendering, all-Rust verification for correctness, OpenRouter for the words.
- ~ms verification per exercise
- 0 external runtimes
- 1 static binary
- 7-vec SI dimensional algebra
Why bother
The model writes. The runtime verifies.
Most "AI lesson generators" are wrappers that hope the model is smart enough. This one treats every lesson as a small DAG of jobs with mechanical gates between them — no reviewer model, no hand-curated facts file, no Python sidecar.
Typst, in-process
Real PDF compilation via the typst crate — vendored templates and embedded fonts, no shelling out, no LaTeX, no network at compile time.
All-Rust verification
Pratt parser + f64 evaluator. Every exercise gets numerically substituted at 16 deterministic samples — sign errors and lost factors die instantly.
Dimensional analysis
A 7-vector SI algebra walks every formula tagged dimensional: F = m/a fails before it ever reaches a reader.
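The idea can be sketched in a few lines of Rust. This is a minimal illustration, not the project's actual types: dimensions are a 7-vector of integer exponents over the SI base units (kg, m, s, A, K, mol, cd), and multiplying or dividing quantities adds or subtracts exponents.

```rust
// Illustrative sketch: an SI dimension as exponents over (kg, m, s, A, K, mol, cd).
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
struct Dim([i8; 7]);

impl Dim {
    const MASS: Dim = Dim([1, 0, 0, 0, 0, 0, 0]);
    const LENGTH: Dim = Dim([0, 1, 0, 0, 0, 0, 0]);
    const TIME: Dim = Dim([0, 0, 1, 0, 0, 0, 0]);

    // Multiplying quantities adds exponents component-wise.
    fn mul(self, rhs: Dim) -> Dim {
        let mut out = [0i8; 7];
        for i in 0..7 { out[i] = self.0[i] + rhs.0[i]; }
        Dim(out)
    }
    // Dividing quantities subtracts exponents component-wise.
    fn div(self, rhs: Dim) -> Dim {
        let mut out = [0i8; 7];
        for i in 0..7 { out[i] = self.0[i] - rhs.0[i]; }
        Dim(out)
    }
}
```

With acceleration as m·s⁻², F = m·a lands on kg·m·s⁻² while F = m/a lands on kg·m⁻¹·s², so the wrong formula is rejected by a plain vector comparison.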
Multi-stage pipeline
plan → fan-out write_section → compile-as-a-gate → verify → assemble. Each stage is its own job kind: retries, backoff, panic isolation come for free.
Queue groups
Network-bound LLM jobs and CPU-bound compile jobs live in separate worker pools, with a per-lesson fairness cap so one big lesson can’t starve the others.
Deterministic
Model, prompt version, seed, and raw stage I/O persist on every lesson. POST /lessons/:id/regenerate is bit-stable.
Pipeline
Five stages.
Two pools.
Each stage is a job kind. The job table is the orchestrator — durable, retryable, fair. LLM and CPU work live in separate pools so a slow OpenRouter call can never block a Typst compile.
- llm #1 plan_lesson: outline + section scaffolding
- llm #2 write_section: fan-out, one per section
- cpu #3 compile_section: Typst hard gate
- cpu #4 verify_section: numeric + dimensional
- cpu #5 assemble_lesson: final PDF + .typ
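The stage-to-pool routing can be sketched as a single enum. This is a hypothetical shape, not the project's actual code; the names mirror the five stages above.

```rust
// Hypothetical sketch: five job kinds, each routed to one of two worker pools.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum JobKind {
    PlanLesson,
    WriteSection,
    CompileSection,
    VerifySection,
    AssembleLesson,
}

#[derive(Debug, PartialEq, Eq)]
enum Pool { Llm, Cpu }

impl JobKind {
    // Network-bound stages go to the LLM pool; everything else is CPU work,
    // so a slow model call can never block a Typst compile.
    fn pool(self) -> Pool {
        match self {
            JobKind::PlanLesson | JobKind::WriteSection => Pool::Llm,
            _ => Pool::Cpu,
        }
    }
}
```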
Each section's Typst is compiled before the section is allowed to merge. Compile errors feed the diagnostic back into a targeted retry — broken math never reaches the assembled PDF.
parent_id + children_remaining on the jobs table. Each completion decrements the parent in the same tx; when it hits zero the parent flips back to pending. No polling.
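The fan-in rule is simple enough to sketch in plain Rust. The real thing is a SQL transaction against the jobs table; the struct and field names here are illustrative only.

```rust
// Illustrative fan-in sketch: completing a child decrements its parent's
// counter in the same step; at zero the parent flips back to Pending.
use std::collections::HashMap;

#[derive(Debug, Clone, Copy, PartialEq)]
enum Status { Waiting, Pending, Running, Done }

struct Job {
    parent_id: Option<u64>,
    children_remaining: u32,
    status: Status,
}

fn complete_child(jobs: &mut HashMap<u64, Job>, child_id: u64) {
    let parent = {
        let child = jobs.get_mut(&child_id).expect("child exists");
        child.status = Status::Done;
        child.parent_id
    };
    if let Some(pid) = parent {
        let p = jobs.get_mut(&pid).expect("parent exists");
        p.children_remaining -= 1;
        if p.children_remaining == 0 {
            p.status = Status::Pending; // ready to claim, no polling needed
        }
    }
}
```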
The claim query refuses to run more than N jobs that share a root_parent_id. Multiple lessons interleave; one big course can't starve the rest.
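The cap logic reduces to: skip any pending job whose root already has N jobs running. A pure-Rust sketch of that selection rule (the real version is a single SQL claim query; names here are hypothetical):

```rust
// Illustrative fairness cap: claim the first pending job whose root
// lesson has fewer than `cap` jobs already running.
use std::collections::HashMap;

#[derive(Clone, Copy, PartialEq)]
enum S { Pending, Running }

struct J { root_parent_id: u64, status: S }

fn claim(jobs: &mut Vec<J>, cap: usize) -> Option<usize> {
    // Count running jobs per root lesson.
    let mut running: HashMap<u64, usize> = HashMap::new();
    for j in jobs.iter().filter(|j| j.status == S::Running) {
        *running.entry(j.root_parent_id).or_insert(0) += 1;
    }
    // First pending job still under its root's cap wins.
    let idx = jobs.iter().position(|j| {
        j.status == S::Pending
            && running.get(&j.root_parent_id).copied().unwrap_or(0) < cap
    })?;
    jobs[idx].status = S::Running;
    Some(idx)
}
```

With cap = 1, a lesson that already has a running job is skipped over in favor of another lesson's pending work, which is exactly the interleaving behavior described above.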
Verification
Schwartz–Zippel,
in microseconds.
Every exercise carries a verifiable answer. A small in-process Pratt evaluator substitutes both sides of the claimed identity at 16 deterministic random points — agreement at all 16 means equality with probability approaching one.
- Numeric equality, polynomial identity, root-set verification.
- Diff(expr, var, h) pseudo-function for derivatives via central differences.
- 7-vector SI dimensional analysis on every formula tagged dimensional: true.
- All in-process. No sympy subprocess, no JIT, no network call per check.
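The sampling check itself is tiny. A minimal sketch, assuming both sides of the identity have already been parsed into evaluable closures; the seeded LCG here is an illustrative stand-in for whatever deterministic sampler the project actually uses.

```rust
// Deterministic pseudo-random values in [0, 1) from a seeded LCG.
fn lcg(state: &mut u64) -> f64 {
    *state = state
        .wrapping_mul(6364136223846793005)
        .wrapping_add(1442695040888963407);
    ((*state >> 11) as f64) / ((1u64 << 53) as f64)
}

// Substitute both sides at 16 deterministic points; agreement at all 16
// means the identity holds except with vanishing probability.
fn identities_agree(lhs: impl Fn(f64) -> f64, rhs: impl Fn(f64) -> f64, seed: u64) -> bool {
    let mut state = seed;
    (0..16).all(|_| {
        let x = lcg(&mut state) * 20.0 - 10.0; // sample in [-10, 10)
        let (a, b) = (lhs(x), rhs(x));
        // Relative tolerance absorbs f64 rounding noise.
        (a - b).abs() <= 1e-9 * (1.0 + a.abs().max(b.abs()))
    })
}
```

A correct identity like (x+1)² = x² + 2x + 1 passes all 16 samples; a sign error like (x+1)² = x² − 2x + 1 dies at the first nonzero sample.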
Every section declares defines and requires. A pure-Rust check rejects
any outline where a section requires a symbol no earlier section defines —
before a single writer call is made.
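That check is a single linear pass. A minimal sketch, with illustrative field names: walk the outline in order, accumulate defined symbols, and reject the first section that requires a symbol no earlier section defined.

```rust
// Illustrative outline check: every required symbol must be defined
// by some earlier section, or the outline is rejected up front.
use std::collections::HashSet;

struct Section {
    defines: Vec<String>,
    requires: Vec<String>,
}

fn outline_is_valid(sections: &[Section]) -> bool {
    let mut known: HashSet<&str> = HashSet::new();
    for s in sections {
        if !s.requires.iter().all(|r| known.contains(r.as_str())) {
            return false; // undefined symbol: fail before any writer call
        }
        known.extend(s.defines.iter().map(|d| d.as_str()));
    }
    true
}
```

Rejecting here is cheap: the bad outline never reaches the fan-out stage, so no per-section LLM calls are wasted on it.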
API
One POST.
One PDF.
Submit a topic, watch the pipeline drive itself through Server-Sent Events, download the rendered PDF when it's done. Regeneration is deterministic on the stored seed.
curl -X POST http://localhost:8080/lessons \
-H 'content-type: application/json' \
-d '{
"topic": "derivatives via the limit definition",
"level": "intro",
"domain": "math",
"lang": "en",
"target_pages": 6
}'
← 201 Created
{
"id": "01J5K2A8...PTQ",
"root_job_id": "01J5K2A8...XQZ"
}

GET /lessons
GET /lessons/:id
GET /lessons/:id/typ
POST /lessons/:id/regenerate

Stack
No Python.
No LaTeX. No drama.
Everything ships as one Rust binary plus an SQLite file. The only network calls that leave the box go to OpenRouter, and only when a writer stage needs a model.
Boot the binary.
Watch a lesson typeset itself.
Open source, Apache + MIT, single binary. cargo run, point it at OpenRouter, then drive the pipeline from the dashboard or POST a topic.