Purpose: The milestone system lets you run famous ML algorithms (1958-2018) using YOUR implementations. Every milestone validates that your code can recreate a historical breakthrough.
Milestones are runnable recreations of historical ML papers that use YOUR TinyTorch implementations. Each milestone script imports YOUR code from the TinyTorch package you built.
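As a quick illustration (only the Tensor import below is confirmed elsewhere on this page; a real milestone script uses whichever of YOUR modules that milestone requires):

```python
# If this import works, a milestone script is running YOUR code,
# not a third-party framework.
from tinytorch import Tensor  # YOUR Tensor class from Module 01

print(Tensor([[1, 2]]))       # the same smoke test used later on this page
```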
Typical workflow:
# 1. Build the required modules (e.g., Foundation Tier for Milestone 03)
tito module complete 01 # Tensor
tito module complete 02 # Activations
tito module complete 03 # Layers
tito module complete 04 # Losses
tito module complete 05 # DataLoader
tito module complete 06 # Autograd
tito module complete 07 # Optimizers
tito module complete 08 # Training
# 2. See what milestones you can run
tito milestone list
# 3. Get details about a specific milestone
tito milestone info 03
# 4. Run it!
tito milestone run 03
List All Milestones
tito milestone list
Shows all 6 historical milestones with their status.
Simple View (compact list):
tito milestone list --simple
Get Detailed Information
tito milestone info 03
Shows the milestone's description, required modules, what you'll do, and its historical significance.
Run a Milestone
tito milestone run 03
What happens: the prerequisite modules are checked, YOUR imports are validated, the milestone script runs, and the achievement is recorded to your progress.
Skip prerequisite checks (not recommended):
tito milestone run 03 --skip-checks
View Milestone Progress
tito milestone status
Shows which milestones you have achieved so far and what remains to unlock.
Visual Timeline
tito milestone timeline
See your journey through ML history in a visual tree format.
What: Frank Rosenblatt's first trainable neural network
Requires: Modules 01-03 (Tensor, Activations, Layers)
What you'll do: Implement and train the perceptron that proved machines could learn
Historical significance: First demonstration of machine learning
Run it:
tito milestone info 01
tito milestone run 01
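If you want a feel for what this milestone recreates before running it, here is a conceptual sketch of Rosenblatt's perceptron learning rule in plain NumPy (the milestone itself uses YOUR Tensor, Activations, and Layers modules, not the names below):

```python
import numpy as np

# Rosenblatt's rule: nudge the weights only when the prediction is wrong.
# Toy sketch on AND, which is linearly separable (unlike XOR -- see Milestone 02).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])

w, b, lr = np.zeros(2), 0.0, 1.0
for epoch in range(20):
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0       # step activation
        w += lr * (target - pred) * xi           # no update when pred == target
        b += lr * (target - pred)

print([1 if xi @ w + b > 0 else 0 for xi in X])  # [0, 0, 0, 1] -- AND learned
```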
What: Demonstrating the problem that stalled AI research
Requires: Modules 01-03 (Tensor, Activations, Layers)
What you'll do: Experience how single-layer perceptrons fail on XOR - the limitation that triggered the "AI Winter"
Historical significance: Minsky & Papert showed perceptron limitations; this milestone demonstrates the crisis before the solution
Run it:
tito milestone info 02
tito milestone run 02
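You can brute-force the crisis yourself in a few lines. The sketch below (plain NumPy, illustrative only) searches a grid of weights and biases for a single linear threshold unit and reports the best XOR accuracy it can reach; no setting gets past 3 of the 4 points:

```python
import numpy as np

# XOR truth table: no single line puts the 1s on one side and the 0s on the other.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

best = 0
grid = np.linspace(-2, 2, 41)                # brute-force search over w1, w2, b
for w1 in grid:
    for w2 in grid:
        for b in grid:
            pred = (X @ np.array([w1, w2]) + b > 0).astype(int)
            best = max(best, int((pred == y).sum()))

print(f"Best single-layer accuracy on XOR: {best}/4")  # 3/4 -- never 4/4
```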
What: Backpropagation breakthrough - train deep networks on MNIST
Requires: Modules 01-08 (Complete Foundation Tier)
What you'll do: Train a multi-layer perceptron to recognize handwritten digits (95%+ accuracy)
Historical significance: Rumelhart, Hinton & Williams (Nature, 1986) - the paper that reignited neural network research
Run it:
tito milestone info 03
tito milestone run 03
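The fix for the XOR crisis is a hidden layer plus backpropagation: apply the chain rule backwards through the network so every weight gets a gradient. A compact NumPy sketch of that idea on XOR (the milestone itself trains a much larger MLP on MNIST using YOUR autograd, losses, and optimizers):

```python
import numpy as np

# Two-layer MLP trained with hand-derived backprop -- the same chain rule
# YOUR autograd module automates. A hidden layer solves the XOR problem
# that Milestone 02 shows a single layer cannot. (Toy sketch, not YOUR API.)
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

lr = 1.0
for step in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))       # sigmoid output
    # backward pass (binary cross-entropy loss)
    d_logits = (p - y) / len(X)                 # dL/d(pre-sigmoid output)
    d_h = (d_logits @ W2.T) * (1 - h ** 2)      # chain rule into the hidden layer
    # gradient descent update
    W2 -= lr * h.T @ d_logits
    b2 -= lr * d_logits.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(p.round().ravel())                        # [0. 1. 1. 0.] -- XOR learned
```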
What: LeNet - Computer Vision Breakthrough
Requires: Modules 01-09 (Foundation + Convolutions)
What you'll do: Build LeNet for digit recognition using convolutional layers
Historical significance: Yann LeCun's breakthrough that enabled modern computer vision
Run it:
tito milestone info 04
tito milestone run 04
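The new ingredient on top of the Foundation Tier is the convolution: a small learned filter slides across the image and produces a feature map. A minimal single-channel, stride-1, no-padding sketch in NumPy (illustrative only, not YOUR convolution module's API):

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide `kernel` over `image` (no padding, stride 1) -- the core op in LeNet."""
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

image = np.zeros((6, 6))
image[:, 3] = 1.0                        # a vertical stripe
edge_filter = np.array([[1.0, -1.0]])    # responds to horizontal intensity changes
print(conv2d_valid(image, edge_filter))  # nonzero only at the stripe's edges
```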
What: "Attention is All You Need"
Requires: Modules 01-08 + 11-13 (Foundation + Embeddings, Attention, Transformers)
What you'll do: Implement transformer architecture with self-attention mechanism
Historical significance: Vaswani et al. revolutionized NLP and enabled GPT/BERT/modern LLMs
Run it:
tito milestone info 05
tito milestone run 05
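At the heart of the transformer is scaled dot-product attention: every token builds a query, compares it against every key, and takes a softmax-weighted mix of the values. A short NumPy sketch of a single attention head (illustrative only, not YOUR Attention module's API):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # subtract max for stability
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention: softmax(Q K^T / sqrt(d)) V."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)        # how much each token attends to each other
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # weighted mix of value vectors

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 16))                        # 5 tokens, 16-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(16, 16)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)          # (5, 16)
```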
What: Production ML Systems
Requires: Modules 01-08 + 14-19 (Foundation + Optimization Tier)
What you'll do: Optimize for production deployment with quantization, compression, and benchmarking
Historical significance: MLPerf standardized ML system benchmarks for real-world deployment
Run it:
tito milestone info 06
tito milestone run 06
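Of the optimizations this milestone covers, quantization is the easiest to show in a few lines: store weights as int8 instead of float32 and accept a small rounding error in exchange for a 4x smaller footprint. A minimal symmetric-quantization sketch in NumPy (illustrative only; YOUR optimization modules define the real API):

```python
import numpy as np

# Symmetric int8 quantization: 4x smaller weights, small reconstruction error.
rng = np.random.default_rng(0)
weights = rng.normal(scale=0.1, size=(256, 256)).astype(np.float32)

scale = np.abs(weights).max() / 127.0             # map the largest weight to +/-127
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
dequantized = q.astype(np.float32) * scale

error = np.abs(weights - dequantized).max()
print(f"memory: {weights.nbytes} -> {q.nbytes} bytes, max abs error {error:.5f}")
```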
Each milestone requires specific modules to be complete. The run command automatically validates:
Module Completion Check:
tito milestone run 03
Checking prerequisites for Milestone 03...
Module 01 - complete
Module 02 - complete
Module 03 - complete
Module 04 - complete
Module 05 - complete
Module 06 - complete
Module 07 - complete
Module 08 - complete
All prerequisites met!
Import Validation:
Testing YOUR implementations...
Tensor import successful
Activations import successful
Layers import successful
YOUR TinyTorch is ready!
If any required modules are missing, you'll see a helpful error:
Missing Required Modules
Milestone 03 requires modules: 01, 02, 03, 04, 05, 06, 07, 08
Missing: 06, 07, 08
Complete the missing modules first:
tito module start 06
tito module start 07
tito module start 08
When you successfully complete a milestone, you'll see:
╔════════════════════════════════════════════════╗
║ Milestone 03: MLP Revival (1986) ║
║ Backpropagation Breakthrough ║
╚════════════════════════════════════════════════╝
MILESTONE ACHIEVED!
You completed Milestone 03: MLP Revival (1986)
Backpropagation Breakthrough
What makes this special:
• Every line of code: YOUR implementations
• Every tensor operation: YOUR Tensor class
• Every gradient: YOUR autograd
Achievement saved to your progress!
What's Next:
Milestone 04: CNN Revolution (1998)
Unlock by completing module: 09
TinyTorch tracks progress in three ways (all are related but distinct):
1. Module Completion (tito module status) - stored in .tito/progress.json
2. Milestone Achievements (tito milestone status) - stored in .tito/milestones.json
3. Overall Status - tito module status and tito milestone status together

Complete Modules (01-08)
↓
Unlock Milestone 03
↓
Run: tito milestone run 03
↓
Achievement Recorded
↓
Capability Unlocked (optional checkpoint system)
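If you want to peek at the raw records behind these commands, both files are plain JSON. A minimal sketch that just pretty-prints whatever is stored (no assumptions about the schema beyond it being JSON):

```python
import json
from pathlib import Path

# Pretty-print the raw progress records kept by tito.
for name in (".tito/progress.json", ".tito/milestones.json"):
    path = Path(name)
    if path.exists():
        print(f"== {name} ==")
        print(json.dumps(json.loads(path.read_text()), indent=2))
```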
While you can technically skip around, the tier structure is designed for progressive learning.
Before running a milestone, make sure your modules work:
# After completing a module
tito module complete 05
# Test it works
python -c "from tinytorch import Tensor; print(Tensor([[1,2]]))"
Learn what you're about to do:
tito milestone info 03 # Read the context first
tito milestone run 03 # Then run it
Share your milestones! Each one represents recreating a breakthrough that shaped modern AI.
Problem: Module not exported or import failing
Solution:
# Re-export the module
tito module complete XX
# Test import manually
python -c "from tinytorch import Tensor"
Problem: Progress not tracked correctly
Solution:
# Check module status
tito module status
# If modules show incomplete, re-run complete
tito module complete XX
Problem: Bug in your implementation
Solution:
Fix the bug in modules/XX_name/XX_name.ipynb (NOT in tinytorch/), then re-export it with tito module complete XX.

Every milestone uses YOUR code. Every achievement is proof you understand ML systems deeply. Build from scratch, recreate history, master the fundamentals.