tinytorch/milestones/03_1986_mlp/README.md
The 1969 XOR crisis had killed neural network research: Minsky and Papert had proved that single-layer perceptrons cannot even compute XOR. Then in 1986, Rumelhart, Hinton, and Williams published "Learning representations by back-propagating errors," showing that multi-layer networks trained with backpropagation can learn the internal representations needed to solve XOR and much harder problems.

This paper ended the AI Winter and launched modern deep learning. Now it's your turn to recreate that breakthrough using YOUR TinyTorch!
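The breakthrough is easiest to appreciate by watching it happen. Below is a minimal NumPy sketch (not the milestone's code; the layer sizes, seed, and hyperparameters are illustrative) that trains a tiny two-layer network on XOR with hand-written backpropagation, exactly the problem a single-layer perceptron cannot solve:

```python
import numpy as np

# XOR: the dataset that stumped single-layer perceptrons
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(0, 1.0, (2, 8)); b1 = np.zeros(8)   # hidden layer (8 units)
W2 = rng.normal(0, 1.0, (8, 1)); b2 = np.zeros(1)   # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)                  # hidden representation
    p = sigmoid(h @ W2 + b2)                  # predicted probability
    loss = np.mean((p - y) ** 2)
    # backward pass: chain rule, layer by layer
    do = 2 * (p - y) / len(X) * p * (1 - p)   # dLoss/d(output pre-activation)
    dW2 = h.T @ do;  db2 = do.sum(0)
    dh = do @ W2.T * (1 - h ** 2)             # dLoss/d(hidden pre-activation)
    dW1 = X.T @ dh;  db1 = dh.sum(0)
    # gradient descent step
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

final_loss = float(loss)
```

With a hidden layer, gradient descent can drive the loss well below the 0.25 that a constant 0.5 predictor achieves on XOR, which no single-layer perceptron can do.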
In this milestone you'll train multi-layer perceptrons (MLPs) on real image classification tasks: first the small TinyDigits dataset for fast iteration, then the classic MNIST benchmark.
Run after Module 08 (Full training pipeline with data loading)
<table> <thead> <tr> <th width="25%"><b>Module</b></th> <th width="25%">Component</th> <th width="50%">What It Provides</th> </tr> </thead> <tbody> <tr><td><b>Module 01</b></td><td>Tensor</td><td>YOUR data structure with autograd</td></tr> <tr><td><b>Module 02</b></td><td>Activations</td><td>YOUR ReLU activation</td></tr> <tr><td><b>Module 03</b></td><td>Layers</td><td>YOUR Linear layers</td></tr> <tr><td><b>Module 04</b></td><td>Losses</td><td>YOUR CrossEntropyLoss</td></tr> <tr><td><b>Module 05</b></td><td>DataLoader</td><td>YOUR batching and data pipeline</td></tr> <tr><td><b>Module 06</b></td><td>Autograd</td><td>YOUR automatic differentiation</td></tr> <tr><td><b>Module 07</b></td><td>Optimizers</td><td>YOUR SGD optimizer</td></tr> <tr><td><b>Module 08</b></td><td>Training</td><td>YOUR end-to-end training loop</td></tr> </tbody> </table>

This milestone uses progressive scaling with 2 scripts:
Purpose: Prove MLPs work on real images (fast iteration)
Why TinyDigits first? The dataset is small, so training finishes in seconds: you can quickly verify YOUR full pipeline works before spending time on the larger MNIST run.
Purpose: Scale to the classic benchmark
Historical Note: MNIST (1998) became THE benchmark for evaluating learning algorithms. MLPs hitting 95%+ accuracy proved neural networks were back!
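Concretely, the MNIST model boils down to stacking YOUR Linear and ReLU components. The NumPy sketch below (the 784→128→10 layer sizes and variable names are illustrative assumptions, not necessarily what the script uses) shows the forward pass and softmax cross-entropy for one batch; before any training, the loss sits near ln(10) ≈ 2.30, a uniform guess over 10 digit classes:

```python
import numpy as np

rng = np.random.default_rng(0)
batch = rng.normal(size=(32, 784))            # 32 flattened 28x28 "images"
labels = rng.integers(0, 10, size=32)         # random stand-in labels

# two-layer MLP: 784 -> 128 (ReLU) -> 10 class scores
W1 = rng.normal(0, 0.01, (784, 128)); b1 = np.zeros(128)
W2 = rng.normal(0, 0.01, (128, 10));  b2 = np.zeros(10)

h = np.maximum(0.0, batch @ W1 + b1)          # ReLU hidden layer, (32, 128)
logits = h @ W2 + b2                          # class scores, (32, 10)

# numerically stable softmax cross-entropy
z = logits - logits.max(axis=1, keepdims=True)
log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
loss = -log_probs[np.arange(32), labels].mean()   # ~ln(10) before training
```

Training then consists of repeating this forward pass, backpropagating through both layers, and stepping with SGD, which is exactly what YOUR Modules 01-08 automate.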
MLPs don't just memorize; they learn useful internal representations. On digit images, the hidden layer discovers reusable features (such as stroke and edge patterns) that make the classes easy for the output layer to separate.

This is representation learning, the foundation of deep learning's power: instead of hand-engineering features, the network learns them directly from the data.
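One practical consequence: once trained, the hidden layer can be reused as a feature extractor by simply dropping the output layer. A minimal sketch (the function name and the stand-in parameters here are made up for illustration; a real run would load YOUR trained weights):

```python
import numpy as np

def hidden_features(X, W1, b1):
    """Reuse a trained ReLU hidden layer as a feature extractor."""
    return np.maximum(0.0, X @ W1 + b1)

# stand-in for trained parameters; substitute the real trained W1, b1
rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.1, (64, 32))
b1 = np.zeros(32)

X = rng.normal(size=(5, 64))          # 5 input vectors
feats = hidden_features(X, W1, b1)    # 5 learned-representation vectors
```

Any downstream classifier can then be trained on `feats` instead of raw pixels, which is the seed of the transfer-learning workflows built on deep networks today.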
```bash
cd milestones/03_1986_mlp

# Step 1: Quick validation on TinyDigits (run after Module 08)
python 01_rumelhart_tinydigits.py

# Step 2: Scale to MNIST benchmark (run after Module 08)
python 02_rumelhart_mnist.py
```
After completing this milestone, you'll understand how backpropagation trains hidden layers, why learned representations beat raw pixels, and what it takes to scale a model from a tiny dataset to the MNIST benchmark.
You've recreated the breakthrough that ended the AI Winter!
Note for Next Milestone: MLPs treat images as flat vectors, ignoring spatial structure. Milestone 04 (CNN) will show why convolutional layers dramatically improve image recognition!