# Organizational Insights

`tinytorch/paper/organizational_insights.md`
This document summarizes key organizational decisions and learnings from TinyTorch's development history that inform the paper's discussion of curriculum design and infrastructure.
## Python-First Development Workflow

**Evolution:** Initially developed with Jupyter notebooks as the primary format; evolved to Python source files (`.py`) as the source of truth.

**Key Decision:**
- `modules/NN_name/name_dev.py`: Python files in Jupytext percent format
- `.ipynb` files generated via `tito nbgrader generate` for student assignments
- `.ipynb` files excluded from version control during development

**Rationale:**
- Separating development sources (`.py`) from student-facing notebooks (`.ipynb`) enables a clean workflow

**Paper Relevance:** This workflow decision supports the "professional development practices" claim in Section 4 (Package Organization). The Python-first approach lets students experience real software engineering workflows while learning ML systems.
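A minimal sketch of what a Jupytext percent-format `*_dev.py` source might look like; the cell contents and function name here are illustrative stand-ins, not TinyTorch's actual code:

```python
# %% [markdown]
# # Example module (development source)
# This .py file is the source of truth; the student-facing .ipynb
# is generated from it, so notebooks never enter version control.

# %%
# A toy implementation cell (illustrative only).
def add(a, b):
    """Element-wise addition on plain Python lists."""
    return [x + y for x, y in zip(a, b)]

# %%
# Cells execute top to bottom, so results are visible immediately.
print(add([1, 2], [3, 4]))  # [4, 6]
```

Because the percent markers are ordinary comments, the same file runs as a plain script and round-trips to a notebook via Jupytext.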
## Inline Testing Plus Integration Tests

**Evolution:** Initially used separate test files in a `tests/` directory; evolved to inline testing within modules, complemented by integration tests.

**Key Decision:**
- Inline tests live in the `*_dev.py` files and execute immediately when the module runs
- A `tests/integration/` directory provides cross-module validation

**Rationale:**

**Evidence from History:**

**Paper Relevance:** This testing philosophy supports Section 4's discussion of "Integration Testing Beyond Unit Tests." The dual-testing approach (inline + integration) addresses the pedagogical challenge of validating both isolated correctness and system composition.
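The inline-testing pattern can be sketched as follows; `relu` and `test_relu` are hypothetical stand-ins for a module's real contents:

```python
# %%
def relu(x):
    """Rectified linear unit for a single scalar."""
    return x if x > 0 else 0

# %%
def test_relu():
    # Inline unit test: lives next to the implementation and runs
    # the moment the module executes, giving immediate feedback.
    assert relu(3) == 3
    assert relu(-2) == 0
    assert relu(0) == 0
    print("relu: all tests passed")

test_relu()
```

Running the module therefore doubles as running its unit tests, while system-level properties are deferred to the separate integration suite.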
## Standardized Module Structure

**Evolution:** Modules initially varied in structure; evolved to a standardized template with `08_optimizers` as the reference implementation.

**Key Decision:**
- `modules/08_optimizers/optimizers_dev.py` serves as the canonical example
- `module.yaml` files standardize module configuration

**Rationale:**

**Evidence from History:**
- `docs/development/module-rules.md` codifies the standards

**Paper Relevance:** This standardization supports Section 3's discussion of "Module Structure" and demonstrates how curriculum design principles (cognitive load management) translate into concrete implementation patterns.
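The kind of metadata a `module.yaml` might standardize can be sketched in Python; every field name below is an assumption for illustration, and the real schema lives in the repository:

```python
# Hypothetical module.yaml contents, shown as a Python dict.
module_config = {
    "name": "08_optimizers",   # directory name under modules/
    "title": "Optimizers",     # human-readable title
    "tier": "Foundation",      # curriculum tier (modules 01-08)
    "depends_on": [],          # prerequisite modules, if any
}

# A standardized schema makes every module's config checkable
# with one generic validation pass.
required_keys = {"name", "title", "tier"}
missing = required_keys - module_config.keys()
assert not missing, f"module.yaml missing keys: {missing}"
print("module.yaml sketch validates")
```

The point of the standardization is exactly this uniformity: tooling like `tito` can treat all twenty modules identically.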
## Progressive Package Organization

**Evolution:** The package structure evolved to mirror PyTorch's organization (`tinytorch.core`, `tinytorch.nn`, `tinytorch.optim`), enabling progressive imports.

**Key Decision:**
- Progressive imports: `from tinytorch.nn import Linear` works after Module 03
- Subpackages mirror PyTorch's (`core`, `nn`, `optim`, `data`, `profiling`) to support transfer learning
- `#| export` directives and `#| default_exp` targets enable automated package generation

**Rationale:**

**Evidence from History:**

**Paper Relevance:** This package organization directly supports Section 4's "Package Organization" subsection and the claim that "students build a working framework progressively, not isolated exercises."
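The export mechanism can be sketched with percent-format cells; the target name `core.tensor` and the toy `Tensor` class are illustrative assumptions:

```python
# %%
#| default_exp core.tensor
# `default_exp` names the package module this file exports to,
# e.g. tinytorch/core/tensor.py in the generated package.

# %%
#| export
class Tensor:
    """Toy stand-in: only cells marked #| export reach the package."""
    def __init__(self, data):
        self.data = data

# %%
# Cells without #| export (scratch work, inline tests) stay in the
# notebook and are excluded from the generated package.
t = Tensor([1.0, 2.0])
print(t.data)
```

Because the directives are plain comments, the file stays runnable as-is; the export step just collects the marked cells into the student-built `tinytorch` package.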
## Dedicated Integration Test Suite

**Evolution:** Recognized that unit tests alone were insufficient; added a dedicated integration test suite for cross-module validation.

**Key Decision:**
- `tests/integration/test_gradient_flow.py` validates that gradients flow through the entire training stack

**Rationale:**

**Evidence from History:**
- `tests/README.md` explains the purpose of the integration tests

**Paper Relevance:** This directly supports Section 3's "Integration Testing Beyond Unit Tests" and demonstrates how curriculum design addresses the pedagogical challenge of validating system composition.
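The idea behind a gradient-flow integration test can be sketched with a self-contained miniature. This is not TinyTorch's actual test: it hand-rolls a one-parameter model with an analytic gradient purely to show the cross-module property being checked (gradient reaches the parameter, and an optimizer step reduces the loss):

```python
def loss(w, x, y):
    """Squared error of a one-parameter linear model."""
    return (w * x - y) ** 2

def grad_loss(w, x, y):
    """Analytic gradient of the loss with respect to w."""
    return 2 * (w * x - y) * x

def test_gradient_flows_through_training_step():
    # Integration-style check spanning model, loss, and optimizer:
    # the gradient must be nonzero away from the optimum, and one
    # SGD step must actually reduce the loss.
    w, x, y, lr = 0.0, 1.0, 2.0, 0.1
    g = grad_loss(w, x, y)
    assert g != 0, "gradient vanished before reaching the parameter"
    w_new = w - lr * g
    assert loss(w_new, x, y) < loss(w, x, y), "step did not reduce loss"
    print("gradient flow test passed")

test_gradient_flows_through_training_step()
```

A unit test of any single component would pass even if the pieces composed incorrectly; only a test that threads one value through the whole stack catches interface mismatches.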
## Tiered Curriculum Organization

**Evolution:** Modules are organized into Foundation (01-08), Architecture (09-13), Optimization (14-19), and Olympics (20) tiers.

**Key Decision:**

**Rationale:**

**Evidence from History:**
- `docs/development/MODULE_ABOUT_TEMPLATE.md` includes tier metadata

**Paper Relevance:** This tier organization is central to Section 3's curriculum architecture discussion and supports the claim that "students build on solid foundations."
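The tier structure above can be captured as data; this encoding is an illustrative sketch, not the repository's actual configuration format:

```python
# Hypothetical encoding of the curriculum tiers by module number.
TIERS = {
    "Foundation":   range(1, 9),    # modules 01-08
    "Architecture": range(9, 14),   # modules 09-13
    "Optimization": range(14, 20),  # modules 14-19
    "Olympics":     range(20, 21),  # module 20 (capstone)
}

def tier_of(module_number):
    """Look up which tier a module number belongs to."""
    for tier, modules in TIERS.items():
        if module_number in modules:
            return tier
    raise ValueError(f"unknown module: {module_number}")

print(tier_of(8))   # Foundation
print(tier_of(15))  # Optimization
```

Encoding tiers as data like this is what makes flexible deployments (e.g. a Foundation-only offering) a configuration choice rather than a code change.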
## NBGrader + NBDev Integration

**Evolution:** Integrated NBGrader (assessment) with NBDev (package export) to create a unified development → assessment → package workflow.

**Key Decision:**
- `nbgrader` metadata enables automated grading
- `#| export` directives enable package generation from notebooks
- `tito nbgrader generate` creates student assignments; `tito module complete` exports to the package
- `### BEGIN SOLUTION` / `### END SOLUTION` blocks hide implementations from students

**Rationale:**

**Evidence from History:**
- `docs/development/module-rules.md` details the NBGrader integration
- The `tito` CLI integrates both tools seamlessly

**Paper Relevance:** This workflow supports Section 4's "Automated Assessment Infrastructure" discussion and demonstrates how curriculum design integrates assessment with learning.
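The solution-block mechanism can be sketched as follows; the `scale` function is a hypothetical example, and the generation step that strips the marked region is NBGrader's standard behavior:

```python
def scale(values, factor):
    """Multiply every element of `values` by `factor`."""
    ### BEGIN SOLUTION
    # Instructor's reference implementation: the generated student
    # notebook removes this region, typically leaving a
    # `raise NotImplementedError()` placeholder in its place.
    return [v * factor for v in values]
    ### END SOLUTION

# Autograded check: present in both instructor and student versions,
# so students get immediate feedback once they fill in the solution.
assert scale([1, 2, 3], 2) == [2, 4, 6]
print("scale passed")
```

One source file thus serves three audiences: the instructor (full solution), the student (scaffold plus visible tests), and the autograder (hidden reference behavior).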
## Key Learnings

**Iterative Design:** TinyTorch's organization evolved through practical use, not upfront design. This suggests curriculum design benefits from iterative refinement based on student feedback and implementation challenges.

**Pedagogical Principles Drive Technical Decisions:** Every organizational decision (Python-first sources, inline testing, package structure) serves a pedagogical goal (cognitive load management, immediate feedback, transfer learning).

**Professional Standards Enable Learning:** Using industry-standard tools (Git, NBGrader, NBDev) doesn't complicate learning; it prepares students for professional practice while maintaining educational focus.

**Integration Testing as a Pedagogical Tool:** Integration tests don't just catch bugs; they teach interface design and system thinking. This is a curriculum design insight: assessment infrastructure can itself be educational.

**Flexibility Through Structure:** A standardized module structure enables flexible deployment (tier configurations) while maintaining consistency. Structure enables, rather than constrains, pedagogical adaptation.
## Suggestions for the Paper

**Section 4 (Course Deployment) could include:**

**Section 3 (TinyTorch Architecture) could expand:**

**New Subsection:** "Curriculum Evolution Through Implementation," discussing how organizational decisions emerged from practical challenges rather than upfront design; this represents a design pattern for educational framework development.
## Open Questions

- **Should we add explicit discussion of organizational evolution?** The paper currently describes TinyTorch's current state but not how it evolved. Adding this could strengthen the "design patterns" contribution.
- **How much technical detail about workflow?** The Python-first workflow and NBGrader integration are mentioned but not detailed. Should we expand these discussions?
- **Integration testing as a pedagogical innovation?** The dual-testing approach (inline + integration) seems like a curriculum design contribution worth highlighting more explicitly.
- **Tier flexibility as a deployment pattern?** The three-tier architecture with flexible configurations represents a deployment pattern that could be emphasized more in Section 4.
- **Reference implementation pattern?** Using `08_optimizers` as the canonical example is a curriculum maintenance pattern that could be discussed.