1. The Familiar Training Step
2. Why Call loss.backward() At All
3. Why an Update Actually Helps
4. Values Remember Where They Came From
5. What One Operation Does During Backward
6. How loss.backward() Walks the Whole Graph
7. So We Have All the Gradients in the Graph. What's Next?...