Training and running a neural network on a 1982 BBC Micro — first in BBC BASIC, then in 6502 assembly. Fixed-point Q4.11 arithmetic, a sigmoid lookup table, and a shift-and-add multiplier all prove that backpropagation doesn't need an FPU, a GPU, or anything invented after 1985.