A minimal GPT implementation written from scratch in pure Go and trained on Jules Verne books. The project prioritizes educational clarity over performance, avoiding batching and external dependencies such as gonum. It includes comprehensive explanations ranging from basic neurons to self-attention mechanisms, with git tags marking the model's evolution stages. Training takes about 40 minutes on a MacBook Air M3, and the project serves as a practical companion to neural network learning courses.
