Educational GPT implementation in ~300 lines, by Andrej Karpathy. Reproduces GPT-2 (124M) on OpenWebText. Clean, hackable code for understanding the GPT architecture from scratch. Train a small character-level model on Shakespeare (CPU) or reproduce GPT-2 on OpenWebText (multi-GPU).
/plugin marketplace add zechenzhangAGI/AI-research-SKILLs
/plugin install nanogpt@zechenzhangAGI/AI-research-SKILLs
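A minimal sketch of the two training workflows mentioned above, assuming the standard nanoGPT repository layout (data prep scripts under `data/`, configs under `config/`); exact paths, config names, and flags may differ in your checkout:

```bash
# Small character-level model on Shakespeare (runs on CPU)
python data/shakespeare_char/prepare.py
python train.py config/train_shakespeare_char.py --device=cpu --compile=False

# GPT-2 (124M) reproduction on OpenWebText (multi-GPU via torchrun)
python data/openwebtext/prepare.py
torchrun --standalone --nproc_per_node=8 train.py config/train_gpt2.py
```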