GPT-2 Small Trained on 10 Billion Tokens | Generated by AI

2026.05

It’s GPT-2 small (124M) — not medium.
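The 124M figure can be reproduced from the standard GPT-2 small configuration (12 layers, d_model = 768, vocab = 50257, context = 1024, tied input/output embeddings). A back-of-the-envelope sketch; the config values are the usual published ones, not taken from this post:

```python
# Parameter count for GPT-2 small from its standard config (assumed values).
n_layer, d_model, vocab, ctx = 12, 768, 50257, 1024

wte = vocab * d_model  # token embedding (weight-tied with the output head)
wpe = ctx * d_model    # learned position embedding
ln = 2 * d_model       # layernorm: weight + bias

# One transformer block: qkv projection, attention output projection,
# 4x MLP up- and down-projections, two layernorms (all with biases).
attn = (d_model * 3 * d_model + 3 * d_model) + (d_model * d_model + d_model)
mlp = (d_model * 4 * d_model + 4 * d_model) + (4 * d_model * d_model + d_model)
block = ln + attn + ln + mlp

total = wte + wpe + n_layer * block + ln  # plus the final layernorm
print(total)  # → 124439808, i.e. ~124M
```

GPT-2 medium, by contrast, has 24 layers and d_model = 1024, which puts it around 350M parameters.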

For reference (you’re not running these):

