nanoGPT System


A nanoGPT System is a transformer-based neural language model (NLM) training system.

  • Context:
    • It can reproduce GPT-2 (124M) on OpenWebText, running on a single 8XA100 40GB node in about 4 days of training (see the training-loop sketch after this list).
    • ...
  • See: minGPT.
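
The following is a minimal, self-contained sketch of the kind of PyTorch training loop that nanoGPT's train.py implements: token and position embeddings, causal self-attention blocks, a cross-entropy loss, and an AdamW optimizer. The tiny model and the synthetic get_batch function below are illustrative placeholders, not nanoGPT's actual code, which instead samples batches from a memory-mapped OpenWebText token file.

    # Illustrative stand-in for nanoGPT's train.py/model.py; not the real code.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    vocab_size, block_size, n_embd = 256, 64, 128

    class TinyGPT(nn.Module):
        # A deliberately small stand-in for nanoGPT's ~300-line model.py.
        def __init__(self):
            super().__init__()
            self.tok_emb = nn.Embedding(vocab_size, n_embd)
            self.pos_emb = nn.Embedding(block_size, n_embd)
            layer = nn.TransformerEncoderLayer(d_model=n_embd, nhead=4,
                                               dim_feedforward=4 * n_embd,
                                               batch_first=True)
            self.blocks = nn.TransformerEncoder(layer, num_layers=2)
            self.head = nn.Linear(n_embd, vocab_size)

        def forward(self, idx):
            t = idx.size(1)
            pos = torch.arange(t, device=idx.device)
            x = self.tok_emb(idx) + self.pos_emb(pos)
            # Boolean causal mask: True marks positions a token may NOT attend to.
            causal = torch.triu(torch.ones(t, t, dtype=torch.bool,
                                           device=idx.device), diagonal=1)
            x = self.blocks(x, mask=causal)
            return self.head(x)

    def get_batch(batch_size=16):
        # Synthetic token stream; train.py instead reads OpenWebText tokens.
        data = torch.randint(vocab_size, (batch_size, block_size + 1))
        return data[:, :-1], data[:, 1:]

    model = TinyGPT()
    opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
    for step in range(100):
        xb, yb = get_batch()
        logits = model(xb)
        loss = F.cross_entropy(logits.view(-1, vocab_size), yb.reshape(-1))
        opt.zero_grad(set_to_none=True)
        loss.backward()
        opt.step()
        if step % 10 == 0:
            print(f"step {step}: loss {loss.item():.4f}")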


References

2023

  • https://github.com/karpathy/nanoGPT
    • QUOTE: The simplest, fastest repository for training/finetuning medium-sized GPTs. It is a rewrite of minGPT that prioritizes teeth over education. Still under active development, but currently the file train.py reproduces GPT-2 (124M) on OpenWebText, running on a single 8XA100 40GB node in about 4 days of training. The code itself is plain and readable: train.py is a ~300-line boilerplate training loop and model.py a ~300-line GPT model definition, which can optionally load the GPT-2 weights from OpenAI. That's it.
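
The README's note that model.py "can optionally load the GPT-2 weights from OpenAI" corresponds to the repository's GPT.from_pretrained classmethod. Below is a hedged usage sketch, run from a checkout of the repository; the exact from_pretrained and generate signatures are assumptions based on the code at the time of writing and may change, and the weight download relies on the HuggingFace transformers package being installed.

    # Sketch of loading OpenAI's GPT-2 (124M) weights via nanoGPT; the
    # from_pretrained and generate signatures are assumed from the repo
    # at the time of writing and may change.
    import torch
    from model import GPT  # nanoGPT's model.py, run from the repo root

    model = GPT.from_pretrained('gpt2')  # fetches and converts the 124M weights
    model.eval()

    # Sample a few tokens starting from an arbitrary single-token prompt.
    idx = torch.zeros((1, 1), dtype=torch.long)
    out = model.generate(idx, max_new_tokens=20, temperature=0.8, top_k=200)
    print(out)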