GPT-2 (nonfiction)
GPT-2 (Generative Pre-trained Transformer 2) is a language model that was trained on 40 GB of text (the WebText dataset) scraped from web pages that were linked from Reddit posts and had received a karma score of at least 3.
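The karma threshold served as a lightweight human quality filter on outbound links. As a minimal sketch of that filtering step (the (url, karma) data format and the filter_links helper are hypothetical; OpenAI's actual WebText pipeline has not been released):

```python
from typing import Iterable, List, Tuple

MIN_KARMA = 3  # threshold reported in the GPT-2 paper for WebText


def filter_links(submissions: Iterable[Tuple[str, int]]) -> List[str]:
    """Keep URLs from Reddit submissions whose karma meets the threshold.

    `submissions` is assumed to be (url, karma) pairs gathered elsewhere;
    the real pipeline also deduplicated pages and excluded Wikipedia.
    """
    return [url for url, karma in submissions if karma >= MIN_KARMA]


# Example: only the first link passes the karma filter.
print(filter_links([("https://example.com/a", 5), ("https://example.com/b", 1)]))
```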
The developers at OpenAI describe GPT-2 as "a large-scale unsupervised language model which generates coherent paragraphs of text, achieves state-of-the-art performance on many language modeling benchmarks, and performs rudimentary reading comprehension, machine translation, question answering, and summarization—all without task-specific training."
Because output is produced by sampling from the model's predicted probability distribution over tokens, the same prompt can yield a different completion on each run.
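A minimal sketch of this behavior, using the Hugging Face transformers library (an assumption; the article does not name a specific implementation), samples three completions of the same prompt:

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

inputs = tokenizer("The meaning of life is", return_tensors="pt")

# With do_sample=True, each call draws tokens from the predicted
# distribution, so the three completions will generally differ.
for _ in range(3):
    output = model.generate(
        **inputs,
        max_length=30,
        do_sample=True,
        top_k=50,
        pad_token_id=tokenizer.eos_token_id,
    )
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

With do_sample=False (greedy decoding), the model instead returns the same completion every time; the variation comes from the sampling strategy, not from the trained weights.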
In the News
Fiction cross-reference
Nonfiction cross-reference
External links