OpenAI GPT-2

 
An [[OpenAI GPT-2 System]] is a [[transformer-based language modeling system]] developed by [[OpenAI]].
* <B>See:</B> [[OpenAI GPT]].
 
----
 
 
 
== References ==
 
 
 
=== 2019 ===
 
* https://openai.com/blog/better-language-models/
 
** QUOTE: Our model, called [[GPT-2]] (a successor to [[GPT]]), was trained simply to [[predict the next word]] in 40GB of [[Internet text]]. Due to our concerns about malicious applications of the technology, we are not releasing the trained model. As an experiment in responsible disclosure, we are instead releasing a much smaller model for researchers to experiment with, as well as a technical paper. <P> [[GPT-2]] is a large [[transformer-based language model]] with 1.5 billion parameters, trained on a dataset of 8 million web pages (a dataset which emphasizes diversity of content, scraped from the Internet; in order to preserve document quality, we used only pages which have been curated/filtered by humans. Specifically, we used outbound links from Reddit which received at least 3 karma. This can be thought of as a heuristic indicator for whether other users found the link interesting (whether educational or funny), leading to higher data quality than other similar datasets, such as CommonCrawl). ... [[GPT-2]] is a direct scale-up of [[GPT]], with more than 10X the parameters and trained on more than 10X the amount of data.
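
The announcement above describes two concrete mechanisms: a [[language model]] trained to predict the next word, and a data-curation heuristic based on Reddit karma. Below is a minimal sketch of sampling a continuation from the publicly released small [[GPT-2]] checkpoint; it assumes the Hugging Face <code>transformers</code> library, which is not part of the original announcement (OpenAI's own release shipped separate TensorFlow code).

<syntaxhighlight lang="python">
# A minimal sketch of next-word prediction with the released small GPT-2
# checkpoint. Assumes the Hugging Face `transformers` library (an
# assumption; not the code OpenAI released).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # small released model
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "In a shocking finding, scientists discovered"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sample a continuation one predicted token at a time (top-k sampling).
output_ids = model.generate(
    input_ids,
    max_length=50,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
</syntaxhighlight>

The karma-based curation heuristic quoted above can likewise be sketched in a few lines; the data structures below are illustrative assumptions, not OpenAI's actual pipeline.

<syntaxhighlight lang="python">
# Hypothetical sketch of the curation heuristic described above: keep
# only outbound Reddit links whose submission received at least 3 karma.
MIN_KARMA = 3

def curate_links(reddit_posts):
    """Yield outbound URLs from posts that met the karma threshold."""
    for post in reddit_posts:
        if post["karma"] >= MIN_KARMA and post.get("outbound_url"):
            yield post["outbound_url"]

posts = [
    {"karma": 5, "outbound_url": "https://example.com/article"},
    {"karma": 1, "outbound_url": "https://example.com/low-quality"},
]
print(list(curate_links(posts)))  # only the >= 3 karma link is kept
</syntaxhighlight>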
 
 
 
----
 
__NOTOC__
 
[[Category:Concept]]
 
