Generative Pre-trained Transformer (OpenAI GPT) System

A [[Generative Pre-trained Transformer (OpenAI GPT) System]] is a left-to-right [[transformer]]-based neural [[Language Modeling]] system that ...
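Its defining constraint is the left-to-right (causal) attention pattern: when predicting token ''t'', the model may attend only to positions ≤ ''t'', giving the autoregressive factorization P(w_1, …, w_n) = ∏_t P(w_t | w_1, …, w_{t-1}). A minimal sketch of this masking, assuming [[PyTorch]] (the <code>TinyCausalLM</code> name and all hyperparameters are illustrative assumptions, not OpenAI's implementation):

<syntaxhighlight lang="python">
# A minimal sketch, assuming PyTorch; TinyCausalLM is a hypothetical name,
# not OpenAI's released model code.
import torch
import torch.nn as nn

class TinyCausalLM(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64, n_heads=4, n_layers=2, max_len=128):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):  # tokens: (batch, seq) of token ids
        seq_len = tokens.size(1)
        pos = torch.arange(seq_len, device=tokens.device)
        x = self.tok_emb(tokens) + self.pos_emb(pos)
        # Causal (left-to-right) mask: position t attends only to positions <= t.
        mask = nn.Transformer.generate_square_subsequent_mask(seq_len).to(tokens.device)
        h = self.blocks(x, mask=mask)
        return self.lm_head(h)  # next-token logits, (batch, seq, vocab)

model = TinyCausalLM()
logits = model(torch.randint(0, 1000, (2, 16)))  # -> torch.Size([2, 16, 1000])
</syntaxhighlight>

Training maximizes the likelihood of each next token under these logits; the pre-train/fine-tune recipe that builds on this is sketched under the 2018b reference below.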

== References ==

=== 2018b ===

* (OpenAI, 2018) ⇒ OpenAI. ([[2018]]). “[https://blog.openai.com/language-unsupervised/ Improving Language Understanding with Unsupervised Learning].” In: OpenAI Blog.
** QUOTE: We’ve obtained [[state-of-the-art]] [[result]]s on a suite of diverse [[language task]]s with a [[scalable]], [[task-agnostic system]], which we’re also releasing. [[Our approach]] is a combination of two existing ideas: [[transformer]]s and [[unsupervised pre-training]]. These results provide a convincing example that pairing [[supervised learning]] methods with [[unsupervised pre-training]] works very well; this is an idea that many have explored in the past, and we hope our result motivates further [[research]] into applying this idea on larger and more diverse [[dataset]]s. ...
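The two-stage recipe this quote describes, generative pre-training followed by supervised fine-tuning, can be made concrete with a short sketch. The code below is a hedged illustration assuming PyTorch: the names (<code>encode</code>, <code>lm_head</code>, <code>clf_head</code>) and the λ = 0.5 weighting are assumptions for the sketch, not OpenAI's released code; the auxiliary-loss form L3 = L2 + λ·L1 follows Section 3.2 of the paper cited under 2018d below.

<syntaxhighlight lang="python">
# Hedged sketch of pre-training + fine-tuning; all names are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab, d_model, n_classes = 1000, 64, 2
emb = nn.Embedding(vocab, d_model)
block = nn.TransformerEncoderLayer(d_model, 4, batch_first=True)
lm_head = nn.Linear(d_model, vocab)        # used in both stages
clf_head = nn.Linear(d_model, n_classes)   # added only for the supervised task

def encode(tokens):
    # Left-to-right constraint: a causal mask hides all future positions.
    mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
    return block(emb(tokens), src_mask=mask)

def lm_loss(tokens):
    # L1: predict token t+1 from tokens <= t (the unsupervised objective).
    logits = lm_head(encode(tokens[:, :-1]))
    return F.cross_entropy(logits.reshape(-1, vocab), tokens[:, 1:].reshape(-1))

tokens = torch.randint(0, vocab, (8, 32))   # stand-in batch of token ids
labels = torch.randint(0, n_classes, (8,))  # stand-in task labels

# Stage 1: unsupervised pre-training on unlabeled text.
pretrain_loss = lm_loss(tokens)

# Stage 2: supervised fine-tuning from the last position's hidden state,
# keeping the LM loss as an auxiliary term: L3 = L2 + lambda * L1.
task_loss = F.cross_entropy(clf_head(encode(tokens)[:, -1]), labels)
finetune_loss = task_loss + 0.5 * lm_loss(tokens)
</syntaxhighlight>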
  
=== 2018d ===
 
* ([[2018_ImprovingLanguageUnderstandingb|Radford et al., 2018]]) ⇒ [[Alec Radford]], [[Karthik Narasimhan]], [[Tim Salimans]], and [[Ilya Sutskever]]. ([[2018]]). “[https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf Improving Language Understanding by Generative Pre-training].”
  
 
__NOTOC__

[[Category:Concept]]
[[Category:Machine Learning]]
[[Category:Computational Linguistics]]
