Recursive Autoencoder
A Recursive Autoencoder is an autoencoder that is applied recursively over a tree structure (such as a binary parse tree), repeatedly encoding pairs of child vectors into a single parent vector so that a variable-length input, such as a sentence of word vectors, is compressed into one fixed-size representation.
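Below is a minimal sketch of this idea, assuming NumPy only; the class and function names (RecursiveAutoencoder, encode_tree) and the tree encoding (leaves as word indices, internal nodes as pairs) are illustrative assumptions, not taken from any particular implementation.

```python
# Minimal recursive autoencoder sketch: encode pairs of child vectors into a
# parent vector, decode the parent back to the children, and sum the
# reconstruction error over a binary tree. Names and shapes are illustrative.
import numpy as np

class RecursiveAutoencoder:
    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        # Encoder maps two child vectors (2*dim) to one parent vector (dim).
        self.W_enc = rng.normal(scale=0.1, size=(dim, 2 * dim))
        self.b_enc = np.zeros(dim)
        # Decoder reconstructs both children (2*dim) from the parent (dim).
        self.W_dec = rng.normal(scale=0.1, size=(2 * dim, dim))
        self.b_dec = np.zeros(2 * dim)

    def encode_pair(self, left, right):
        children = np.concatenate([left, right])
        parent = np.tanh(self.W_enc @ children + self.b_enc)
        recon = np.tanh(self.W_dec @ parent + self.b_dec)
        recon_error = np.sum((recon - children) ** 2)  # reconstruction loss
        return parent, recon_error

    def encode_tree(self, node, word_vectors):
        # node is either a word index (leaf) or a (left, right) tuple.
        if isinstance(node, int):
            return word_vectors[node], 0.0
        left_vec, left_err = self.encode_tree(node[0], word_vectors)
        right_vec, right_err = self.encode_tree(node[1], word_vectors)
        parent, err = self.encode_pair(left_vec, right_vec)
        return parent, left_err + right_err + err

# Usage: encode a 4-word sentence with the binary tree ((0, 1), (2, 3)).
dim = 8
word_vectors = np.random.default_rng(1).normal(size=(4, dim))
rae = RecursiveAutoencoder(dim)
sentence_vec, total_recon_error = rae.encode_tree(((0, 1), (2, 3)), word_vectors)
print(sentence_vec.shape, total_recon_error)
```

In a trained model the encoder and decoder weights would be learned by minimizing the summed reconstruction error over many trees; this sketch only shows the forward composition.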
References
2015
- (Goldberg, 2015) ⇒ Yoav Goldberg. (2015). “A Primer on Neural Network Models for Natural Language Processing.” In: Technical Report Journal, October 5, 2015.
- QUOTE: Other unsupervised approaches, including autoencoders and recursive autoencoders, also fall out of scope.
2011
- (Socher et al., 2011) ⇒ Richard Socher, Jeffrey Pennington, Eric H. Huang, Andrew Y. Ng, and Christopher D. Manning. (2011). “Semi-supervised Recursive Autoencoders for Predicting Sentiment Distributions.” In: Proceedings of the Conference on Empirical Methods in Natural Language Processing. ISBN:978-1-937284-11-4
- QUOTE: We introduce a novel machine learning framework based on recursive autoencoders for sentence-level prediction of sentiment label distributions. ...
... Figure 2: Illustration of an application of a recursive autoencoder to a binary tree. ...
... We introduce an approach based on semi-supervised, recursive autoencoders (RAE) which use as input continuous word vectors.
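The quoted work pairs the unsupervised reconstruction objective with a supervised sentiment objective at each node. The standalone snippet below is a simplified illustration of that combination for a single parent node, not Socher et al.'s exact formulation; all weights, names, and the weighting factor alpha are hypothetical.

```python
# Semi-supervised RAE idea in miniature: combine the reconstruction error of a
# parent node with a cross-entropy loss from a sentiment softmax over that
# parent vector. All parameters here are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
dim, n_labels = 8, 5

W_enc = rng.normal(scale=0.1, size=(dim, 2 * dim)); b_enc = np.zeros(dim)
W_dec = rng.normal(scale=0.1, size=(2 * dim, dim)); b_dec = np.zeros(2 * dim)
W_lab = rng.normal(scale=0.1, size=(n_labels, dim)); b_lab = np.zeros(n_labels)

left, right = rng.normal(size=dim), rng.normal(size=dim)   # child word vectors
target = np.array([0.1, 0.2, 0.4, 0.2, 0.1])               # sentiment label distribution

children = np.concatenate([left, right])
parent = np.tanh(W_enc @ children + b_enc)                  # encode pair -> parent

recon = np.tanh(W_dec @ parent + b_dec)                     # decode parent -> children
recon_error = np.sum((recon - children) ** 2)               # unsupervised term

logits = W_lab @ parent + b_lab
probs = np.exp(logits - logits.max()); probs /= probs.sum() # softmax over labels
cross_entropy = -np.sum(target * np.log(probs))             # supervised term

alpha = 0.2                                                  # hypothetical weighting
loss = alpha * recon_error + (1 - alpha) * cross_entropy
print(loss)
```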