2015 ImprovedSemanticRepresentations

Subject Headings: Long Short-Term Memory (LSTM) Network, Tree-LSTM, Tree-RNNs.

Notes

Cited By

2015

Quotes

Abstract

Because of their superior ability to preserve sequence information over time, Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have obtained strong results on a variety of sequence modeling tasks. The only underlying LSTM structure that has been explored so far is a linear chain. However, natural language exhibits syntactic properties that would naturally combine words to phrases. We introduce the Tree-LSTM, a generalization of LSTMs to tree-structured network topologies. Tree-LSTMs outperform all existing systems and strong LSTM baselines on two tasks: predicting the semantic relatedness of two sentences (SemEval 2014, Task 1) and sentiment classification (Stanford Sentiment Treebank).
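The abstract does not reproduce the update equations, but the paper's Child-Sum Tree-LSTM composes each tree node from the sum of its children's hidden states, with a separate forget gate per child. The NumPy sketch below illustrates that node update as a rough, non-authoritative reading of the paper; the class name, random initialization, and the encode/embed helpers are illustrative assumptions, not from the paper.

<pre>
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ChildSumTreeLSTMCell:
    """Child-Sum Tree-LSTM node update (after Tai et al., 2015).

    A node j first sums its children's hidden states into h_tilde,
    then applies LSTM-style gating. Unlike a chain LSTM, there is a
    separate forget gate f_jk for each child k, so the cell can keep
    or discard memory from individual subtrees.
    """

    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.hidden_dim = hidden_dim
        # One (W, U, b) triple per gate: input (i), forget (f),
        # output (o), and candidate update (u).
        self.params = {
            g: (rng.normal(0.0, 0.1, (hidden_dim, input_dim)),   # W
                rng.normal(0.0, 0.1, (hidden_dim, hidden_dim)),  # U
                np.zeros(hidden_dim))                            # b
            for g in "ifou"
        }

    def node_forward(self, x_j, child_states):
        """Return (c_j, h_j) for node j, given its input vector x_j and
        a list of (c_k, h_k) pairs from its children (empty at leaves)."""
        h_tilde = sum((h_k for _, h_k in child_states),
                      np.zeros(self.hidden_dim))

        def gate(name, act, h_in):
            W, U, b = self.params[name]
            return act(W @ x_j + U @ h_in + b)

        i_j = gate("i", sigmoid, h_tilde)   # input gate
        o_j = gate("o", sigmoid, h_tilde)   # output gate
        u_j = gate("u", np.tanh, h_tilde)   # candidate update

        # Per-child forget gates, each conditioned on that child's h_k.
        c_j = i_j * u_j
        for c_k, h_k in child_states:
            c_j = c_j + gate("f", sigmoid, h_k) * c_k

        h_j = o_j * np.tanh(c_j)
        return c_j, h_j

def encode(tree, cell, embed):
    """Post-order traversal: encode the children first, then the parent,
    where a tree is a (word, [subtrees]) pair and embed is any
    word-to-vector lookup (an assumption, not specified here)."""
    word, children = tree
    child_states = [encode(t, cell, embed) for t in children]
    return cell.node_forward(embed(word), child_states)
</pre>

The key departure from the linear-chain LSTM is the per-child forget gate, which lets a node retain memory selectively from individual subtrees rather than from a single predecessor.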

References

Kai Sheng Tai, Richard Socher, and Christopher D. Manning (2015). "Improved Semantic Representations from Tree-structured Long Short-term Memory Networks." In: Proceedings of ACL 2015.