2015 RecurrentConvolutionalNeuralNet


Subject Headings: Neural Text Classification.

Notes

Cited By

Quotes

Abstract

Text classification is a foundational task in many NLP applications. Traditional text classifiers often rely on many human-designed features, such as dictionaries, knowledge bases and special tree kernels. In contrast to traditional methods, we introduce a recurrent convolutional neural network for text classification without human-designed features. In our model, we apply a recurrent structure to capture contextual information as far as possible when learning word representations, which may introduce considerably less noise compared to traditional window-based neural networks. We also employ a max-pooling layer that automatically judges which words play key roles in text classification to capture the key components in texts. We conduct experiments on four commonly used datasets. The experimental results show that the proposed method outperforms the state-of-the-art methods on several datasets, particularly on document-level datasets.
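The architecture the abstract describes can be illustrated in code: a recurrent layer builds left and right context around each word, the contexts are concatenated with the word embedding, and an element-wise max over time pools the most salient features before classification. Below is a minimal PyTorch sketch of that idea; the hyperparameters (embedding size, context size, class count) are illustrative assumptions, and the single bidirectional RNN here stands in for the paper's separate left- and right-context recurrences (which exclude the current word), so this is a sketch of the technique rather than the authors' exact formulation.

```python
import torch
import torch.nn as nn

class RCNN(nn.Module):
    """Sketch of a recurrent convolutional text classifier:
    bidirectional recurrence for context, max-over-time pooling
    to pick out key words. Hyperparameters are assumptions."""

    def __init__(self, vocab_size, embed_dim=100, context_dim=100,
                 hidden_dim=100, num_classes=4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # The bidirectional RNN approximates the paper's left/right
        # context recurrences around each word.
        self.rnn = nn.RNN(embed_dim, context_dim,
                          bidirectional=True, batch_first=True)
        # Projects [left context; embedding; right context] to a
        # latent semantic vector per word.
        self.proj = nn.Linear(embed_dim + 2 * context_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        emb = self.embedding(token_ids)   # (batch, seq_len, embed_dim)
        ctx, _ = self.rnn(emb)            # (batch, seq_len, 2*context_dim)
        half = ctx.size(-1) // 2
        x = torch.cat([ctx[..., :half],   # forward pass ~ left context
                       emb,               # the word itself
                       ctx[..., half:]],  # backward pass ~ right context
                      dim=-1)
        y = torch.tanh(self.proj(x))      # (batch, seq_len, hidden_dim)
        # Max-pooling over time: keeps the strongest feature across
        # all positions, letting key words dominate the representation.
        pooled, _ = y.max(dim=1)          # (batch, hidden_dim)
        return self.out(pooled)           # class logits

if __name__ == "__main__":
    # Classify a batch of two 20-token sequences (random ids for demo).
    model = RCNN(vocab_size=10_000)
    logits = model(torch.randint(0, 10_000, (2, 20)))
    print(logits.shape)  # torch.Size([2, 4])
```

The max over the time dimension is the step the abstract refers to as automatically judging which words play key roles: whichever position produces the largest value for a feature is the one that survives pooling.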

References

(Lai et al., 2015) ⇒ Siwei Lai, Liheng Xu, Kang Liu, and Jun Zhao. (2015). "Recurrent Convolutional Neural Networks for Text Classification." In: Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence (AAAI-2015).