Neural Text Classification Algorithm

A [[Neural Text Classification Algorithm]] is a [[supervised text classification algorithm]] that is a [[neural classification algorithm]].
* <B>See:</B> [[Neural Network-based Text Classification Algorithm]], [[SVM-based Text Classification]].
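A minimal illustrative sketch of such an algorithm is given below: one [[supervised learning|supervised]] training step for a simple [[neural network]] text classifier. The use of PyTorch, the bag-of-embeddings architecture, and all hyperparameters are assumptions of this sketch, not part of the concept's definition.
<syntaxhighlight lang="python">
# Minimal sketch of a neural text classifier (illustrative only; PyTorch and
# all hyperparameters below are assumptions, not taken from the cited sources).
import torch
import torch.nn as nn

class BagOfEmbeddingsClassifier(nn.Module):
    """Averages learned word embeddings and applies a linear decision layer."""
    def __init__(self, vocab_size, embed_dim, num_classes):
        super().__init__()
        self.embedding = nn.EmbeddingBag(vocab_size, embed_dim, mode="mean")
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, token_ids, offsets):
        # token_ids: 1-D tensor of word indices for a batch of concatenated texts
        # offsets:   1-D tensor marking where each text starts in token_ids
        pooled = self.embedding(token_ids, offsets)   # (batch, embed_dim)
        return self.classifier(pooled)                # (batch, num_classes) logits

# One supervised training step on toy data (the labels are the supervision signal).
model = BagOfEmbeddingsClassifier(vocab_size=10000, embed_dim=64, num_classes=2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

token_ids = torch.tensor([1, 5, 9, 2, 7])   # two texts: [1, 5, 9] and [2, 7]
offsets = torch.tensor([0, 3])
labels = torch.tensor([0, 1])

optimizer.zero_grad()
logits = model(token_ids, offsets)
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()
</syntaxhighlight>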
----
----
== References ==
 
=== 2017 ===
* https://medium.com/paper-club/recurrent-convolutional-neural-networks-for-text-classification-107020765e52
 
=== 2016 ===
* https://medium.com/@surmenok/character-level-convolutional-networks-for-text-classification-d582c0c36ace
 
=== 2015a ===
* ([[2015_RecurrentConvolutionalNeuralNet|Lai et al., 2015]]) ⇒ [[Siwei Lai]], [[Liheng Xu]], [[Kang Liu]], and [[Jun Zhao]]. ([[2015]]). “[http://www.aaai.org/ocs/index.php/AAAI/AAAI15/paper/download/9745/9552 Recurrent Convolutional Neural Networks for Text Classification].” In: [[Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence]]. ISBN:0-262-51129-0
** QUOTE: ... In contrast to traditional [[text classification algorithm|method]]s, [[2015 RecurrentConvolutionalNeuralNet|we]] introduce a [[recurrent convolutional neural network]] for [[text classification]] without [[human-designed feature]]s. </s> In [[our model]], [[2015 RecurrentConvolutionalNeuralNet|we]] apply a [[recurrent structure]] to capture [[contextual information]] as far as possible when [[learning word representation]]s, which may introduce considerably less [[noise]] compared to traditional [[window-based neural network]]s. </s> [[2015 RecurrentConvolutionalNeuralNet|We]] also employ a [[max-pooling layer]] that [[automatically judge]]s which [[word]]s play key roles in [[text classification]] to capture the [[key component]]s in [[text]]s. </s> ...
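Below is a minimal illustrative sketch of this kind of recurrent-convolutional text classifier: a bidirectional recurrence supplies left and right context for each word, the contextual states are combined with the word embedding, and a [[max-pooling layer]] over time selects the most salient features. The use of PyTorch, a bidirectional LSTM in place of the paper's exact recurrent structure, and the layer sizes are assumptions of this sketch, not details from Lai et al. (2015).
<syntaxhighlight lang="python">
# Illustrative sketch in the spirit of a recurrent-convolutional text classifier.
import torch
import torch.nn as nn

class RecurrentConvTextClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional recurrence captures left and right context for each word.
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Combine the contextual states with the word embedding itself.
        self.projection = nn.Linear(embed_dim + 2 * hidden_dim, hidden_dim)
        self.output = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)            # (batch, seq_len, embed_dim)
        context, _ = self.rnn(embedded)                 # (batch, seq_len, 2*hidden_dim)
        combined = torch.cat([embedded, context], dim=-1)
        latent = torch.tanh(self.projection(combined))  # (batch, seq_len, hidden_dim)
        # Max-pooling over time picks out the most salient word-level features.
        pooled, _ = latent.max(dim=1)                   # (batch, hidden_dim)
        return self.output(pooled)                      # class logits

# Toy usage: a batch of two 20-token "documents" drawn from a 5000-word vocabulary.
logits = RecurrentConvTextClassifier(5000, 50, 100, 4)(torch.randint(0, 5000, (2, 20)))
</syntaxhighlight>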
 
=== 2015b ===
* ([[Zhang, Zhao and LeCun, 2015]]) ⇒ [[Xiang Zhang]], [[Junbo Zhao]], and [[Yann LeCun]]. ([[2015]]). “Character-level Convolutional Networks for Text Classification.” In: Advances in Neural Information Processing Systems, pp. 649-657.
** ABSTRACT: This article offers an empirical exploration on the use of [[character-level]] [[convolutional networks (ConvNets)]] for [[text classification]]. We constructed several large-scale datasets to show that [[character-level convolutional networks]] could achieve state-of-the-art or competitive results. Comparisons are offered against traditional models such as bag of words, n-grams and their TFIDF variants, and deep learning models such as word-based ConvNets and recurrent neural networks.
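A rough illustrative sketch of a [[character-level convolutional networks|character-level convolutional]] text classifier follows; the alphabet size, filter width, and the single convolution/pooling stage are assumptions of this sketch (the published architecture stacks several convolutional and fully connected layers).
<syntaxhighlight lang="python">
# Rough sketch of a character-level convolutional text classifier (illustrative only).
import torch
import torch.nn as nn

class CharConvTextClassifier(nn.Module):
    def __init__(self, alphabet_size=70, num_filters=256, kernel_size=7, num_classes=4):
        super().__init__()
        # Convolution slides over character positions; channels are the one-hot alphabet.
        self.conv = nn.Conv1d(alphabet_size, num_filters, kernel_size)
        self.pool = nn.AdaptiveMaxPool1d(1)             # global max over positions
        self.fc = nn.Linear(num_filters, num_classes)

    def forward(self, char_one_hot):
        # char_one_hot: (batch, alphabet_size, text_length), one-hot encoded characters
        features = torch.relu(self.conv(char_one_hot))  # (batch, num_filters, L')
        pooled = self.pool(features).squeeze(-1)        # (batch, num_filters)
        return self.fc(pooled)                          # class logits

# Toy usage: two dummy "texts" of 1014 characters each (all-zero one-hot frames).
x = torch.zeros(2, 70, 1014)
print(CharConvTextClassifier()(x).shape)                # torch.Size([2, 4])
</syntaxhighlight>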
 
=== 2014 ===
* ([[Johnson & Zhang, 2014]]) ⇒ [[Rie Johnson]], and [[Tong Zhang]]. ([[2014]]). “Effective Use of Word Order for Text Categorization with Convolutional Neural Networks.” arXiv preprint arXiv:1412.1058 </s>
** ABSTRACT: [[Convolutional neural network (CNN)]] is a neural network that can make use of the internal structure of data such as the 2D structure of image data. This paper studies CNN on text categorization to exploit the 1D structure (namely, word order) of text data for accurate prediction. Instead of using [[low-dimensional word vector]]s as input as is often done, we directly apply CNN to high-dimensional text data, which leads to directly learning embedding of small text regions for use in classification. In addition to a straightforward adaptation of CNN from image to text, a simple but new variation which employs bag-of-word conversion in the convolution layer is proposed. An extension to combine multiple convolution layers is also explored for higher accuracy. The experiments demonstrate the effectiveness of our approach in comparison with state-of-the-art methods.
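The following sketch illustrates the idea of applying a convolution directly to high-dimensional [[one-hot vector|one-hot]] word vectors, so that the convolution filters themselves act as learned embeddings of small text regions. The vocabulary size, region size, and filter count are illustrative assumptions, and the paper's bag-of-word convolution variant and multi-layer extension are not shown.
<syntaxhighlight lang="python">
# Sketch of a convolution applied directly to one-hot word vectors (illustrative only).
import torch
import torch.nn as nn

class RegionConvTextClassifier(nn.Module):
    def __init__(self, vocab_size=30000, num_filters=100, region_size=3, num_classes=2):
        super().__init__()
        # in_channels = vocab_size: the filter weights serve as region embeddings
        # learned directly from high-dimensional one-hot input.
        self.region_conv = nn.Conv1d(vocab_size, num_filters, region_size)
        self.pool = nn.AdaptiveMaxPool1d(1)
        self.fc = nn.Linear(num_filters, num_classes)

    def forward(self, one_hot_words):
        # one_hot_words: (batch, vocab_size, seq_len), one column per word position
        regions = torch.relu(self.region_conv(one_hot_words))  # region features
        pooled = self.pool(regions).squeeze(-1)                 # (batch, num_filters)
        return self.fc(pooled)                                  # class logits

# Toy usage: one all-zero "document" of 12 word positions.
x = torch.zeros(1, 30000, 12)
print(RegionConvTextClassifier()(x).shape)                      # torch.Size([1, 2])
</syntaxhighlight>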
 
----
__NOTOC__
[[Category:Concept]]
