2014 A Neural Network for Factoid Question Answering over Paragraphs


Subject Headings: QANTA, Supervised Question Answering.

Notes

Cited By

Quotes

Abstract

Text classification methods for tasks like factoid question answering typically use manually defined string matching rules or bag of words representations. These methods are ineffective when question text contains very few individual words (e.g., named entities) that are indicative of the answer. We introduce a recursive neural network (RNN) model that can reason over such input by modeling textual compositionality. We apply our model, QANTA, to a dataset of questions from a trivia competition called quiz bowl. Unlike previous RNN models, QANTA learns word and phrase-level representations that combine across sentences to reason about entities. The model outperforms multiple baselines and, when combined with information retrieval methods, rivals the best human players.
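To make the abstract's description of compositional reasoning concrete, the following is a minimal illustrative sketch, not the authors' released QANTA code: it assumes a toy vocabulary, two dependency relations, randomly initialized parameters, and a single hand-built parse tree, and it shows the general recursive-composition idea of combining a node's word vector with its children's hidden vectors through relation-specific matrices, then scoring the resulting representation against answer-entity embeddings.

```python
import numpy as np

# Illustrative sketch of a dependency-tree recursive composition (assumed
# dimensions, initialization, vocabulary, and tree; not the paper's code).

d = 50                                  # embedding / hidden dimension (assumed)
rng = np.random.default_rng(0)

vocab = {"tennyson": 0, "wrote": 1, "ulysses": 2}
relations = {"nsubj": 0, "dobj": 1}

word_emb = rng.normal(scale=0.1, size=(len(vocab), d))      # word vectors
W_rel = rng.normal(scale=0.1, size=(len(relations), d, d))  # one matrix per relation
W_v = rng.normal(scale=0.1, size=(d, d))                    # word projection
b = np.zeros(d)

def node_vector(word, children):
    """Compose a node: h = tanh(W_v x_word + b + sum over children of W_rel h_child)."""
    total = W_v @ word_emb[vocab[word]] + b
    for rel, child_vec in children:
        total += W_rel[relations[rel]] @ child_vec
    return np.tanh(total)

# Toy dependency tree for "Tennyson wrote Ulysses":
#   wrote --nsubj--> tennyson,  wrote --dobj--> ulysses
h_subj = node_vector("tennyson", [])
h_obj = node_vector("ulysses", [])
h_root = node_vector("wrote", [("nsubj", h_subj), ("dobj", h_obj)])

# Answer entities live in the same vector space; the predicted answer is the
# entity whose embedding scores highest against the composed representation.
answers = {"Alfred Tennyson": rng.normal(scale=0.1, size=d),
           "John Keats": rng.normal(scale=0.1, size=d)}
scores = {name: float(h_root @ vec) for name, vec in answers.items()}
print(max(scores, key=scores.get))
```

In the full model these parameters would be trained (the paper uses a contrastive max-margin objective over correct and incorrect answers), and sentence-level representations are combined across the sentences of a question; the sketch above only illustrates the single-tree composition step.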

References

Richard Socher, Mohit Iyyer, Leonardo Claudino, and Hal Daumé III (2014). "A Neural Network for Factoid Question Answering over Paragraphs."