2017 Learning to Generate One-sentence Biographies from Wikidata

From GM-RKB

Subject Headings: Sequence-to-Sequence Model

Notes

Cited By

Quotes

Abstract

We investigate the generation of one-sentence Wikipedia biographies from facts derived from Wikidata slot-value pairs. We train a recurrent neural network sequence-to-sequence model with attention to select facts and generate textual summaries. Our model incorporates a novel secondary objective that helps ensure it generates sentences that contain the input facts. The model achieves a BLEU score of 41, improving significantly upon the vanilla sequence-to-sequence model and scoring roughly twice that of a simple template baseline. Human preference evaluation suggests the model is nearly as good as the Wikipedia reference. Manual analysis explores content selection, suggesting the model can trade the ability to infer knowledge against the risk of hallucinating incorrect information.
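The abstract mentions a simple template baseline that the neural model roughly doubles in BLEU score. A minimal sketch of such a baseline (an illustration, not the authors' implementation; the slot names and template are assumptions) fills Wikidata-style slot-value pairs into a fixed one-sentence pattern, skipping any slots that are missing:

```python
# Hypothetical template baseline: render Wikidata-like slot-value pairs
# as a one-sentence biography. Slot names here are illustrative.
def template_biography(facts):
    """facts: dict mapping slot names to string values."""
    sentence = facts.get("name", "This person")
    if "date_of_birth" in facts:
        sentence += f" (born {facts['date_of_birth']})"
    if "occupation" in facts:
        sentence += f" is a {facts['occupation']}"
    if "country_of_citizenship" in facts:
        sentence += f" from {facts['country_of_citizenship']}"
    return sentence + "."

example = {
    "name": "Ada Lovelace",
    "date_of_birth": "1815",
    "occupation": "mathematician",
    "country_of_citizenship": "the United Kingdom",
}
print(template_biography(example))
# → Ada Lovelace (born 1815) is a mathematician from the United Kingdom.
```

Unlike the sequence-to-sequence model, a baseline like this cannot select among facts or vary phrasing, which is why fluent fact selection is the harder part of the task.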

References


(Chisholm et al., 2017) ⇒ Andrew Chisholm, Will Radford, and Ben Hachey. (2017). "Learning to Generate One-sentence Biographies from Wikidata."