2009 LearningSemanticCorrespondences

From GM-RKB

Subject Headings: Joint Inference, Semi-Supervised Information Extraction Algorithm, Weather.gov Dataset.

Notes

Cited By

Quotes

Abstract

A central problem in grounded language acquisition is learning the correspondences between a rich world state and a stream of text which references that world state. To deal with the high degree of ambiguity present in this setting, we present a generative model that simultaneously segments the text into utterances and maps each utterance to a meaning representation grounded in the world state. We show that our model generalizes across three domains of increasing difficulty---Robocup sportscasting, weather forecasts (a new domain), and NFL recaps.
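The segment-and-align idea from the abstract can be illustrated with a toy sketch (this is not the paper's generative model, which is probabilistic and trained with EM; here a crude word-overlap score stands in for the learned correspondence, and the records and text are invented examples):

```python
# Toy illustration of segmenting text and aligning each segment to a
# world-state record. A simple word-overlap score replaces the paper's
# learned generative model; records and tokens below are made up.

def score(segment_words, record):
    """Count word overlap between a candidate segment and a record's field values."""
    record_words = set()
    for value in record.values():
        record_words.update(str(value).lower().split())
    return len(set(segment_words) & record_words)

def align(tokens, records, max_len=4):
    """Greedy left-to-right segmentation: at each position, choose the
    (segment length, record) pair with the highest overlap score."""
    alignments, i = [], 0
    while i < len(tokens):
        best = (1, None, -1)  # (segment length, record index, score)
        for length in range(1, min(max_len, len(tokens) - i) + 1):
            seg = [t.lower() for t in tokens[i:i + length]]
            for r, rec in enumerate(records):
                s = score(seg, rec)
                if s > best[2]:
                    best = (length, r, s)
        length, r, _ = best
        alignments.append((" ".join(tokens[i:i + length]), r))
        i += length
    return alignments

# Hypothetical weather-style world state and forecast text.
records = [
    {"type": "temperature", "min": 20, "max": 30},
    {"type": "windSpeed", "mean": 5},
]
tokens = "high near 30 with wind around 5 mph".split()
print(align(tokens, records))
# → [('high near 30', 0), ('with wind around 5', 1), ('mph', 0)]
```

The ambiguity the paper addresses shows up even here: the trailing segment "mph" has no overlap with any record and is aligned arbitrarily, which is the kind of noise the full generative model handles with latent alignments.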

References

BibTex

@inproceedings{2009_LearningSemanticCorrespondences,
  author    = {Percy Liang and
               Michael I. Jordan and
               Dan Klein},
  editor    = {Keh-Yih Su and
               Jian Su and
               Janyce Wiebe},
  title     = {Learning Semantic Correspondences with Less Supervision},
  booktitle = {Proceedings of the 47th Annual Meeting of the Association
               for Computational Linguistics and the 4th International Joint Conference
               on Natural Language Processing of the AFNLP (ACL 2009), 2-7 August 2009, Singapore},
  pages     = {91--99},
  publisher = {Association for Computational Linguistics},
  year      = {2009},
  url       = {https://www.aclweb.org/anthology/P09-1011/},
}


Author(s): Percy Liang, Michael I. Jordan, Dan Klein
Title: Learning Semantic Correspondences with Less Supervision
Year: 2009