2000 A Maximum-Entropy-Inspired Parser

From GM-RKB

Subject Heading: Charniak Parser, Syntactic Parser.

Notes

Cited By

2003

Quotes

Abstract



References

  • 1 Adam L. Berger, Vincent J. Della Pietra, Stephen A. Della Pietra, A Maximum Entropy Approach to Natural Language Processing, Computational Linguistics, v.22 n.1, p.39-71, March 1996.
  • 2 Sharon A. Caraballo, Eugene Charniak, New Figures of Merit for Best-First Probabilistic Chart Parsing, Computational Linguistics, v.24 n.2, p.275-298, June 1998.
  • 3 Eugene Charniak, Tree-Bank Grammars. In: Proceedings of the Thirteenth National Conference on Artificial Intelligence, AAAI Press/MIT Press, Menlo Park, CA, 1996, p.1031-1036.
  • 4 Eugene Charniak, Expected-Frequency Interpolation, Brown University, Providence, RI, 1996.
  • 5 Eugene Charniak, Statistical Parsing with a Context-Free Grammar and Word Statistics. In: Proceedings of the Fourteenth National Conference on Artificial Intelligence, AAAI Press/MIT Press, Menlo Park, CA, 1997, p.598-603.
  • 6 Eugene Charniak, Statistical Techniques for Natural Language Parsing, AI Magazine, v.18 n.4, p.33-43, 1997.
  • 7 Eugene Charniak, S. Goldwater, M. Johnson, Edge-Based Best-First Chart Parsing. In: Proceedings of the Sixth Workshop on Very Large Corpora, 1998, p.127-133.
  • 8 Eugene Charniak, C. Hendrickson, N. Jacobson, M. Perkowitz, Equations for Part-of-Speech Tagging. In: Proceedings of the Eleventh National Conference on Artificial Intelligence, AAAI Press/MIT Press, Menlo Park, CA, 1993, p.784-789.
  • 9 Michael Collins, Mitchell P. Marcus, Head-Driven Statistical Models for Natural Language Parsing, 1999.
  • 10 Michael Collins, Three Generative, Lexicalised Models for Statistical Parsing, Proceedings of the 35th Annual Meeting of the Association for Computational Linguistics, p.16-23, July 1997, Madrid, Spain. doi:10.3115/979617.979620.
  • 11 J. N. Darroch, D. Ratcliff, Generalized Iterative Scaling for Log-Linear Models, Annals of Mathematical Statistics, v.43, p.1470-1480, 1972.
  • 12 J. M. Eisner, An Empirical Comparison of Probability Models for Dependency Grammar, Technical Report IRCS-96-11, Institute for Research in Cognitive Science, University of Pennsylvania, 1996.
  • 13 J. C. Henderson, E. Brill, Exploiting Diversity in Natural Language Processing: Combining Parsers. In: Proceedings of the 1999 Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora, ACL, New Brunswick, NJ, 1999, p.187-194.
  • 14 Mark Johnson, PCFG Models of Linguistic Tree Representations, Computational Linguistics, v.24 n.4, p.613-632, December 1998.
  • 15 David M. Magerman, Statistical Decision-Tree Models for Parsing, Proceedings of the 33rd Annual Meeting of the Association for Computational Linguistics, p.276-283, June 1995, Cambridge, Massachusetts. doi:10.3115/981658.981695.
  • 16 Mitchell P. Marcus, Mary Ann Marcinkiewicz, Beatrice Santorini, Building a Large Annotated Corpus of English: The Penn Treebank, Computational Linguistics, v.19 n.2, June 1993.
  • 17 Adwait Ratnaparkhi, Learning to Parse Natural Language with Maximum Entropy Models, Machine Learning, v.34 n.1-3, p.151-175, February 1999.


 Author: Eugene Charniak
 Title: A Maximum-Entropy-Inspired Parser
 Published in: Proceedings of the 1st North American Chapter of the Association for Computational Linguistics Conference
 URL: http://acl.ldc.upenn.edu/A/A00/A00-2018.pdf
 Year: 2000