2018 NeuralNaturalLanguageInferenceM


Subject Headings: Natural Language Inference Task; Neural Natural Language Inference Task.

Notes

* [[ACL Anthology]]: https://www.aclweb.org/anthology/P18-1224/
* [[ArXiv]]: https://arxiv.org/abs/1711.04289

Copyright

[[ACL]] materials are Copyright © 1963–2019 ACL; other materials are copyrighted by their respective copyright holders. Materials prior to 2016 here are licensed under the [https://creativecommons.org/licenses/by-nc-sa/3.0/ Creative Commons Attribution-NonCommercial-ShareAlike 3.0 International License]. Permission is granted to make copies for the purposes of teaching and research. Materials published in or after 2016 are licensed on a [https://creativecommons.org/licenses/by/4.0/ Creative Commons Attribution 4.0 International License].

Cited By


Quotes

Abstract

Modeling natural language inference is a very challenging task. With the availability of large annotated data, it has recently become feasible to train complex models such as neural-network-based inference models, which have shown to achieve the state-of-the-art performance. Although there exist relatively large annotated data, can machines learn all knowledge needed to perform natural language inference (NLI) from these data? If not, how can neural-network-based NLI models benefit from external knowledge and how to build NLI models to leverage it? In this paper, we enrich the state-of-the-art neural natural language inference models with external knowledge. We demonstrate that the proposed models improve neural NLI models to achieve the state-of-the-art performance on the SNLI and MultiNLI datasets.
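
The abstract does not spell out the enrichment mechanism; in the published model, word-pair relation features drawn from external lexical resources (e.g., WordNet synonymy, antonymy, and hypernymy indicators) are injected into the co-attention and inference components of a neural NLI model. The snippet below is a minimal, hypothetical NumPy sketch of only the attention-bias part of that idea; the function name, the relation-feature tensor <code>r</code>, and the weight <code>lam</code> are illustrative assumptions, not the authors' implementation.

<syntaxhighlight lang="python">
# Hypothetical illustration only: not the authors' code or hyperparameters.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def knowledge_enriched_attention(a, b, r, lam=1.0):
    """Soft-align premise vectors `a` (m x d) with hypothesis vectors `b` (n x d).

    `r` (m x n x k) holds external-knowledge relation features for each word
    pair (e.g., synonym / antonym / hypernym indicators from WordNet).
    Adding them to the co-attention score biases attention toward word pairs
    that the knowledge base says are related.
    """
    e = a @ b.T + lam * r.sum(axis=-1)        # (m, n) knowledge-enriched alignment scores
    a_tilde = softmax(e, axis=1) @ b          # each premise word re-expressed from the hypothesis
    b_tilde = softmax(e, axis=0).T @ a        # each hypothesis word re-expressed from the premise
    return a_tilde, b_tilde

# Toy usage: random word vectors plus one "synonym" feature linking
# premise word 0 and hypothesis word 1.
rng = np.random.default_rng(0)
a = rng.normal(size=(3, 8))                   # premise: 3 words, 8-dim vectors
b = rng.normal(size=(4, 8))                   # hypothesis: 4 words, 8-dim vectors
r = np.zeros((3, 4, 5))                       # 5 hypothetical relation features per word pair
r[0, 1, 0] = 1.0
a_tilde, b_tilde = knowledge_enriched_attention(a, b, r)
print(a_tilde.shape, b_tilde.shape)           # (3, 8) (4, 8)
</syntaxhighlight>

In the paper itself, the aligned representations (together with the same relation features) then pass through composition and pooling layers before a three-way entailment / contradiction / neutral classifier.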

References

* (Chen et al., 2018) ⇒ Qian Chen, Xiaodan Zhu, Zhen-Hua Ling, Diana Inkpen, and Si Wei. (2018). "Neural Natural Language Inference Models Enhanced with External Knowledge." In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL 2018). doi:10.18653/v1/P18-1224