2018 NeuralNaturalLanguageInferenceM
* ([[2018_NeuralNaturalLanguageInferenceM|Chen et al., 2018]]) ⇒ [[author::Qian Chen]], [[author::Xiaodan Zhu]], [[author::Zhen-Hua Ling]], [[author::Diana Inkpen]], and [[author::Si Wei]]. ([[year::2018]]). “[https://aclweb.org/anthology/P18-1224 Neural Natural Language Inference Models Enhanced with External Knowledge].” In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL 2018). [http://dx.doi.org/10.18653/v1/p18-1224 doi:10.18653/v1/p18-1224]  
  
 
<B>Subject Headings:</B> [[Natural Language Inference Task]]; [[Neural Natural Language Inference Task]]
 
==Notes==
==Cited By==
 
* http://scholar.google.com/scholar?q=%222018%22+Neural+Natural+Language+Inference+Models+Enhanced+with+External+Knowledge
 
==Quotes==
  
===Abstract===
 
[[Modeling natural language inference]] is a very challenging [[task]]. </s>
With the availability of large annotated data, it has recently become feasible to train complex models such as neural-network-based inference models, which have shown to achieve the state-of-the-art performance. </s>
Although there exist relatively large annotated data, can machines learn all knowledge needed to perform natural language inference (NLI) from these data? </s>
If not, how can neural-network-based NLI models benefit from external knowledge and how to build NLI models to leverage it? </s>
In this paper, we enrich the state-of-the-art neural natural language inference models with external knowledge. </s>
[[We]] demonstrate that [[the proposed model]]s improve [[neural NLI model]]s to achieve the [[state-of-the-art]] [[performance]] on the [[SNLI]] and [[MultiNLI dataset]]s. </s>
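
The following is a minimal, hypothetical PyTorch sketch of the general idea stated in the abstract: biasing the co-attention alignment scores of an ESIM-style NLI model with external, knowledge-based relation features computed for each premise-hypothesis token pair. The module name, the linear scoring layer, and the assumption that external knowledge arrives as per-pair indicator features (e.g., lexical relations looked up from a resource such as WordNet) are illustrative choices, not necessarily the paper's exact formulation.

<pre>
import torch
import torch.nn as nn
import torch.nn.functional as F


class KnowledgeEnrichedCoAttention(nn.Module):
    """Co-attention whose alignment scores are biased by external relation features."""

    def __init__(self, num_relations: int):
        super().__init__()
        # Maps the relation-feature vector of a token pair to a scalar bias
        # (hypothetical design; the paper's model may score knowledge differently).
        self.relation_score = nn.Linear(num_relations, 1)

    def forward(self, premise, hypothesis, rel):
        # premise:    (batch, len_p, hidden_dim)
        # hypothesis: (batch, len_h, hidden_dim)
        # rel:        (batch, len_p, len_h, num_relations) knowledge features per token pair
        content = torch.bmm(premise, hypothesis.transpose(1, 2))  # (batch, len_p, len_h)
        knowledge = self.relation_score(rel).squeeze(-1)          # (batch, len_p, len_h)
        scores = content + knowledge

        # Soft alignment in both directions, as in standard ESIM-style models.
        premise_aligned = torch.bmm(F.softmax(scores, dim=2), hypothesis)
        hypothesis_aligned = torch.bmm(F.softmax(scores, dim=1).transpose(1, 2), premise)
        return premise_aligned, hypothesis_aligned


if __name__ == "__main__":
    batch, len_p, len_h, hidden, n_rel = 2, 7, 5, 8, 4
    attention = KnowledgeEnrichedCoAttention(n_rel)
    a = torch.randn(batch, len_p, hidden)
    b = torch.randn(batch, len_h, hidden)
    r = torch.randint(0, 2, (batch, len_p, len_h, n_rel)).float()
    p_aligned, h_aligned = attention(a, b, r)
    print(p_aligned.shape, h_aligned.shape)  # torch.Size([2, 7, 8]) torch.Size([2, 5, 8])
</pre>

In this sketch the knowledge bias is simply added to the dot-product scores before the bidirectional softmax, so token pairs connected by a lexical relation receive more attention even when their contextual embeddings are dissimilar.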
  
==References==
 
{{#ifanon:|
 
