# 2019 DialogueNaturalLanguageInferenc

## Quotes

### Abstract

Consistency is a long-standing issue faced by dialogue models. In this paper, we frame the consistency of dialogue agents as natural language inference (NLI) and create a new natural language inference dataset called Dialogue NLI. We propose a method which demonstrates that a model trained on Dialogue NLI can be used to improve the consistency of a dialogue model, and evaluate the method with human evaluation and with automatic metrics on a suite of evaluation sets designed to measure a dialogue model's consistency.

### 2 Dialogue Consistency and Natural Language Inference

(...) Natural Language Inference. Natural Language Inference (NLI) assumes a dataset $\mathcal{D} = \{(s_1, s_2)_i, y_i\}^N_{i=1}$ which associates an input pair $(s_1, s_2)$ to one of three classes $y \in \{entailment,\; neutral,\; contradiction\}$. Each input item $s_j$ comes from an input space $\mathcal{S}$, which in typical NLI tasks is the space of natural language sentences, i.e. $s_j$ is a sequence of words $(w_1, \cdots, w_K)$ where each word $w_k$ is from a vocabulary $\mathcal{V}$.

The inputs $s_1$ and $s_2$ are referred to as the premise and the hypothesis, respectively, and each label is interpreted as meaning the premise entails the hypothesis, the premise is neutral with respect to the hypothesis, or the premise contradicts the hypothesis. The problem is to learn a function $f_{NLI}(s_1, s_2) \to \{E, N, C\}$ which generalizes to new input pairs.
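The dataset and function definitions above can be sketched in code. This is a minimal illustrative sketch, not the paper's model: the toy examples and the word-overlap heuristic standing in for a learned classifier are assumptions made here only to show the $(s_1, s_2) \to \{E, N, C\}$ input/output contract.

```python
from enum import Enum


class Label(Enum):
    """The three NLI classes y in {entailment, neutral, contradiction}."""
    ENTAILMENT = "E"
    NEUTRAL = "N"
    CONTRADICTION = "C"


# A toy dataset D = {((s1, s2)_i, y_i)}: each item pairs a premise s1 and a
# hypothesis s2 with one of the three labels. These examples are invented
# for illustration, not drawn from Dialogue NLI.
dataset = [
    (("i have two cats.", "i have pets."), Label.ENTAILMENT),
    (("i have two cats.", "i like to run."), Label.NEUTRAL),
    (("i have two cats.", "i have no pets."), Label.CONTRADICTION),
]


def tokens(sentence: str) -> set:
    """Lowercase word tokens with trailing punctuation stripped."""
    return {w.strip(".,!?") for w in sentence.lower().split()}


def f_nli(premise: str, hypothesis: str) -> Label:
    """Toy stand-in for the learned function f_NLI(s1, s2) -> {E, N, C}.

    A real f_NLI would be a model trained on D; this keyword-overlap
    heuristic merely demonstrates the classification interface.
    """
    overlap = tokens(premise) & tokens(hypothesis)
    if {"no", "not", "never"} & tokens(hypothesis) and overlap:
        return Label.CONTRADICTION
    if len(overlap) >= 2:
        return Label.ENTAILMENT
    return Label.NEUTRAL
```

In this sketch the heuristic happens to label all three toy pairs correctly, but the point of the NLI formulation is that a trained $f_{NLI}$ must generalize to new $(s_1, s_2)$ pairs, which a fixed heuristic cannot.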

(...)

## References

* Sean Welleck, Jason Weston, Arthur Szlam, and Kyunghyun Cho (2019). "Dialogue Natural Language Inference".