On the [[Stanford Natural Language Inference (SNLI) dataset]], [[2016_ADecomposableAttentionModelforN|we]] obtain [[state-of-the-art results]] with almost an [[order of magnitude]] [[fewer parameter]]s than [[previous work]] and without relying on any [[word-order information]]. </s>
Adding [[intra-sentence attention]] that takes a [[minimum amount]] of order into account yields further improvements. </s>
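The section outline below names three steps: Attend, Compare, and Aggregate. As a rough illustration only (not the paper's implementation), the following NumPy sketch shows that three-step shape flow on toy premise/hypothesis embeddings; the feed-forward networks F, G, and H from the paper are replaced here by identity, concatenation, and a plain sum for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy word embeddings: premise a (la x d), hypothesis b (lb x d).
rng = np.random.default_rng(0)
a = rng.normal(size=(3, 4))
b = rng.normal(size=(5, 4))

# Attend: alignment scores e_ij between all token pairs
# (the paper scores F(a_i) . F(b_j); F is omitted here).
e = a @ b.T                       # (la, lb)
beta = softmax(e, axis=1) @ b     # soft subphrase of b aligned to each a_i
alpha = softmax(e, axis=0).T @ a  # soft subphrase of a aligned to each b_j

# Compare: pair each token with its aligned subphrase
# (a feed-forward net G in the paper; concatenation here).
v1 = np.concatenate([a, beta], axis=1)   # (la, 2d)
v2 = np.concatenate([b, alpha], axis=1)  # (lb, 2d)

# Aggregate: order-free sum over tokens, then a classifier H
# (omitted) would map the fixed-size vector to a label.
v = np.concatenate([v1.sum(axis=0), v2.sum(axis=0)])  # (4d,)
print(v.shape)
```

Because the aggregation is a sum over tokens, the basic model ignores word order entirely, which is why the abstract notes that adding intra-sentence attention, which reintroduces a minimal amount of order, improves results.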
=== 1 Introduction ===
=== 2 Related Work ===
=== 3 Approach ===
==== 3.1 Attend ====
==== 3.2 Compare ====
==== 3.3 Aggregate ====
==== 3.4 Intra-Sentence Attention (Optional) ====
=== 4 Computational Complexity ===
=== 5 Experiments ===
==== 5.1 Implementation Details ====
==== 5.2 Results ====
=== 6 Conclusion ===
=== Acknowledgements ===