Sequence-to-Sequence (seq2seq) Neural Network

From GM-RKB

A Sequence-to-Sequence (seq2seq) Neural Network is an encoder-decoder neural network that maps an input sequence to an output sequence of possibly different length: an encoder reads the input tokens into a learned representation, and a decoder generates the output tokens one at a time, conditioned on that representation (and, in attention-based variants, on a timestep-specific weighting of the encoder states).
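As an illustrative sketch only (not from any specific reference on this page), the encoder-decoder pattern can be shown with a minimal, untrained vanilla-RNN seq2seq in NumPy; all names, sizes, and the greedy-decoding loop here are assumptions chosen for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

H, V = 8, 5     # hidden size and toy vocabulary size (arbitrary)
E = np.eye(V)   # one-hot "embeddings" for the toy vocabulary

# encoder and decoder parameters (randomly initialized; this model is untrained)
Wx_e, Wh_e, b_e = rng.normal(size=(V, H)), rng.normal(size=(H, H)), np.zeros(H)
Wx_d, Wh_d, b_d = rng.normal(size=(V, H)), rng.normal(size=(H, H)), np.zeros(H)
Wo = rng.normal(size=(H, V))  # projection from hidden state to vocabulary logits

def rnn_step(x, h, Wx, Wh, b):
    # one vanilla RNN step: new hidden state from current input and previous state
    return np.tanh(x @ Wx + h @ Wh + b)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def encode(src_ids):
    # fold the whole source sequence into a single fixed-size context vector
    h = np.zeros(H)
    for t in src_ids:
        h = rnn_step(E[t], h, Wx_e, Wh_e, b_e)
    return h

def decode(h, bos=0, max_len=4):
    # greedy decoding: emit one token per step, feeding each back as input
    out, tok = [], bos
    for _ in range(max_len):
        h = rnn_step(E[tok], h, Wx_d, Wh_d, b_d)
        tok = int(np.argmax(softmax(h @ Wo)))
        out.append(tok)
    return out

context = encode([1, 2, 3])
print(decode(context))  # a length-4 list of token ids in [0, V)
```

A trained model would replace the random matrices with learned weights, use real embeddings, and stop on an end-of-sequence token rather than a fixed length, but the encode-then-decode control flow is the defining structure.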



References

2017

Figure 3: Pointer-generator model. For each decoder timestep a generation probability $p_{gen} \in [0,1]$ is calculated, which weights the probability of generating words from the vocabulary, versus copying words from the source text. The vocabulary distribution and the attention distribution are weighted and summed to obtain the final distribution, from which we make our prediction. Note that out-of-vocabulary article words such as 2-0 are included in the final distribution. Best viewed in color.
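The weighted sum the caption describes can be made concrete with a small numeric sketch; the tokens, probabilities, and $p_{gen}$ value below are invented for illustration, not taken from the cited work:

```python
import numpy as np

# toy pointer-generator step: mix the vocabulary distribution with the
# attention distribution, weighted by p_gen and (1 - p_gen) respectively
vocab = ["the", "scored", "win", "<unk>"]
p_vocab = np.array([0.5, 0.3, 0.15, 0.05])  # generator's vocabulary distribution
src = ["germany", "scored", "2-0"]          # source tokens ("2-0" is out-of-vocabulary)
attn = np.array([0.2, 0.3, 0.5])            # attention distribution over source positions
p_gen = 0.7                                 # generation probability for this timestep

# extended vocabulary = in-vocabulary words + OOV source words such as "2-0"
ext = vocab + [w for w in src if w not in vocab]
final = np.zeros(len(ext))
final[:len(vocab)] = p_gen * p_vocab        # generation component
for w, a in zip(src, attn):                 # copy component, routed via attention
    final[ext.index(w)] += (1 - p_gen) * a

print(dict(zip(ext, final.round(3))))
```

Note how "scored" receives mass from both paths (it is in the vocabulary and attended to in the source), while the OOV token "2-0" can only be produced through the copy component, which is exactly what lets the model output it at all.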
