2017 LatentSequenceDecompositions


Subject Headings: Latent Sequence Decompositions (LSD) System.

Notes

Cited By

Quotes

Abstract

We present the Latent Sequence Decompositions (LSD) framework. LSD learns to decompose sequences into variable-length output units as a function of both the input sequence and the output sequence. We present a training algorithm that samples valid extensions of partial decompositions, and an approximate decoding algorithm. We experiment with the Wall Street Journal speech recognition task: our LSD model achieves 12.9% WER compared to a character baseline of 14.8% WER, and when combined with a convolutional network on the encoder, it achieves 9.6% WER.
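
The "samples valid extensions" step in the training algorithm can be made concrete with a small sketch. Below is a minimal, hypothetical Python illustration (the helper names valid_extensions, sample_decomposition, and token_probs are ours, not from the paper, and a fixed probability table stands in for the seq2seq model's conditional token distribution): at each step, sampling is restricted to vocabulary tokens that are consistent with the remaining target characters, which is the constraint LSD enforces when sampling a decomposition of the target sequence during training.

import numpy as np

def valid_extensions(target, pos, vocab):
    # Tokens in `vocab` that match `target` starting at position `pos`.
    return [t for t in vocab if target.startswith(t, pos)]

def sample_decomposition(target, vocab, token_probs, rng):
    # Sample one decomposition of `target` into vocabulary tokens.
    # At each step, restrict the distribution to valid extensions of the
    # prefix consumed so far, renormalize, and sample one token.
    pos, tokens = 0, []
    while pos < len(target):
        candidates = valid_extensions(target, pos, vocab)
        probs = np.array([token_probs[t] for t in candidates], dtype=float)
        probs /= probs.sum()  # renormalize over the valid subset
        choice = candidates[rng.choice(len(candidates), p=probs)]
        tokens.append(choice)
        pos += len(choice)
    return tokens

# Toy usage: decompose "cat" over a vocabulary of 1- and 2-character units.
# Keeping all single characters in the vocabulary guarantees that a valid
# extension always exists, so the loop terminates.
rng = np.random.default_rng(0)
vocab = ["c", "a", "t", "ca", "at"]
token_probs = {t: 1.0 for t in vocab}  # uniform stand-in for model scores
print(sample_decomposition("cat", vocab, token_probs, rng))

Because every sampled token sequence concatenates back to the target exactly, the sampled decompositions remain valid training targets when optimizing the likelihood of the output sequence marginalized over decompositions.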

References

BibTeX

@inproceedings{2017_LatentSequenceDecompositions,
  author    = {William Chan and
               Yu Zhang and
               Quoc V. Le and
               Navdeep Jaitly},
  title     = {Latent Sequence Decompositions},
  booktitle = {Conference Track Proceedings of the 5th International Conference on Learning Representations
              (ICLR 2017)},
  publisher = {OpenReview.net},
  year      = {2017},
  url       = {https://openreview.net/forum?id=SyQq185lg},
}


 Author: Yu Zhang, Quoc V. Le, Navdeep Jaitly, William Chan
 Title: Latent Sequence Decompositions
 Year: 2017