2011 Extensions of Recurrent Neural Network Language Model


Subject Headings: Recurrent Neural Network Language Model (RNN LM), Language Model, Back-Propagation Through Time (BPTT) Algorithm.

Notes

Cited By

Quotes

Abstract

We present several modifications of the original recurrent neural network language model (RNN LM). While this model has been shown to significantly outperform many competitive language modeling techniques in terms of accuracy, its remaining problem is computational complexity. In this work, we show approaches that lead to a more than 15-fold speedup in both the training and testing phases. Next, we show the importance of using the back-propagation through time (BPTT) algorithm. An empirical comparison with feedforward networks is also provided. Finally, we discuss possibilities for reducing the number of parameters in the model. The resulting RNN model can thus be smaller, faster in both training and testing, and more accurate than the basic one.
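As a concrete illustration of the back-propagation through time training the abstract refers to, below is a minimal NumPy sketch of an Elman-style RNN language model trained with truncated BPTT. This is not the authors' implementation; the vocabulary size, hidden size, truncation length, and toy token stream are all illustrative assumptions.

import numpy as np

# Minimal Elman RNN language model with truncated BPTT (illustrative sketch;
# all sizes and the toy data are assumptions, not the paper's setup).
rng = np.random.default_rng(0)
V, H, TAU, LR = 50, 16, 4, 0.1           # vocab size, hidden size, BPTT steps, learning rate

U  = rng.normal(0, 0.1, (H, V))          # input (one-hot word) -> hidden
W  = rng.normal(0, 0.1, (H, H))          # previous hidden -> hidden (the recurrence)
Vo = rng.normal(0, 0.1, (V, H))          # hidden -> output logits

def softmax(z):
    z = z - z.max()                      # stabilize before exponentiating
    e = np.exp(z)
    return e / e.sum()

def bptt_step(words, targets, h_prev):
    """Forward over TAU words, then back-propagate through all TAU time steps."""
    hs, ps = [h_prev], []
    for w in words:                      # forward pass, storing hidden states
        h = np.tanh(U[:, w] + W @ hs[-1])
        hs.append(h)
        ps.append(softmax(Vo @ h))
    dU, dW, dVo = np.zeros_like(U), np.zeros_like(W), np.zeros_like(Vo)
    dh_next = np.zeros(H)
    loss = 0.0
    for t in reversed(range(len(words))):        # backward pass through time
        loss -= np.log(ps[t][targets[t]])
        do = ps[t].copy(); do[targets[t]] -= 1.0 # dL/dlogits for cross-entropy
        dVo += np.outer(do, hs[t + 1])
        dh = Vo.T @ do + dh_next                 # gradient from output and from later steps
        dz = (1.0 - hs[t + 1] ** 2) * dh         # tanh derivative
        dU[:, words[t]] += dz                    # one-hot input: only one column updated
        dW += np.outer(dz, hs[t])
        dh_next = W.T @ dz                       # carry gradient to the earlier step
    for P, G in ((U, dU), (W, dW), (Vo, dVo)):   # plain SGD update
        P -= LR * G
    return loss, hs[-1]

# Toy usage: train on a random token stream, carrying the hidden state across windows.
stream = rng.integers(0, V, 2000)
h = np.zeros(H)
for i in range(0, len(stream) - TAU - 1, TAU):
    window = stream[i : i + TAU + 1]
    loss, h = bptt_step(window[:-1], window[1:], h)

The greater-than-15-fold speedup mentioned in the abstract comes, in the paper, largely from factorizing the output layer with simple frequency-based word classes: the model first predicts a class and then a word within that class, so the per-word output cost drops from O(H·V) to roughly O(H·(C + V/C)), which is minimized near C ≈ √V. The sketch above normalizes over the full vocabulary and omits this factorization for clarity.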

References

BibTeX

@inproceedings{2011_ExtensionsofRecurrentNeuralNetw,
  author    = {Tomas Mikolov and
               Stefan Kombrink and
               Lukas Burget and
               Jan Cernock{\'{y}} and
               Sanjeev Khudanpur},
  title     = {Extensions of Recurrent Neural Network Language Model},
  booktitle = {Proceedings of the IEEE International Conference on Acoustics, Speech,
               and Signal Processing (ICASSP 2011)},
  pages     = {5528--5531},
  publisher = {{IEEE}},
  year      = {2011},
  url       = {https://doi.org/10.1109/ICASSP.2011.5947611},
  doi       = {10.1109/ICASSP.2011.5947611},
}


Author: Stefan Kombrink, Jan Černocký, Tomáš Mikolov, Sanjeev Khudanpur, Lukáš Burget
Title: Extensions of Recurrent Neural Network Language Model
Year: 2011