1997 BidirectionalRecurrentNeuralNet

From GM-RKB

Subject Headings: Bidirectional LSTM.

Notes


Cited By

Quotes

Abstract

In the first part of this paper, a regular recurrent neural network (RNN) is extended to a bidirectional recurrent neural network (BRNN). The BRNN can be trained without the limitation of using input information just up to a preset future frame. This is accomplished by training it simultaneously in the positive and negative time directions. The structure and training procedure of the proposed network are explained. In regression and classification experiments on artificial data, the proposed structure gives better results than other approaches. For real data, classification experiments for phonemes from the TIMIT database show the same tendency. In the second part of this paper, it is shown how the proposed bidirectional structure can be easily modified to allow efficient estimation of the conditional posterior probability of complete symbol sequences without making any explicit assumption about the shape of the distribution. For this part, experiments on real data are reported.
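The core idea of the abstract, running one recurrent pass in the positive time direction and a second pass in the negative time direction so that each output frame sees both past and future context, can be sketched in a few lines of plain Python. This is an illustrative toy, not the paper's exact parameterization: the scalar weights `w_in`, `w_rec`, and the additive output combination are hypothetical stand-ins for the weight matrices a real BRNN would use.

```python
import math

def rnn_pass(xs, w_in=0.5, w_rec=0.3):
    """One unidirectional pass: h_t = tanh(w_in * x_t + w_rec * h_{t-1})."""
    h, states = 0.0, []
    for x in xs:
        h = math.tanh(w_in * x + w_rec * h)
        states.append(h)
    return states

def brnn_forward(xs, w_out=1.0):
    """Combine a forward-time and a backward-time pass at every frame."""
    fwd = rnn_pass(xs)                         # positive time direction
    bwd = rnn_pass(xs[::-1])[::-1]             # negative time direction, realigned
    # Each output frame depends on the whole input sequence,
    # not just inputs up to a preset future frame.
    return [w_out * (f + b) for f, b in zip(fwd, bwd)]

ys = brnn_forward([1.0, -1.0, 0.5])
print(len(ys))  # one output per input frame
```

Because the backward pass is reversed before combining, the output at frame t mixes the forward state (summarizing inputs up to t) with the backward state (summarizing inputs from t onward), which is exactly the property that removes the preset-future-frame limitation the abstract describes.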

References


(Schuster & Paliwal, 1997) ⇒ Mike Schuster, and Kuldip K. Paliwal. (1997). "Bidirectional Recurrent Neural Networks." In: IEEE Transactions on Signal Processing, 45(11). doi:10.1109/78.650093