2018 Neural Speed Reading via Skim-RNN


Subject Headings: Skim-Reading Task, Skim-RNN.

Notes

Cited By

Quotes

Abstract

Inspired by the principles of speed reading, we introduce Skim-RNN, a recurrent neural network (RNN) that dynamically decides to update only a small fraction of the hidden state for relatively unimportant input tokens. Skim-RNN gives a computational advantage over an RNN that always updates the entire hidden state. Skim-RNN uses the same input and output interfaces as a standard RNN and can be easily used instead of RNNs in existing models. In our experiments, we show that Skim-RNN can achieve significantly reduced computational cost without losing accuracy compared to standard RNNs across five different natural language tasks. In addition, we demonstrate that the trade-off between accuracy and speed of Skim-RNN can be dynamically controlled during inference time in a stable manner. Our analysis also shows that Skim-RNN running on a single CPU offers lower latency compared to standard RNNs on GPUs.

Introduction

...

Inspired by the principles of human speed reading, we introduce Skim-RNN (Figure 1), which makes a fast decision on the significance of each input token (to the downstream task) and ‘skims’ through unimportant tokens by using a smaller RNN to update only a fraction of the hidden state. When the decision is to ‘fully read’, Skim-RNN updates the entire hidden state with the default RNN cell.

Figure 1: The schematic of Skim-RNN on a sample sentence from the Stanford Sentiment Treebank: “intelligent and invigorating film”. At time step 1, Skim-RNN makes the decision to read or skim $x_1$ by applying Equation 1 to $h_0$ and $x_1$. Since ‘intelligent’ is an important word for sentiment, it decides to read (blue diamond), obtaining a full-size hidden state with the big RNN and updating the entire previous hidden state. At time step 2, Skim-RNN decides to skim (empty diamond) the word ‘and’, updating only the first few dimensions of the hidden state with the small RNN.
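The read/skim mechanism in Figure 1 can be summarized in a few lines of code. The following is a minimal NumPy sketch of a single Skim-RNN step at inference time; the names (SkimRNNCell, rnn_step, d_small) and the use of plain tanh RNN cells are illustrative assumptions rather than the authors' released implementation (the paper uses LSTM cells, and during training it relaxes the discrete read/skim decision with Gumbel-softmax so the model stays end-to-end differentiable, which this sketch omits).

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_step(x, h, W_x, W_h, b):
    # One step of a plain tanh RNN cell (an illustrative stand-in for LSTM).
    return np.tanh(x @ W_x + h @ W_h + b)

class SkimRNNCell:
    """Hypothetical single-layer Skim-RNN step (inference only).

    d is the full hidden size used by the 'big' RNN; d_small << d is the
    number of dimensions the 'small' RNN updates when the model skims.
    """

    def __init__(self, d_in, d, d_small):
        self.d, self.d_small = d, d_small
        # Big RNN: reads x_t and the full h_{t-1}, rewrites all d dimensions.
        self.Wx_big = rng.normal(0.0, 0.1, (d_in, d))
        self.Wh_big = rng.normal(0.0, 0.1, (d, d))
        self.b_big = np.zeros(d)
        # Small RNN: reads x_t and the first d_small dimensions of h_{t-1},
        # rewrites only those d_small dimensions.
        self.Wx_small = rng.normal(0.0, 0.1, (d_in, d_small))
        self.Wh_small = rng.normal(0.0, 0.1, (d_small, d_small))
        self.b_small = np.zeros(d_small)
        # Skim decision (the role of Equation 1): two logits (read vs. skim)
        # from a linear map of [x_t; h_{t-1}], with a hard argmax at inference.
        self.W_dec = rng.normal(0.0, 0.1, (d_in + d, 2))
        self.b_dec = np.zeros(2)

    def step(self, x, h):
        logits = np.concatenate([x, h]) @ self.W_dec + self.b_dec
        read = bool(logits.argmax() == 0)
        if read:
            # 'Read': the big RNN replaces the entire hidden state.
            h_new = rnn_step(x, h, self.Wx_big, self.Wh_big, self.b_big)
        else:
            # 'Skim': update the first d_small dimensions with the small RNN
            # and copy the remaining d - d_small dimensions from h_{t-1}.
            h_new = h.copy()
            h_new[:self.d_small] = rnn_step(
                x, h[:self.d_small], self.Wx_small, self.Wh_small, self.b_small)
        return h_new, read

# Toy usage: run 4 random "token embeddings" through the cell.
cell = SkimRNNCell(d_in=50, d=100, d_small=10)
h = np.zeros(100)
for x in rng.normal(size=(4, 50)):
    h, read = cell.step(x, h)
    print("read" if read else "skim")
```

In this sketch a skim step costs roughly O(d_small · (d_in + d_small)) multiplications versus O(d · (d_in + d)) for a full read, which is the source of the computational savings claimed in the abstract.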
...

References

Minjoon Seo, Sewon Min, Ali Farhadi, and Hannaneh Hajishirzi (2018). “Neural Speed Reading via Skim-RNN”. In: Proceedings of the Sixth International Conference on Learning Representations (ICLR 2018).