2019 ReservoirComputingUsingDiffusiveMemristors


Subject Headings: Reservoir Computing.

Notes

Cited By

Quotes

Abstract

Reservoir computing (RC) is a framework that can extract features from a temporal input into a higher‐dimension feature space. The reservoir is followed by a readout layer that can analyze the extracted features to accomplish tasks such as inference and classification. RC systems inherently exhibit an advantage, since the training is only performed at the readout layer, and therefore they are able to compute complicated temporal data with a low training cost. Herein, a physical reservoir computing system using diffusive memristor‐based reservoir and drift memristor‐based readout layer is experimentally implemented. The rich nonlinear dynamic behavior exhibited by a diffusive memristor due to Ag migration and the robust in situ training of drift memristor arrays makes the combined system ideal for temporal pattern classification. It is then demonstrated experimentally that the RC system can successfully identify handwritten digits from the Modified National Institute of Standards and Technology (MNIST) dataset, achieving an accuracy of 83%.

Research

(...)

Reservoir computing (RC) was initially an RNN‐based framework and is hence suitable for temporal/sequential information processing [4]. The RNN models of echo state networks (ESNs) [5] and liquid state machines (LSMs) [6] were proposed independently; these models led to the development of the unified computational framework of RC [7]. Backpropagation‐decorrelation (BPDC) learning [8] is also viewed as a predecessor of RC.

The input data is transformed into spatio-temporal patterns in a high‐dimensional space by an RNN in the reservoir (Figure 1a). The generated spatio-temporal patterns are then analyzed for a matching pattern in the readout. The input weights and the weights of the recurrent connections within the reservoir are fixed; the only weights that need to be trained are those in the readout layer. This can be done with a simple algorithm such as linear regression. This offers an inherent advantage, since such simple and fast training reduces the computational cost of learning compared with standard RNNs [9]. RC models have been used for various computational problems such as temporal pattern classification, prediction, and generation. However, to maximize the effectiveness of a given RC system, it is necessary to represent the sample data appropriately and to optimize the design of the RNN‐based reservoir [7].
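As a concrete illustration of this training scheme, the following is a minimal echo-state-style sketch in Python (an illustration only, not the authors' memristor implementation): the input and recurrent weight matrices are drawn at random and kept fixed, and only the linear readout is fitted, here with ridge regression on a toy delayed-copy task. The reservoir size, leak rate, regularization, and delay are assumed values chosen for the example.

 import numpy as np

 # Minimal echo-state-style reservoir (illustrative sketch only, not the
 # authors' memristor hardware). Input and recurrent weights are random and
 # fixed; only the linear readout is trained, here with ridge regression.
 rng = np.random.default_rng(0)
 N_IN, N_RES = 1, 100                              # sizes are arbitrary choices
 W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))      # fixed input weights
 W_res = rng.normal(0.0, 1.0, (N_RES, N_RES))      # fixed recurrent weights
 W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))  # spectral radius < 1

 def run_reservoir(inputs, leak=0.3):
     """Drive the fixed reservoir with a 1D input sequence; collect states."""
     x, states = np.zeros(N_RES), []
     for u in inputs:
         x = (1 - leak) * x + leak * np.tanh(W_in @ np.array([u]) + W_res @ x)
         states.append(x.copy())
     return np.array(states)

 def train_readout(states, targets, reg=1e-6):
     """Ridge regression for the readout -- the only trained weights."""
     Y = np.asarray(targets).reshape(len(targets), -1)
     A = states.T @ states + reg * np.eye(states.shape[1])
     return np.linalg.solve(A, states.T @ Y)

 # Toy usage: learn to output the input signal delayed by 3 steps.
 u = rng.uniform(-1, 1, 500)
 states = run_reservoir(u)
 target = np.roll(u, 3)
 W_out = train_readout(states[50:], target[50:])   # drop initial transient
 prediction = states @ W_out                       # linear readout at test time

The point of the sketch is the division of labor described above: the reservoir itself is never trained, so learning reduces to a single linear solve over the collected reservoir states.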

Figure 1 Reservoir computing system based on a diffusive memristor: a) Schematic of an RC system, showing the reservoir with internal dynamics and a readout function. Only the weight matrix connecting the reservoir state to the output needs to be trained. b) Equivalent schematic of a simplified system in which the reservoir is populated with nodes whose recurrent connections have a magnitude less than 1. c) The conductivity of the diffusive memristor is influenced by the periodic voltage stimulation applied to its top electrode (+) while the bottom electrode (−) is grounded. In the top panel, a voltage stimulus is applied in two consecutive time slots, which results in a continuous Ag filament and high conductivity when the device state is read in the fourth time slot. In the middle panel, a voltage stimulus is applied in three consecutive time slots, resulting in a much thicker filament and even higher conductivity when the device state is read in the fourth time slot. In the bottom panel, a voltage stimulus is applied only in the first time slot; by the time the device state is read in the fourth time slot, the filament has already broken down, resulting in very low conductivity, because the volatile device has had enough time to relax back to its initial high-resistance state. The diffusive memristor is therefore analogous to a node with a recurrent connection whose weight is less than 1: its state decays in every time frame unless a sufficiently large input is provided to counteract the effect of the feedback.
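The leaky-node analogy in the last sentence of the caption can be reproduced with a toy numerical sketch (the decay and gain constants below are assumed values, not measurements from the devices): the node state is scaled by a factor smaller than 1 at every time slot and incremented whenever a pulse arrives, so the value read in the fourth slot depends on how many pulses were applied and how recently.

 # Toy model of the volatile, leaky-node behavior sketched in Figure 1c.
 # The constants are illustrative assumptions, not device measurements.
 DECAY = 0.4   # recurrent "weight" < 1: fraction of the state kept per slot
 GAIN = 1.0    # contribution of one voltage pulse to the state

 def node_state(pulses, decay=DECAY, gain=GAIN):
     """Return the node state after each time slot for a binary pulse train."""
     x, trace = 0.0, []
     for p in pulses:
         x = decay * x + gain * p   # decay first, then integrate the pulse
         trace.append(x)
     return trace

 # The three pulse patterns of Figure 1c, read out in the fourth time slot.
 for pattern in ([1, 1, 0, 0],    # two consecutive pulses   -> moderate state
                 [1, 1, 1, 0],    # three consecutive pulses -> highest state
                 [1, 0, 0, 0]):   # single early pulse -> state mostly decayed
     print(pattern, "-> state at slot 4: %.3f" % node_state(pattern)[3])

With these assumed constants the sketch gives the same ordering as the three panels: about 0.624 for three pulses, 0.224 for two, and 0.064 for a single early pulse, i.e., only sufficiently recent or repeated stimulation survives to the read-out slot.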

References

Rivu Midya, Zhongrui Wang, Shiva Asapu, Xumeng Zhang, Mingyi Rao, Wenhao Song, Ye Zhuo, Navnidhi Upadhyay, Qiangfei Xia, and J. Joshua Yang (2019). "Reservoir Computing Using Diffusive Memristors." In: Advanced Intelligent Systems Journal. doi:10.1002/aisy.201900084