2012 Architectural Designs of Echo State Network

From GM-RKB

Subject Headings: Echo State Network; Reservoir Computing.

Notes

Cited By

Quotes

Abstract

Reservoir computing (RC) refers to a new class of state-space models with a fixed state transition structure (the "reservoir") and an adaptable readout from the state space. The reservoir is supposed to be sufficiently complex to capture a large number of features of the input stream that can be exploited by the reservoir-to-output readout mapping. The field of RC has been growing rapidly, with many successful applications. However, RC has been criticised for not being principled enough: reservoir construction is largely driven by a series of randomised model-building stages, with both researchers and practitioners having to rely on trial and error. Echo State Networks (ESNs), Liquid State Machines (LSMs) and the back-propagation decorrelation neural network (BPDC) are examples of popular RC methods. In this thesis we concentrate on Echo State Networks, one of the simplest yet most effective forms of reservoir computing.

An Echo State Network (ESN) is a recurrent neural network with a non-trainable sparse recurrent part (the reservoir) and an adaptable (usually linear) readout from the reservoir. Typically, the reservoir connection weights, as well as the input weights, are randomly generated. ESNs have been successfully applied to time-series prediction, speech recognition, noise modelling, dynamic pattern classification, reinforcement learning, and language modelling, where, according to the authors, they performed exceptionally well.
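The ESN structure described above (fixed random sparse reservoir, random input weights, trained linear readout) can be sketched as follows. This is a minimal illustrative implementation, not the thesis's code: all hyperparameter values (reservoir size, sparsity, spectral radius, ridge coefficient, washout length) and the toy sine-prediction task are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_esn(n_in, n_res, density=0.1, spectral_radius=0.9, input_scale=0.5):
    """Random sparse reservoir W and random input weights W_in (both fixed)."""
    W = rng.uniform(-1, 1, (n_res, n_res))
    W *= rng.random((n_res, n_res)) < density              # sparsify
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # rescale spectral radius
    W_in = rng.uniform(-input_scale, input_scale, (n_res, n_in))
    return W, W_in

def run_reservoir(W, W_in, inputs):
    """Drive the reservoir with the input sequence; collect the states."""
    x = np.zeros(W.shape[0])
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ u)
        states.append(x.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    """Linear readout via ridge regression (the only trained part of an ESN)."""
    S = states
    return np.linalg.solve(S.T @ S + ridge * np.eye(S.shape[1]), S.T @ targets)

# Toy usage: one-step-ahead prediction of a sine wave (illustrative task).
t = np.linspace(0, 20, 400)
u = np.sin(t)[:, None]
W, W_in = make_esn(1, 100)
X = run_reservoir(W, W_in, u[:-1])
w_out = train_readout(X[50:], u[1:][50:])   # discard a short washout period
pred = X[50:] @ w_out
```

Only `train_readout` involves learning; `W` and `W_in` stay fixed after random generation, which is what distinguishes reservoir computing from fully trained recurrent networks.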

In this thesis, we propose simplified topologies of the original ESN architecture and experimentally show that a Simple Cycle Reservoir (SCR) achieves performance comparable to the 'standard' ESN on a variety of data sets of different origin and memory structure; hence, most tasks modelled by ESNs can be handled with very simple model structures. We also prove that the memory capacity of a linear SCR can be made arbitrarily close to the proven optimal value (for any recurrent neural network of the ESN form).
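The SCR topology is deterministic: the reservoir units form a single ring sharing one connection weight, and all input weights share one absolute value with a fixed sign pattern. A sketch of such a construction, where the parameter values and the simple alternating sign pattern are illustrative assumptions (the thesis uses a deterministic sign-generation scheme, which may differ):

```python
import numpy as np

def make_scr(n_in, n_res, r=0.8, v=0.5):
    """Simple Cycle Reservoir: one ring weight r, one input weight magnitude v."""
    W = np.zeros((n_res, n_res))
    for i in range(n_res):
        W[(i + 1) % n_res, i] = r            # unit i feeds unit i+1 around the ring
    # Deterministic sign pattern for the input weights (illustrative choice).
    signs = np.where(np.arange(n_res * n_in) % 2 == 0, 1.0, -1.0)
    W_in = v * signs.reshape(n_res, n_in)
    return W, W_in
```

Because the ring matrix is `r` times a cyclic permutation, all its eigenvalues have modulus `r`, so the spectral radius is controlled exactly by the single parameter `r` with no rescaling step.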

Furthermore, we propose to extend the Simple Cycle Reservoir (SCR) with a regular structure of shortcuts (jumps), yielding the Cycle Reservoir with Jumps (CRJ). In the spirit of the SCR, we keep the reservoir construction simple and deterministic. We show that such a simple architecture can significantly outperform both the SCR and the standard randomised ESN. Prompted by these results, we investigate some well-known reservoir characterisations, such as the eigenvalue distribution of the reservoir matrix, the pseudo-Lyapunov exponent of the input-driven reservoir dynamics, and the memory capacity, and their relation to ESN performance.
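A CRJ reservoir adds regularly spaced bidirectional jumps on top of the SCR ring: the cycle shares one weight and the jumps share another. A sketch under illustrative assumptions (weight values, jump length, and the choice of a reservoir size divisible by the jump length are not from the thesis):

```python
import numpy as np

def make_crj(n_res, r_c=0.7, r_j=0.3, jump=4):
    """Cycle Reservoir with Jumps: ring weight r_c, jump weight r_j.

    Assumes n_res is divisible by `jump` so the jumps tile the ring evenly.
    """
    W = np.zeros((n_res, n_res))
    for i in range(n_res):
        W[(i + 1) % n_res, i] = r_c          # the underlying SCR ring
    for i in range(0, n_res, jump):
        j = (i + jump) % n_res
        W[j, i] = r_j                        # bidirectional shortcut
        W[i, j] = r_j
    return W
```

The jumps shorten the path between distant ring units, which is one intuitive reason the CRJ can outperform the plain SCR while remaining fully deterministic.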

Moreover, we also design and utilise an ensemble of ESNs with diverse reservoirs whose collective readout is obtained through Negative Correlation Learning (NCL) of an ensemble of Multi-Layer Perceptrons (MLPs), where each individual MLP realises the readout from a single ESN. Experimental results on three data sets confirm that, compared with both a single ESN and flat ensembles of ESNs, NCL-based ESN ensembles achieve better generalisation performance. In the final part of the thesis, we investigate the relation between two quantitative measures suggested in the literature to characterise short-term memory in input-driven dynamical systems, namely the short-term memory capacity spectrum and the Fisher memory curve.

References


Ali Rodan (2012). "Architectural Designs of Echo State Network."