Echo State Network

An Echo State Network (ESN) is a Reservoir Computing Neural Network with a fixed, sparsely and randomly connected hidden layer (the reservoir), in which only the output (readout) weights are trained.

References

2018a

  • (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/Echo_state_network Retrieved:2018-3-4.
    • The echo state network (ESN) [1] [2] is a recurrent neural network with a sparsely connected hidden layer (typically around 1% connectivity). The connectivity and weights of the hidden neurons are fixed and randomly assigned; only the weights of the output neurons are learned, so that the network can (re)produce specific temporal patterns. The main appeal of this network is that, although its behaviour is non-linear, the only weights modified during training are those of the synapses connecting the hidden neurons to the output neurons. The error function is therefore quadratic in the parameter vector, and setting its gradient to zero reduces training to solving a linear system. Alternatively, one may consider a nonparametric Bayesian formulation of the output layer, under which: (i) a prior distribution is imposed over the output weights; and (ii) the output weights are marginalized out when generating predictions, given the training data. This idea was demonstrated in [3] using Gaussian priors, yielding a Gaussian process model with an ESN-driven kernel function. Such a solution was shown to outperform ESNs with trainable (finite) sets of weights on several benchmarks.

      Some publicly available implementations of ESNs are: (i) aureservoir, an efficient C++ library for various kinds of echo state networks with Python/NumPy bindings; and (ii) an efficient Matlab implementation of an echo state network. A minimal training sketch is given after the references below.

  1. Herbert Jaeger, and Harald Haas (2004). "Harnessing Nonlinearity: Predicting Chaotic Systems and Saving Energy in Wireless Communication." Science, 304(5667):78-80, 2 April 2004.
  2. Herbert Jaeger (2007). "Echo State Network." Scholarpedia.
  3. Sotirios P. Chatzis, and Yiannis Demiris (2011). "Echo State Gaussian Process." IEEE Transactions on Neural Networks, 22(9):1435-1445, September 2011.
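
The following is a minimal NumPy sketch of the training scheme described above: a fixed, sparse random reservoir is driven by an input sequence, and only the readout weights are fitted by ridge regression, a single linear solve reflecting the quadratic error. All specifics here (reservoir size, spectral radius 0.9, the sine-wave toy task, washout length, and ridge parameter) are illustrative assumptions, not values taken from the sources above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions and hyperparameters (illustrative choices, not from the sources)
n_in, n_res = 1, 200
sparsity = 0.01          # ~1% connectivity, as in the quoted description
spectral_radius = 0.9    # common heuristic for the echo state property

# Fixed, randomly assigned weights: never modified during training
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= rng.random((n_res, n_res)) < sparsity             # enforce sparse connectivity
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # rescale spectral radius

def run_reservoir(u_seq):
    """Drive the reservoir with an input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task (hypothetical): one-step-ahead prediction of a sine wave
t = np.arange(0, 60, 0.1)
u_seq, y_seq = np.sin(t[:-1]), np.sin(t[1:])

X = run_reservoir(u_seq)   # (T, n_res) matrix of reservoir states
washout = 100              # discard the initial transient
X, Y = X[washout:], y_seq[washout:]

# Only the readout is trained: the squared error is quadratic in W_out,
# so ridge regression solves it in closed form as a linear system.
ridge = 1e-8
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)

pred = X @ W_out
print("train RMSE:", np.sqrt(np.mean((pred - Y) ** 2)))
```

Because only W_out is learned, this linear solve is the entire training step; the Gaussian-process variant of [3] instead places a prior over these readout weights and marginalizes them out at prediction time.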
