# 2013 2013SpecialIssueLearningthePseu

## Revision as of 13:38, 13 July 2019

- (Tapson & Van Schaik, 2013) ⇒ J. Tapson, and A. Van Schaik. (2013). “2013 Special Issue: Learning the Pseudoinverse Solution to Network Weights.” In: Neural Networks Journal, 45. doi:10.1016/j.neunet.2013.02.008

**Subject Headings:** Pseudo-Inverse Algorithm; OPIUM Algorithm.

## Notes

## Cited By

- Google Scholar: 71 citations
- ACM DL: 9 citations
- ScienceDirect Elsevier: 46 citations
- Semantic Scholar: 52 citations

## Quotes

### Author Keywords

### Abstract

The last decade has seen the parallel emergence in computational neuroscience and machine learning of neural network structures which spread the input signal randomly to a higher dimensional space; perform a nonlinear activation; and then solve for a regression or classification output by means of a mathematical pseudoinverse operation. In the field of neuromorphic engineering, these methods are increasingly popular for synthesizing biologically plausible neural networks, but the "learning method" (computation of the pseudoinverse by singular value decomposition) is problematic both for biological plausibility and because it is not an online or an adaptive method. We present an online or incremental method of computing the pseudoinverse precisely, which we argue is biologically plausible as a learning method, and which can be made adaptable for non-stationary data streams. The method is significantly more memory-efficient than the conventional computation of pseudoinverses by singular value decomposition.
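The pipeline described in the abstract (random projection, nonlinear activation, pseudoinverse readout) can be sketched as follows. This is an illustrative toy, not code from the paper: the dimensions, the `tanh` nonlinearity, and the regression target are assumptions made for the example. `np.linalg.pinv` computes the pseudoinverse via singular value decomposition, i.e. the batch "learning method" the paper contrasts with its online approach.

```python
import numpy as np

# Hypothetical dimensions for illustration (not from the paper).
rng = np.random.default_rng(0)
n_in, n_hidden, n_samples = 4, 50, 200

# Random projection to a higher-dimensional space, then a nonlinearity.
W_in = rng.standard_normal((n_hidden, n_in))
X = rng.standard_normal((n_samples, n_in))    # input data
Y = np.sin(X.sum(axis=1, keepdims=True))      # toy regression target

A = np.tanh(X @ W_in.T)                       # hidden-layer activations

# Solve for the readout weights with the pseudoinverse; np.linalg.pinv
# computes it via singular value decomposition.
W_out = np.linalg.pinv(A) @ Y

# Relative training error of the least-squares fit.
err = np.linalg.norm(A @ W_out - Y) / np.linalg.norm(Y)
```

Note that this batch solve must hold the full activation matrix `A` (data-set size times hidden-layer size) in memory, which is the scaling the paper's online method avoids.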

### 1. Introduction

The solution to the network weights is exactly the same as that which would be calculated by singular value decomposition. It converges with a single forward iteration per input data sample, and as such is ideal for real-time online computation of the pseudoinverse solution. It requires significantly less memory than the SVD method, as its memory requirement scales as the square of the size of the hidden layer, whereas the SVD memory requirement scales with the product of the hidden layer size and the size of the training data set. Given that the data set size should significantly exceed the hidden layer size, the former is advantageous. We call the algorithm OPIUM (Online PseudoInverse Update Method).

OPIUM is adapted from an iterative method for computing the pseudoinverse, known as Greville’s method (Greville, 1960). The existence of OPIUM, and its biological plausibility, suggests that there is no reason why a biological neural network would not arrive at the same weights that are computed using a singular value decomposition, and therefore that this method of synthesizing network structures may be used without the fear that it is biologically implausible.
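A minimal sketch of such an online, one-update-per-sample pseudoinverse computation is given below. This is not the paper's exact OPIUM derivation but a rank-one recursive update in the same spirit: the small regularizer `eps` is an assumption used to initialize the recursion, so the result is a ridge-regularized solution that approaches the SVD pseudoinverse solution as `eps` becomes small.

```python
import numpy as np

rng = np.random.default_rng(1)
n_hidden, n_out, n_samples = 20, 3, 500
A = rng.standard_normal((n_samples, n_hidden))  # hidden-layer activations
Y = rng.standard_normal((n_samples, n_out))     # target outputs

eps = 1e-6                                      # small regularizer (assumption)
P = np.eye(n_hidden) / eps                      # running estimate of (A^T A)^{-1}
W = np.zeros((n_hidden, n_out))                 # readout weights

# One forward pass per sample: rank-one (Sherman-Morrison) update.
for a, y in zip(A, Y):
    a = a[:, None]                              # column vector
    Pa = P @ a
    k = Pa / (1.0 + a.T @ Pa)                   # gain vector
    W += k @ (y[None, :] - a.T @ W)             # correct weights toward sample
    P -= k @ Pa.T                               # rank-one downdate of P

# Batch SVD-based solution for comparison.
W_batch = np.linalg.pinv(A) @ Y
```

Only `W` and the square matrix `P` (hidden-layer size squared) are stored between samples, never the full activation matrix, which matches the memory-scaling argument in the introduction.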

### Copyright

2013 Elsevier Ltd. All rights reserved.

## References

| Author | Title | Journal | DOI | Year |
|---|---|---|---|---|
| J. Tapson, A. Van Schaik | 2013 Special Issue: Learning the Pseudoinverse Solution to Network Weights | Neural Networks, 45 | 10.1016/j.neunet.2013.02.008 | 2013 |