# 2013 2013SpecialIssueLearningthePseu


## Revision as of 13:21, 13 July 2019

- (Tapson & Van Schaik, 2013) ⇒ J. Tapson, and A. Van Schaik. (2013). “2013 Special Issue: Learning the Pseudoinverse Solution to Network Weights.” In: Neural Networks Journal, 45. doi:10.1016/j.neunet.2013.02.008

**Subject Headings:** Pseudo-Inverse Algorithm; OPIUM Algorithm.

## Notes

## Cited By

- http://scholar.google.com/scholar?q=%222013%22+2013+Special+Issue%3A+Learning+the+Pseudoinverse+Solution+to+Network+Weights
- http://dl.acm.org/citation.cfm?id=2514178.2514437&preflayout=flat#citedby

## Quotes

### Abstract

The last decade has seen the parallel emergence in computational neuroscience and machine learning of neural network structures which spread the input signal randomly to a higher dimensional space; perform a nonlinear activation; and then solve for a regression or classification output by means of a mathematical pseudoinverse operation. In the field of neuromorphic engineering, these methods are increasingly popular for synthesizing biologically plausible neural networks, but the "learning method" (computation of the pseudoinverse by singular value decomposition) is problematic both for biological plausibility and because it is not an online or an adaptive method. We present an online or incremental method of computing the pseudoinverse precisely, which we argue is biologically plausible as a learning method, and which can be made adaptable for non-stationary data streams. The method is significantly more memory-efficient than the conventional computation of pseudoinverses by singular value decomposition.

## References


| Author | volume | Date Value | title | type | journal | titleUrl | doi | note | year |
|---|---|---|---|---|---|---|---|---|---|
| J. Tapson, A. Van Schaik | 45 | 2013 | 2013 Special Issue: Learning the Pseudoinverse Solution to Network Weights | | Neural Networks Journal | | 10.1016/j.neunet.2013.02.008 | | 2013 |