2013 2013SpecialIssueLearningthePseu

From GM-RKB
 
* ([[2013_2013SpecialIssueLearningthePseu|Tapson & Van Schaik, 2013]]) ⇒ [[author::J. Tapson]], and [[author::A. Van Schaik]]. ([[year::2013]]). “[https://www.westernsydney.edu.au/__data/assets/pdf_file/0003/783156/Tapson,_van_Schaik_-_2013_-_Learning_the_pseudoinverse_solution_to_network_weights.pdf 2013 Special Issue: Learning the Pseudoinverse Solution to Network Weights].” In: Neural Networks Journal, 45. [http://dx.doi.org/10.1016/j.neunet.2013.02.008 doi:10.1016/j.neunet.2013.02.008]  
 
<B>Subject Headings:</B> [[Pseudo-Inverse Algorithm]]; [[OPIUM Algorithm]].
  
 
==Notes==
 
==Cited By==

==Quotes==
 
===Abstract===
 
The last [[decade]] has seen the [[parallel emergence]] in [[computational neuroscience]] and [[machine learning]] of [[neural network]] structures which spread the [[input signal]] [[randomly]] to a [[higher dimensional space]]; perform a [[nonlinear activation]]; and then solve for a [[regression]] or [[classification]] [[output]] by means of a [[mathematical pseudoinverse operation]]. </s>
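The pipeline this sentence describes (random projection to a higher-dimensional space, nonlinear activation, pseudoinverse readout) can be sketched in a few lines of NumPy. The sizes, the `tanh` nonlinearity, and the toy regression target below are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples, 5 input features, 1 regression target (assumed).
X = rng.standard_normal((200, 5))
y = np.sin(X.sum(axis=1, keepdims=True))

# 1. Spread the input randomly into a higher-dimensional space.
n_hidden = 300
W_in = rng.standard_normal((5, n_hidden))

# 2. Apply a nonlinear activation.
A = np.tanh(X @ W_in)

# 3. Solve for the output weights with the mathematical pseudoinverse
#    (the minimum-norm least-squares solution).
W_out = np.linalg.pinv(A) @ y

# Predictions are a single linear read-out of the hidden activations.
y_hat = A @ W_out
```

Only the read-out weights `W_out` are learned; the random input projection `W_in` is fixed, which is what makes the pseudoinverse step the entire "training" procedure.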
In the field of [[neuromorphic engineering]], these [[method]]s are increasingly popular for [[synthesizing biologically plausible neural network]]s, but the "[[learning method]]" (computation of the [[pseudoinverse]] by [[singular value decomposition]]) is problematic both for [[biological plausibility]] and because it is not an [[Online Method|online]] or an [[adaptive method]]. </s>
[[2013 2013SpecialIssueLearningthePseu|We]] present an [[Online Method|online]] or [[incremental method]] of [[computing]] the [[pseudoinverse]] precisely, which we argue is [[biologically plausible]] as a [[learning method]], and which can be made adaptable for [[non-stationary data stream]]s. </s>
The [[method]] is significantly more [[memory-efficient]] than the [[conventional computation]] of [[pseudoinverse]]s by [[singular value decomposition]]. </s>
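For intuition, the kind of incremental alternative to a batch SVD that the abstract alludes to can be sketched with the standard recursive least-squares (Sherman-Morrison) rank-one update. This is a generic illustration of incremental pseudoinverse-style learning, not the paper's own OPIUM pseudo-code; all names and sizes below are ours:

```python
import numpy as np

rng = np.random.default_rng(1)

n_features, n_outputs = 8, 1
X = rng.standard_normal((500, n_features))
W_true = rng.standard_normal((n_features, n_outputs))
Y = X @ W_true  # noiseless linear targets, for the demo only

# Batch reference: the pseudoinverse solution computed in one shot.
W_batch = np.linalg.pinv(X) @ Y

# Incremental version: one sample at a time, storing only the weights
# and an (n_features x n_features) inverse-correlation matrix, never
# the growing data matrix that a batch SVD would need.
theta = 1e6 * np.eye(n_features)      # large initial value ~ weak prior
W = np.zeros((n_features, n_outputs))
for a, y in zip(X, Y):
    a = a[:, None]                        # current input as a column vector
    theta_a = theta @ a
    gain = theta_a / (1.0 + a.T @ theta_a)
    W += gain @ (y[None, :] - a.T @ W)    # correct the prediction error
    theta -= gain @ theta_a.T             # Sherman-Morrison rank-one update
```

Because each step only touches fixed-size matrices, the memory cost is independent of the number of samples seen, which is the memory-efficiency argument the abstract makes against batch SVD.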
  
 
==References==
 

