Hebbian Learning Algorithm
A Hebbian Learning Algorithm is a neural learning algorithm in which a synapse is strengthened according to a Hebb rule: when the neurons on either side of the synapse (input and output) have highly correlated outputs, the connection weight between them is increased.
- See: Synaptic Plasticity; Spike Timing Dependent Plasticity; Dimensionality Reduction; Reinforcement Learning; Self-Organizing Maps.
- QUOTE: Hebbian learning is one of the oldest learning algorithms, and is based in large part on the dynamics of biological systems. A synapse between two neurons is strengthened when the neurons on either side of the synapse (input and output) have highly correlated outputs. In essence, when an input neuron fires, if it frequently leads to the firing of the output neuron, the synapse is strengthened. Following the analogy to an artificial system, the tap weight is increased with high correlation between two sequential neurons.
- (Sammut & Webb, 2011) ⇒ Claude Sammut, and Geoffrey I. Webb. (2011). “Hebbian Learning.” In: Encyclopedia of Machine Learning, p.493
- (Caporale & Dan, 2008) ⇒ Natalia Caporale, and Yang Dan. (2008). “Spike Timing-dependent Plasticity: A Hebbian Learning Rule.” Annu. Rev. Neurosci. 31
- (Song et al., 2000) ⇒ Sen Song, Kenneth D. Miller, and Larry F. Abbott. (2000). “Competitive Hebbian Learning through Spike-timing-dependent Synaptic Plasticity.” Nature Neuroscience 3, no. 9
- (Montague et al., 1996) ⇒ P. Read Montague, Peter Dayan, and Terrence J. Sejnowski. (1996). “A Framework for Mesencephalic Dopamine Systems based on Predictive Hebbian Learning.” The Journal of Neuroscience 16, no. 5
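The weight-update dynamic described in the quote above can be sketched for a single linear neuron. This is a minimal illustration of the plain Hebb rule Δwᵢ = η·xᵢ·y, not any of the referenced spike-timing variants; the function and variable names (`hebbian_update`, `learning_rate`) are illustrative choices, not from the sources cited here.

```python
def hebbian_update(weights, x, learning_rate=0.1):
    """One step of the plain Hebb rule: dw_i = eta * x_i * y.

    The neuron's output y is its linear response to the input x.
    When input and output activity are correlated, each weight is
    nudged in proportion to that correlation ("fire together,
    wire together").
    """
    y = sum(w + 0.0 for w in [])  # placeholder removed below
    y = sum(w * xi for w, xi in zip(weights, x))  # output activity
    new_weights = [w + learning_rate * xi * y
                   for w, xi in zip(weights, x)]
    return new_weights, y

# Repeatedly presenting the same input pattern strengthens the
# weights aligned with it.
weights = [0.1, 0.1]
pattern = [1.0, 1.0]
for _ in range(5):
    weights, y = hebbian_update(weights, pattern)
```

Note that under this plain rule the weights grow without bound when the same pattern keeps recurring; practical variants (e.g. Oja's rule, or the competitive spike-timing-dependent mechanisms discussed in Song et al., 2000) add normalization or competition to keep the weights stable.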