# Restricted Boltzmann Machine (RBM)

## References

### 2017a

• (Hinton, 2017) ⇒ Hinton G. (2017). "Boltzmann Machines". In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning and Data Mining.
• QUOTE: A restricted Boltzmann machine (Smolensky 1986) consists of a layer of visible units and a layer of hidden units with no visible-visible or hidden-hidden connections. With these restrictions, the hidden units are conditionally independent given a visible vector, so unbiased samples from $\langle s_i s_j\rangle_{data}$ can be obtained in one parallel step. To sample from $\langle s_i s_j\rangle_{model}$ still requires multiple iterations that alternate between updating all the hidden units in parallel and updating all of the visible units in parallel. However, learning still works well if $\langle s_i s_j\rangle_{model}$ is replaced by $\langle s_i s_j\rangle_{reconstruction}$ which is obtained as follows:
1. Starting with a data vector on the visible units, update all of the hidden units in parallel.
2. Update all of the visible units in parallel to get a “reconstruction.”
3. Update all of the hidden units again.
This efficient learning procedure approximates gradient descent in a quantity called “contrastive divergence” and works well in practice (Hinton 2002).
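The three-step procedure above can be sketched as a single CD-1 update for a small binary RBM. This is a minimal NumPy illustration, not Hinton's reference implementation; the layer sizes, learning rate, and initialization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical sizes: 6 visible units, 3 hidden units.
n_visible, n_hidden = 6, 3
W = rng.normal(0.0, 0.1, size=(n_visible, n_hidden))  # symmetric weights
b_v = np.zeros(n_visible)  # visible biases
b_h = np.zeros(n_hidden)   # hidden biases

def cd1_update(v0, lr=0.1):
    """One contrastive-divergence (CD-1) step on a binary data vector v0."""
    global W, b_v, b_h
    # Step 1: update all hidden units in parallel, given the data.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(n_hidden) < p_h0).astype(float)
    # Step 2: update all visible units in parallel -> the "reconstruction".
    p_v1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(n_visible) < p_v1).astype(float)
    # Step 3: update all hidden units again, from the reconstruction.
    p_h1 = sigmoid(v1 @ W + b_h)
    # Replace <s_i s_j>_model with <s_i s_j>_reconstruction in the gradient.
    W += lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))
    b_v += lr * (v0 - v1)
    b_h += lr * (p_h0 - p_h1)
    return v1

v0 = np.array([1.0, 0.0, 1.0, 1.0, 0.0, 0.0])
v1 = cd1_update(v0)
```

Because the hidden units are conditionally independent given the visible vector (and vice versa), each step updates an entire layer in one parallel operation.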

### 2017b

• (The Asimov Institute, 2017) ⇒ http://asimovinstitute.org/neural-network-zoo/
• QUOTE: Restricted Boltzmann machines (RBM) are remarkably similar to BMs (surprise) and therefore also similar to HNs. The biggest difference between BMs and RBMs is that RBMs are more usable because they are more restricted. They don’t trigger-happily connect every neuron to every other neuron but only connect each group of neurons to every other group, so no input neurons are directly connected to other input neurons and no hidden-to-hidden connections are made either. RBMs can be trained like FFNNs with a twist: instead of passing data forward and then back-propagating, you forward pass the data and then backward pass the data (back to the first layer). After that you train with forward-and-back-propagation.
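The "forward pass, then backward pass" twist described above can be sketched in a few lines. This is a toy illustration with hand-picked weights (all names and values here are assumptions, not part of the quoted source); training would then adjust the weights to shrink the gap between the data and its reconstruction.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical tiny RBM: 3 visible units, 2 hidden units.
W = [[0.2, -0.1], [0.4, 0.3], [-0.5, 0.1]]
v = [1.0, 0.0, 1.0]

# Forward pass: visible -> hidden activation probabilities.
h = [sigmoid(sum(v[i] * W[i][j] for i in range(3))) for j in range(2)]
# Backward pass: hidden -> visible, reusing the same weights.
v_recon = [sigmoid(sum(h[j] * W[i][j] for j in range(2))) for i in range(3)]
# Reconstruction error: how far the backward pass lands from the data.
err = sum((a - b) ** 2 for a, b in zip(v, v_recon))
```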

### 2014

• http://deeplearning4j.org/restrictedboltzmannmachine.html
• QUOTE: To quote Geoff Hinton, a Google researcher and university professor, a Boltzmann machine is “a network of symmetrically connected, neuron-like units that make stochastic decisions about whether to be on or off.” (Stochastic means “randomly determined.”)

A restricted Boltzmann machine “consists of a layer of visible units and a layer of hidden units with no visible-visible or hidden-hidden connections.” The “restricted” comes from limits imposed on how its nodes connect: intra-layer connections are not allowed, but each node of one layer connects to every node of the next, with the same weight used in both directions; this is the “symmetry” of the connections.
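The bipartite, symmetric connectivity described above amounts to a single weight matrix used in both directions. A minimal sketch, assuming illustrative sizes and weights:

```python
import numpy as np

# Bipartite connectivity: one weight matrix links every visible unit to
# every hidden unit; there are no intra-layer weights at all.
n_visible, n_hidden = 4, 2
W = np.full((n_visible, n_hidden), 0.5)  # hypothetical weights
v = np.array([1.0, 0.0, 1.0, 0.0])

hidden_in = v @ W             # visible -> hidden uses W
visible_in = hidden_in @ W.T  # hidden -> visible reuses the same W ("symmetry")
```

The same matrix `W` carries signals in both directions; only its orientation (transpose) changes, which is exactly the symmetric-weight restriction.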

### 2006

1. Smolensky, Paul (1986). "Chapter 6: Information Processing in Dynamical Systems: Foundations of Harmony Theory" (PDF). In Rumelhart, David E.; McClelland, James L. Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volume 1: Foundations. MIT Press. pp. 194–281. [ISBN 0-262-68053-X].
2. Salakhutdinov, R.; Mnih, A.; Hinton, G. (2007). Restricted Boltzmann machines for collaborative filtering. Proceedings of the 24th International Conference on Machine learning - ICML '07. p. 791. doi:10.1145/1273496.1273596. ISBN 9781595937933
3. Coates, Adam; Lee, Honglak; Ng, Andrew Y. (2011). An analysis of single-layer networks in unsupervised feature learning (PDF). International Conference on Artificial Intelligence and Statistics (AISTATS).
4. Ruslan Salakhutdinov and Geoffrey Hinton (2010). Replicated softmax: an undirected topic model. Neural Information Processing Systems 23.
5. Miguel Á. Carreira-Perpiñán and Geoffrey Hinton (2005). On contrastive divergence learning. Artificial Intelligence and Statistics.