Boltzmann Machine (BM)


A Boltzmann Machine (BM) is a stochastic recurrent neural network whose equilibrium distribution over binary unit states is a Boltzmann Distribution.
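In the standard formulation (a general-literature sketch, not drawn verbatim from the sources cited below), each global state s of the binary units is assigned an energy, and the network samples states with probabilities given by the Boltzmann Distribution; w_{ij} are the symmetric connection weights, θ_i the unit biases, and T a temperature parameter:

```latex
% Standard energy-based formulation of a Boltzmann machine over binary
% states s \in \{0,1\}^n with symmetric weights w_{ij} and biases \theta_i.
E(s) = -\sum_{i<j} w_{ij}\, s_i s_j - \sum_i \theta_i s_i ,
\qquad
P(s) = \frac{e^{-E(s)/T}}{\sum_{s'} e^{-E(s')/T}} .
```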



References

2014

  • (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/Boltzmann_machine Retrieved:2014-9-23.
    • QUOTE: A Boltzmann machine is a type of stochastic recurrent neural network invented by Geoffrey Hinton and Terry Sejnowski in 1985. Boltzmann machines can be seen as the stochastic, generative counterpart of Hopfield nets. They were one of the first examples of a neural network capable of learning internal representations, and are able to represent and (given sufficient time) solve difficult combinatoric problems. However, due to a number of issues discussed below, Boltzmann machines with unconstrained connectivity have not proven useful for practical problems in machine learning or inference. They are still theoretically intriguing, however, due to the locality and Hebbian nature of their training algorithm, as well as their parallelism and the resemblance of their dynamics to simple physical processes. If the connectivity is constrained, the learning can be made efficient enough to be useful for practical problems.

      They are named after the Boltzmann distribution in statistical mechanics, which is used in their sampling function.
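The sampling dynamics behind the quote above can be illustrated with a short Gibbs-sampling sketch: each unit is repeatedly resampled given the states of all the others, so a long run of the chain approximates the Boltzmann distribution over states. This is a minimal NumPy illustration under assumed conventions (binary {0, 1} units, symmetric weight matrix W with zero diagonal, biases b, temperature T); the names gibbs_step and sigmoid are illustrative, not taken from the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(s, W, b, T=1.0):
    # Resample each unit given the current states of all the others.
    for i in rng.permutation(len(s)):
        # Energy gap between s_i = 1 and s_i = 0; W's zero diagonal means
        # the unit's own state does not contribute to its input.
        delta_e = W[i] @ s + b[i]
        s[i] = float(rng.random() < sigmoid(delta_e / T))
    return s

# Toy network: 5 units with random symmetric weights and biases.
n = 5
W = rng.normal(scale=0.5, size=(n, n))
W = (W + W.T) / 2.0          # symmetric connections, as in a Boltzmann machine
np.fill_diagonal(W, 0.0)     # no self-connections
b = rng.normal(scale=0.5, size=n)

s = rng.integers(0, 2, size=n).astype(float)
for _ in range(1000):        # long run drifts toward the equilibrium distribution
    s = gibbs_step(s, W, b)
print(s)
```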

1989

  • (Liou & Lin, 1989) ⇒ C.-Y. Liou, and S.-L. Lin. (1989). “The Other Variant Boltzmann Machine.” In: Proceedings of the International Joint Conference on Neural Networks (IJCNN 1989). doi:10.1109/IJCNN.1989.118618.

1986

  • (Hinton & Sejnowski, 1986) ⇒ Hinton, G. E., & Sejnowski, T. J. (1986). “Learning and Relearning in Boltzmann Machines” (PDF). In: Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volume 1, pp. 282–317.
    • QUOTE: Many of the chapters in this volume make use of the ability of a parallel network to perform cooperative searches for good solutions to problems. The basic idea is simple: The weights on the connections between processing units encode knowledge about how things normally (...)

1985

  • (Ackley et al., 1985) ⇒ David H. Ackley, Geoffrey E. Hinton, and Terrence J. Sejnowski. (1985). “A Learning Algorithm for Boltzmann Machines.” In: Cognitive Science, 9(1). doi:10.1207/s15516709cog0901_7.
    • ABSTRACT: The computational power of massively parallel networks of simple processing elements resides in the communication bandwidth provided by the hardware connections between elements. These connections can allow a significant fraction of the knowledge of the system to be applied to an instance of a problem in a very short time. One kind of computation for which massively parallel networks appear to be well suited is large constraint satisfaction searches, but to use the connections efficiently two conditions must be met: First, a search technique that is suitable for parallel networks must be found. Second, there must be some way of choosing internal representations which allow the preexisting hardware connections to be used efficiently for encoding the constraints in the domain being searched. We describe a general parallel search method, based on statistical mechanics, and we show how it leads to a general learning rule for modifying the connection strengths so as to incorporate knowledge about a task domain in an efficient way. We describe some simple examples in which the learning algorithm creates internal representations that are demonstrably the most efficient way of using the preexisting connectivity structure.
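The learning rule the abstract refers to is usually stated by comparing co-activation statistics in two phases: a clamped phase with the visible units fixed to training data, and a free-running phase. As a summary of the well-known rule (paraphrased, not quoted from the paper), with learning rate η and angle brackets denoting equilibrium expectations:

```latex
% Boltzmann learning rule: raise weights on pairs that co-activate more
% under the data-clamped distribution than under the free-running one.
\Delta w_{ij} = \eta \left( \langle s_i s_j \rangle_{\text{clamped}}
                          - \langle s_i s_j \rangle_{\text{free}} \right)
```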

1983

  • (Hinton & Sejnowski, 1983) ⇒ Hinton, G. E., & Sejnowski, T. J. (1983). “Optimal Perceptual Inference” (PDF). In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Washington, DC, pp. 448–453.