Neural Network Max-Pooling Layer


A Neural Network Max-Pooling Layer is a Neural Network Pooling Layer that selects the maximum element from each region of the feature map covered by the pooling filter.
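
For example, applying a 2×2 max-pooling filter with stride 2 to a 4×4 feature map keeps only the largest activation in each non-overlapping 2×2 region, producing a 2×2 output. A minimal NumPy sketch of this operation (the function name, default parameters, and example values are illustrative, not taken from any specific library):

    import numpy as np

    def max_pool_2d(feature_map, pool_size=2, stride=2):
        """Naive 2D max pooling: slide a pool_size x pool_size window over
        the feature map with the given stride and keep each window's maximum."""
        h, w = feature_map.shape
        out_h = (h - pool_size) // stride + 1
        out_w = (w - pool_size) // stride + 1
        out = np.empty((out_h, out_w), dtype=feature_map.dtype)
        for i in range(out_h):
            for j in range(out_w):
                window = feature_map[i * stride:i * stride + pool_size,
                                     j * stride:j * stride + pool_size]
                out[i, j] = window.max()
        return out

    x = np.array([[1, 3, 2, 4],
                  [5, 6, 1, 2],
                  [7, 2, 8, 3],
                  [1, 4, 9, 0]])
    print(max_pool_2d(x))  # [[6 4]
                           #  [7 9]]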



References

2019

  • Figure 1 (image: DeepGeneralizedMaxPooling Fig1.png): Overview of Deep Generalized Max Pooling. The activation volume that is computed from a convolutional layer serves as input for the DGMP layer. A linear optimization problem with $D$ unknowns is solved using each local activation vector along the depth axis of the activation volume as a linear equation with $D$ unknowns. The output is a weighted sum of the local activation vectors.
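
As a hedged illustration of the formulation sketched in this caption: each local activation vector $x_i \in \mathbb{R}^D$ (a column taken along the depth axis of the activation volume) contributes one linear equation $x_i^\top \varphi = 1$ in the $D$ unknowns of the pooled descriptor $\varphi$, and solving the regularized least-squares problem makes $\varphi$ a weighted sum of the $x_i$. The NumPy sketch below assumes the standard generalized max pooling objective and its closed-form solution; the variable names and regularization constant are illustrative, not taken from the DGMP paper.

    import numpy as np

    def generalized_max_pool(X, lam=1.0):
        """Sketch of generalized max pooling over local activation vectors.

        X   : (D, N) array whose N columns are the local activation vectors
              taken along the depth axis of the activation volume.
        lam : ridge regularization constant (illustrative value).

        Solves  min_phi ||X^T phi - 1||^2 + lam * ||phi||^2 , whose closed
        form phi = (X X^T + lam I)^{-1} X 1 lies in the column span of X,
        i.e. it is a weighted sum of the local activation vectors.
        """
        D, _ = X.shape
        phi = np.linalg.solve(X @ X.T + lam * np.eye(D), X.sum(axis=1))
        return phi

    # Usage sketch: pool a hypothetical activation volume with D=256 channels
    # over 7x7 = 49 spatial positions.
    pooled = generalized_max_pool(np.random.rand(256, 49))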

2018

  • (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/Convolutional_neural_network#Pooling Retrieved:2018-3-4.
    • Convolutional networks may include local or global pooling layers, which combine the outputs of neuron clusters at one layer into a single neuron in the next layer[1][2]. For example, max pooling uses the maximum value from each of a cluster of neurons at the prior layer[3]. Another example is average pooling, which uses the average value from each of a cluster of neurons at the prior layer.
  1. Ciresan, Dan; Ueli Meier; Jonathan Masci; Luca M. Gambardella; Jürgen Schmidhuber (2011). "Flexible, High Performance Convolutional Neural Networks for Image Classification" (PDF). Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence, Volume Two. 2: 1237–1242. Retrieved 17 November 2013.
  2. Krizhevsky, Alex. "ImageNet Classification with Deep Convolutional Neural Networks" (PDF). Retrieved 17 November 2013.
  3. Ciresan, Dan; Meier, Ueli; Schmidhuber, Jürgen (June 2012). "Multi-column deep neural networks for image classification". 2012 IEEE Conference on Computer Vision and Pattern Recognition. New York, NY: Institute of Electrical and Electronics Engineers (IEEE): 3642–3649. arXiv:1202.2745v1. doi:10.1109/CVPR.2012.6248110. ISBN 978-1-4673-1226-4. OCLC 812295155. Retrieved 2013-12-09.
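
To make the contrast in the excerpt above concrete, the following NumPy sketch applies both max pooling and average pooling to the same non-overlapping 2×2 neuron clusters of a toy 4×4 feature map (the array values are illustrative):

    import numpy as np

    x = np.array([[1., 3., 2., 4.],
                  [5., 6., 1., 2.],
                  [7., 2., 8., 3.],
                  [1., 4., 9., 0.]])

    # Regroup the 4x4 map into a 2x2 grid of 2x2 clusters, then reduce each cluster.
    clusters = x.reshape(2, 2, 2, 2).swapaxes(1, 2)
    print(clusters.max(axis=(2, 3)))   # max pooling:     [[6. 4.], [7. 9.]]
    print(clusters.mean(axis=(2, 3)))  # average pooling: [[3.75 2.25], [3.5 5.]]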