Dilated Convolutional Neural Network

A Dilated Convolutional Neural Network is a convolutional neural network that uses dilated convolution layers, i.e. convolutions whose filter taps are applied to input positions spaced apart by a dilation factor, so that the receptive field is enlarged without increasing the number of filter parameters or reducing feature-map resolution.
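
As a rough illustration (assuming PyTorch; the class name, channel count, and depth below are hypothetical choices, not taken from any cited work), such a network can be built by stacking 3×3 convolutions whose dilation factor doubles at each layer, so the receptive field grows exponentially while the parameter count grows only linearly:

    import torch
    import torch.nn as nn

    class DilatedContextNet(nn.Module):
        """Stack of 3x3 convolutions with exponentially increasing dilation
        (1, 2, 4, ...). Names and sizes are illustrative only."""
        def __init__(self, channels: int = 32, depth: int = 4):
            super().__init__()
            layers = []
            for i in range(depth):
                d = 2 ** i  # dilation factor for layer i: 1, 2, 4, 8, ...
                layers += [
                    nn.Conv2d(channels, channels, kernel_size=3,
                              padding=d, dilation=d),
                    nn.ReLU(inplace=True),
                ]
            self.body = nn.Sequential(*layers)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.body(x)

    x = torch.randn(1, 32, 64, 64)   # a single 32-channel 64x64 feature map
    y = DilatedContextNet()(x)       # padding=dilation keeps the spatial size
    print(y.shape)                   # torch.Size([1, 32, 64, 64])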



References

2015

  • (Yu & Koltun, 2015) ⇒ Fisher Yu, and Vladlen Koltun. (2015). “Multi-scale Context Aggregation by Dilated Convolutions.” In: Proceedings of the 4th International Conference on Learning Representations (ICLR-2016).
    • QUOTE: The dilated convolution operator has been referred to in the past as “convolution with a dilated filter”. It plays a key role in the algorithme à trous, an algorithm for wavelet decomposition (Holschneider et al., 1987; Shensa, 1992). [1]

      We use the term “dilated convolution” instead of “convolution with a dilated filter” to clarify that no “dilated filter” is constructed or represented.

      The convolution operator itself is modified to use the filter parameters in a different way. The dilated convolution operator can apply the same filter at different ranges using different dilation factors. Our definition reflects the proper implementation of the dilated convolution operator, which does not involve construction of dilated filters.

      In recent work on convolutional networks for semantic segmentation, Long et al. (2015) analyzed filter dilation but chose not to use it. Chen et al. (2015a) used dilation to simplify the architecture of Long et al. (2015).

      In contrast, we develop a new convolutional network architecture that systematically uses dilated convolutions for multi-scale context aggregation. …


  1. Let F_0, F_1, …, F_{n−1} : Z² → R be discrete functions and let k_0, k_1, …, k_{n−2} : Ω_1 → R be discrete 3×3 filters. Consider applying the filters with exponentially increasing dilation: F_{i+1} = F_i ∗_{2^i} k_i  for i = 0, 1, …, n − 2.   (3)
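
As a minimal sketch of the point made in the quote above (that no zero-inflated “dilated filter” is ever constructed), a dilated convolution can be computed by sampling the input at offsets scaled by the dilation factor; the function name and this NumPy formulation are illustrative assumptions, not the authors' reference implementation:

    import numpy as np

    def dilated_conv2d(F, k, dilation):
        """Dilated convolution (F *_l k)(p) = sum_t F(p - l*t) k(t) for a
        3x3 filter k, with zero padding so the output keeps F's size.
        The filter is used as-is; no dilated filter is materialized."""
        assert k.shape == (3, 3)
        l = dilation
        H, W = F.shape
        padded = np.pad(F, l)                  # zero padding of width l
        out = np.zeros_like(F, dtype=float)
        for t1 in (-1, 0, 1):                  # filter taps t = (t1, t2)
            for t2 in (-1, 0, 1):
                out += k[t1 + 1, t2 + 1] * padded[l - t1 * l : l - t1 * l + H,
                                                  l - t2 * l : l - t2 * l + W]
        return out

    F = np.arange(25, dtype=float).reshape(5, 5)
    k = np.ones((3, 3)) / 9.0                   # simple averaging filter
    print(dilated_conv2d(F, k, dilation=1))     # ordinary 3x3 convolution
    print(dilated_conv2d(F, k, dilation=2))     # same filter, wider range

Applying such an operator repeatedly with dilation factors 1, 2, 4, …, as in equation (3) above, expands the receptive field exponentially with depth while the same 3×3 filters are reused unchanged.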