Deep Residual Neural Network


A Deep Residual Neural Network is a Residual Neural Network (ResNet) that is a Deep Neural Network.



References

2019

Figure 1: Model architecture for the Spec-ResNet model. The detailed structure of the residual blocks is shown in Figure 2.

Figure 2: Detailed architecture of the convolution block with residual connection.

2018a

Figure 3: The structure of the multi-scale residual block (MSRB).
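
Reading Figure 3, an MSRB passes its input through parallel 3x3 and 5x5 convolution branches, concatenates the branches so they can exchange features across scales, fuses the result with a 1x1 convolution, and adds the block input back through a residual connection. The PyTorch sketch below is an illustrative reconstruction under those assumptions (the channel widths and the class name MSRB are ours), not the paper's reference code:

import torch
import torch.nn as nn

class MSRB(nn.Module):
    """Illustrative multi-scale residual block: parallel 3x3/5x5
    branches, cross-branch concatenation, 1x1 fusion, identity shortcut."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv3_1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv5_1 = nn.Conv2d(channels, channels, 5, padding=2)
        # The second stage sees both branches concatenated (2x channels).
        self.conv3_2 = nn.Conv2d(2 * channels, 2 * channels, 3, padding=1)
        self.conv5_2 = nn.Conv2d(2 * channels, 2 * channels, 5, padding=2)
        # A 1x1 convolution fuses the multi-scale features back down.
        self.fuse = nn.Conv2d(4 * channels, channels, 1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        s = self.relu(self.conv3_1(x))
        p = self.relu(self.conv5_1(x))
        mixed = torch.cat([s, p], dim=1)   # branches exchange features
        s2 = self.relu(self.conv3_2(mixed))
        p2 = self.relu(self.conv5_2(mixed))
        out = self.fuse(torch.cat([s2, p2], dim=1))
        return out + x                     # residual connection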

2018b

2017

2016a

Figure 2: Residual learning: a building block.
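
The block in Figure 2 computes $\mathcal{F}(\mathbf{x})+\mathbf{x}$: a small stack of weight layers learns the residual function $\mathcal{F}$, while an identity shortcut carries $\mathbf{x}$ around them. A minimal PyTorch sketch of such a block, assuming two 3x3 convolutions with batch normalization and equal input/output channel width (the class name BasicBlock is illustrative):

import torch
import torch.nn as nn

class BasicBlock(nn.Module):
    """Post-activation residual block: y = ReLU(F(x) + x)."""
    def __init__(self, channels: int):
        super().__init__()
        # Two 3x3 convolutions form the residual function F.
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # identity shortcut, then final ReLU

When the shortcut must change resolution or channel count, the identity is typically replaced by a strided 1x1 convolution; the identity case is shown here for clarity.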

2016b

Figure 2: Various types of shortcut connections used in Table 1. The grey arrows indicate the easiest paths for the information to propagate. The shortcut connections in (b-f) are impeded by different components. To simplify the illustrations, we do not display the BN layers, which are adopted right after the weight layers for all units.

2016c

[math]\displaystyle{ \mathbf{x}_{l+1}=\mathbf{x}_{l}+\mathcal{F}\left(\mathbf{x}_{l}, \mathcal{W}_{l}\right) }[/math] (1)
where $\mathbf{x}_{l}$ and $\mathbf{x}_{l+1}$ are the input and output of the $l$-th unit in the network, $\mathcal{F}$ is a residual function, and $\mathcal{W}_{l}$ are the parameters of the block. A residual network consists of sequentially stacked residual blocks.
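
Iterating Eq. (1) from a shallower unit $l$ to any deeper unit $L$ makes the benefit of the identity shortcut explicit: the deep feature is the shallow feature plus a sum of residuals, so both the forward signal and the backward gradient contain a term that bypasses every intermediate weight layer:

[math]\displaystyle{ \mathbf{x}_{L}=\mathbf{x}_{l}+\sum_{i=l}^{L-1} \mathcal{F}\left(\mathbf{x}_{i}, \mathcal{W}_{i}\right) }[/math]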

Figure 1: Various residual blocks used in the paper. Batch normalization and ReLU precede each convolution (omitted for clarity).
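
Wide residual blocks keep this BN-ReLU-convolution (pre-activation) ordering but widen the convolutions by a factor $k$ and optionally insert dropout between the two convolutions. A hedged PyTorch sketch under those assumptions (the class name WideBlock and the default dropout rate are illustrative choices, not prescribed by the figure):

import torch
import torch.nn as nn

class WideBlock(nn.Module):
    """Pre-activation wide residual block with optional dropout."""
    def __init__(self, in_ch: int, width: int, dropout: float = 0.3):
        super().__init__()
        self.bn1 = nn.BatchNorm2d(in_ch)
        self.conv1 = nn.Conv2d(in_ch, width, 3, padding=1, bias=False)
        self.drop = nn.Dropout(dropout)  # between the two convolutions
        self.bn2 = nn.BatchNorm2d(width)
        self.conv2 = nn.Conv2d(width, width, 3, padding=1, bias=False)
        # 1x1 projection on the shortcut when the width changes.
        self.proj = (nn.Conv2d(in_ch, width, 1, bias=False)
                     if in_ch != width else nn.Identity())

    def forward(self, x):
        out = self.conv1(torch.relu(self.bn1(x)))
        out = self.conv2(torch.relu(self.bn2(self.drop(out))))
        return out + self.proj(x)

Here width would be $k$ times the corresponding channel count of a thin ResNet, so depth can be traded for width while the residual structure of Eq. (1) is preserved.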