Residual learning in TensorFlow

Posted 2019-05-23 10:53

Question:

I am attempting to replicate this image from a research paper. In the image, the orange arrow indicates a shortcut using residual learning and the layer outlined in red indicates a dilated convolution.

In the code below, r5 is the output of the ReLU seen in the image. I have excluded the code between the ReLU and the dilation layer for simplicity. In TensorFlow, how would I properly combine the result of the ReLU and the dilated convolution to implement the residual shortcut?

# ReLU layer
r5 = tf.nn.relu(layer5)
...
# dilation layer
h_conv4 = conv3d_dilation(concat1, 1154)

Answer 1:

The image is quite straightforward: it says you should add them, so:

# ReLU layer
r5 = tf.nn.relu(layer5)
...
# dilation layer
h_conv4 = conv3d_dilation(concat1, 1154)

# residual shortcut: element-wise addition of the two tensors
combined = r5 + h_conv4
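
Note that the element-wise addition only works if r5 and h_conv4 have the same shape. As a minimal sketch (using hypothetical 5-D tensor shapes and TF 1.x-style code to match the question), the shortcut can be written with tf.add, and if the channel counts differed, the shortcut branch could first be projected with a 1x1x1 convolution:

import tensorflow as tf

# Hypothetical shapes for illustration only; in the actual network r5 comes
# from the ReLU and h_conv4 from the dilated convolution.
r5 = tf.placeholder(tf.float32, [None, 8, 32, 32, 1154])
h_conv4 = tf.placeholder(tf.float32, [None, 8, 32, 32, 1154])

# Residual shortcut: element-wise addition (tf.add is equivalent to "+").
combined = tf.add(r5, h_conv4)

# If the channel dimensions did not match, the shortcut branch could be
# projected with a 1x1x1 convolution before the addition, for example:
# r5_proj = tf.layers.conv3d(r5, filters=1154, kernel_size=1)
# combined = tf.add(r5_proj, h_conv4)

This mirrors the standard ResNet-style shortcut, where the two branches are summed and the result is then fed to whatever layer follows in the architecture.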