Residual learning in TensorFlow


[Image: inception layer from the paper]

I am attempting to replicate this image from a research paper. In the image, the orange arrow indicates a shortcut using residual learning and the layer outlined in red indicates a dilated convolution.

In the code below, r5 is the result of the ReLU seen in the image. I have excluded the code between the ReLU and the dilation layer for simplicity. In TensorFlow, how would I properly combine the result of the ReLU and the dilated convolution to implement the residual shortcut?

#relu layer
r5 = tf.nn.relu(layer5)
...
#dilation layer
h_conv4 = conv3d_dilation(concat1, 1154)

1 Answer
Answered by Deceive 欺骗 · 2019-05-23 10:55

The image is quite straightforward: it says you should add them, so:

#relu layer
r5 = tf.nn.relu(layer5)
...
#dilation layer
h_conv4 = conv3d_dilation(concat1, 1154)

#combined
combined = r5 + h_conv4
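
For context, here is a minimal, self-contained sketch of the same residual shortcut. The helper names and the 1x1x1 projection are illustrative assumptions, not part of the original question or the paper; the projection only shows one way to handle the case where the two tensors' channel counts differ, since an element-wise add requires matching shapes:

import tensorflow as tf

#hypothetical helper, assuming channels-last 5D tensors (batch, d, h, w, c)
def residual_add(shortcut, main_branch):
    #element-wise addition implements the shortcut (the orange arrow)
    return tf.add(shortcut, main_branch)

#if the channel counts differ, one common option (an assumption here, not
#something stated in the paper) is to project the shortcut with a 1x1x1
#convolution before adding
def residual_add_with_projection(shortcut, main_branch):
    out_channels = int(main_branch.shape[-1])
    projection = tf.keras.layers.Conv3D(out_channels, kernel_size=1, padding="same")
    return tf.add(projection(shortcut), main_branch)

With matching shapes, residual_add(r5, h_conv4) is equivalent to the r5 + h_conv4 line above.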