Is use of Leaky ReLU after Batch Normalization (BN) useful?

Published 2019-10-18 17:51

Question:

In my CNN I am using a Leaky ReLU after the BN layer. Leaky ReLU addresses the dying ReLU problem by using f(y) = ay for negative values. BN normalizes activations to zero mean and unit variance. So does BN remove the negative part, i.e. does it rescale all values into the 0 to 1 range? The choice of Leaky ReLU depends on this: if BN removed the negative part, then using Leaky ReLU would be the same as using plain ReLU. I am using Keras; the layer ordering I mean is sketched below.
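Something like this (a minimal sketch; the filter counts, kernel size, and input shape here are just placeholders, not my real model):

```python
# Sketch of the layer ordering in question: Conv -> BN -> LeakyReLU.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Conv2D(32, (3, 3), input_shape=(64, 64, 3)),
    layers.BatchNormalization(),   # normalizes to ~zero mean, unit variance per channel
    layers.LeakyReLU(alpha=0.1),   # f(y) = a*y for y < 0
    layers.Flatten(),
    layers.Dense(10, activation='softmax'),
])
```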

Answer 1:

The BN layer normalizes its output toward zero mean by subtracting the mean of its inputs (and dividing by their standard deviation), so we can expect roughly half of its output values to be negative.

So the LeakyReLU following the BN layer will still receive negative values.
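You can verify this with a quick check in Keras (a small sketch, not part of the original answer; the shapes and random inputs are arbitrary):

```python
# Feed strictly non-negative activations through BatchNormalization and count
# how many of the normalized outputs are negative.
import numpy as np
import tensorflow as tf

x = np.random.rand(32, 8).astype("float32")   # all inputs are >= 0
bn = tf.keras.layers.BatchNormalization()
y = bn(x, training=True)                      # training=True: use batch statistics

print("negative fraction:", float(tf.reduce_mean(tf.cast(y < 0, tf.float32))))
# Roughly half of the normalized values come out negative, so the negative
# slope of the LeakyReLU after BN is still exercised.
```

In other words, BN does not clip or rescale values into a 0 to 1 range, so Leaky ReLU after BN behaves differently from plain ReLU.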