Is use of Leaky ReLU after Batch Normalization (BN) any different from plain ReLU?

Posted 2019-10-18 18:12

In my CNN I am using Leaky ReLU after a BN layer. Leaky ReLU addresses the dying-ReLU problem by outputting f(y) = a*y for negative inputs (and y for non-negative ones). BN normalizes to zero mean and unit variance. Does BN remove the negative part, i.e., does it rescale all values into the range 0 to 1? The choice of Leaky ReLU depends on this, because if BN removed the negative part then Leaky ReLU would behave the same as plain ReLU. I am using Keras.
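For reference, a minimal sketch of the layer ordering described above (the question does not show the actual model, so the layer sizes, the input shape, and the negative slope of 0.1 are assumptions):

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(32, 32, 3)),        # assumed input shape
    layers.Conv2D(32, 3, padding="same"),  # convolution without a built-in activation
    layers.BatchNormalization(),           # zero mean / unit variance, plus learned scale and shift
    layers.LeakyReLU(0.1),                 # negative slope a = 0.1: f(y) = y if y >= 0 else a * y
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
model.summary()
```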

1 Answer
爷的心禁止访问
Answered 2019-10-18 18:36

The BN layer normalizes each activation to (approximately) zero mean and unit variance by subtracting the batch mean and dividing by the batch standard deviation, followed by a learned scale and shift. It does not squash values into the range 0 to 1, so a substantial fraction of its output values will be negative.

So the LeakyReLU following the BN layer will still receive negative values, and its negative slope still matters; it is not equivalent to plain ReLU.
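You can verify this with a quick check (a sketch assuming TensorFlow/Keras; the input data here is random and purely illustrative): feed strictly non-negative inputs through a BatchNormalization layer and count how many outputs come out negative.

```python
import numpy as np
from tensorflow.keras import layers

x = np.random.rand(256, 16).astype("float32")  # strictly non-negative inputs in [0, 1)
bn = layers.BatchNormalization()
y = np.asarray(bn(x, training=True))           # training=True -> normalize with batch statistics
print("fraction of negative BN outputs:", float((y < 0).mean()))  # roughly 0.5
```

Because the batch mean is subtracted, roughly half the normalized values fall below zero even though every input was positive.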
