Implement ReLU derivative in Python NumPy

Posted 2019-04-08 02:42

I'm trying to implement a function that computes the ReLU derivative for each element in a matrix and then returns the result as a matrix. I'm using Python and NumPy.

Based on other Cross Validated posts, the ReLU derivative for x is 1 when x > 0, 0 when x < 0, and undefined or 0 when x == 0.

This is the code I have so far:

def reluDerivative(self, x):
    return np.array([self.reluDerivativeSingleElement(xi) for xi in x])

def reluDerivativeSingleElement(self, xi):
    if xi > 0:
        return 1
    elif xi <= 0:
        return 0

Unfortunately, xi is an array because x is a matrix, and reluDerivativeSingleElement doesn't work on arrays. So I'm wondering: is there a way to map the values of one matrix to another matrix using NumPy, like the exp function in NumPy does?

Thanks a lot in advance.

9 Answers
Evening l夕情丶
#2 · 2019-04-08 03:03

If you want to use pure Python:

def relu_derivative(x):
    # (x > 0) - (x < 0) stands in for sign(x), which is not a Python builtin
    return max((x > 0) - (x < 0), 0)
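
One possible way to map a scalar function like this over a whole array, similar to how np.exp works in the question, is np.vectorize. This is only a sketch, and np.vectorize is a convenience wrapper that still loops in Python, so it is not a truly vectorized solution:

import numpy as np

def relu_derivative(x):
    return max((x > 0) - (x < 0), 0)

# Wrap the scalar function so it is applied element-wise over any array shape.
relu_derivative_vec = np.vectorize(relu_derivative)

print(relu_derivative_vec(np.array([[-2.0, 0.0], [0.5, 3.0]])))
# [[0 0]
#  [1 1]]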
beautiful°
#3 · 2019-04-08 03:07
import numpy as np

def dRelu(z):
    return np.where(z <= 0, 0, 1)

Here z is an ndarray in my case.
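
For example, with the dRelu above and a small test array (values chosen just for illustration):

z = np.array([[-1.5, 0.0],
              [2.0, 7.0]])
print(dRelu(z))
# [[0 0]
#  [1 1]]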

对你真心纯属浪费
#4 · 2019-04-08 03:10

That's an exercise in vectorization.

This code:

if x > 0:
    y = 1
elif x <= 0:
    y = 0

can be reformulated as:

y = (x > 0) * 1

This also works for NumPy arrays: a boolean expression involving an array is evaluated element-wise, producing an array of True/False values, and multiplying by 1 converts those booleans to 0s and 1s.
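
A minimal sketch of what this looks like in practice (the input values are arbitrary):

import numpy as np

def relu_derivative(x):
    # Boolean mask of x > 0, multiplied by 1 to get integer 0/1 values.
    return (x > 0) * 1

x = np.array([[-3.0, 0.0, 2.5],
              [1.0, -0.5, 4.0]])
print(relu_derivative(x))
# [[0 0 1]
#  [1 0 1]]

If you prefer the result in the same dtype as the input, (x > 0).astype(x.dtype) does the same thing but returns 0.0/1.0 for float input.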
