I am trying to implement a simple neural network for XOR function. The activation function I am using is Sigmoid function. The code for the sigmoid function is:
def ActivationFunction(a):
    e = 2.671 # Sigmoid Function
    expo = e ** a
    val = expo / (1 + expo)
    return val
My problem is that this function always returns a value between 0.7 and 0.8, and this is badly skewing the network's output. Any suggestions would be appreciated.
Your function is implemented correctly; however, the value of e is incorrect. You have e = 2.671, but Euler's number is approximately 2.71828. I'd recommend importing math and using the predefined e constant from there:

import math
e = math.e

And, accordingly, the derivative:

derivative = a * (1 - a)

Where a is the hidden activation from the forward pass.

Besides this, I see nothing wrong with your implementation. So if you're still getting values you don't expect after the fix, the cause of the trouble lies elsewhere.
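For completeness, here is a minimal sketch of the corrected activation function and its derivative (function names are illustrative, not from your code). It uses the equivalent form 1 / (1 + e**-a) via math.exp, which is the more common way to write the sigmoid:

```python
import math

def ActivationFunction(a):
    # Sigmoid with the correct Euler's number, via math.exp
    return 1 / (1 + math.exp(-a))

def SigmoidDerivative(activation):
    # Derivative expressed in terms of the forward-pass activation
    return activation * (1 - activation)

print(ActivationFunction(-4))  # close to 0
print(ActivationFunction(0))   # exactly 0.5
print(ActivationFunction(4))   # close to 1
```

With the correct constant, outputs span the full (0, 1) range instead of clustering between 0.7 and 0.8, which is what the XOR network needs to separate the two classes.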