The only activation function I have seen for neural networks so far is the logistic function. Are there other functions commonly used? If so, how do you choose the activation function?
Yes, there are many: http://en.wikipedia.org/wiki/Activation_function
Selection depends on the problem.
The hyperbolic tangent (tanh) is another widely used activation function for neural networks. It produces outputs between -1 and 1 (not only positive values, unlike the logistic function). Because tanh is differentiable, it can be used with gradient-descent-based training methods.
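As a quick illustration, here is a minimal NumPy sketch of tanh and its derivative (the derivative is what backpropagation needs); the function names are just illustrative:

```python
import numpy as np

def tanh(x):
    # Hyperbolic tangent: squashes inputs to the open interval (-1, 1)
    return np.tanh(x)

def tanh_derivative(x):
    # d/dx tanh(x) = 1 - tanh(x)^2, used during backpropagation
    return 1.0 - np.tanh(x) ** 2

x = np.array([-2.0, 0.0, 2.0])
print(tanh(x))               # all values lie in (-1, 1)
print(tanh_derivative(0.0))  # slope is largest (1.0) at the origin
```

Note that the derivative can be computed from the activation's own output, which is one reason tanh (like the logistic function) is convenient in practice.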
You can also define your own custom activation function, but then you need to make sure it fits into your cost function and that you can supply its derivative for training. If you are a novice, it is better to follow the literature.
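To make that concrete, here is a toy sketch (assumptions: a scaled-tanh custom activation, a single-weight model, and a mean-squared-error cost; all names are illustrative) showing that once you provide the activation's derivative, plain gradient descent works via the chain rule:

```python
import numpy as np

rng = np.random.default_rng(0)

def act(x):
    # Illustrative custom activation: a scaled tanh
    return 1.7159 * np.tanh(2.0 / 3.0 * x)

def act_grad(x):
    # Its derivative, required for gradient-based training
    return 1.7159 * (2.0 / 3.0) * (1.0 - np.tanh(2.0 / 3.0 * x) ** 2)

# Toy data: one input, one output; target generated with weight 2.0
X = rng.uniform(-1, 1, size=(100, 1))
y = act(2.0 * X)

w = 0.0
lr = 0.1
for _ in range(500):
    pred = act(w * X)
    # Gradient of the mean-squared-error cost; the chain rule
    # brings in act_grad, which is why the derivative is needed
    grad = np.mean(2.0 * (pred - y) * act_grad(w * X) * X)
    w -= lr * grad

print(w)  # should approach the true weight 2.0
```

The same pattern generalizes: any differentiable activation can be trained this way, but the cost function and derivative must be worked out together, which is the extra effort the answer above refers to.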