I have a neural net with 3 hidden layers (so 5 layers in total). I want to use Rectified Linear Units at each of the hidden layers, but at the outermost layer I want to apply Softmax on the logits. I want to use DNNClassifier for this. I have read the official TensorFlow documentation, where for setting the value of the parameter activation_fn it says:
activation_fn: Activation function applied to each layer. If None, will use tf.nn.relu.
I know I can always write my own model and use any arbitrary combination of activation functions (a rough sketch of that route is shown after the snippet below). But as DNNClassifier is a ready-made estimator, I would rather use it. So far I have:
classifier = tf.contrib.learn.DNNClassifier(
    feature_columns=features_columns,
    hidden_units=[10, 20, 10],
    n_classes=3
    # activation_fn=...  I want something like below:
    # activation_fn=[relu, relu, relu, softmax]
)