
Tensorflow apply different activation functions in output layer

Posted 2019-09-27 09:23

I want to build a network like this:

The hidden layers are not important here; what I want to know is whether the way I wrote the output layer is correct. Here is my code:

Parameters:

state_dim = 13
layer1_size, layer2_size = 400, 300
action_dim = 2

W1 = self.variable([state_dim, layer1_size], state_dim)
b1 = self.variable([layer1_size], state_dim)
W2 = self.variable([layer1_size, layer2_size], layer1_size)
b2 = self.variable([layer2_size], layer1_size)
W3 = tf.Variable(tf.random_uniform([layer2_size, action_dim], -0.003, 0.003))
b3 = tf.Variable(tf.random_uniform([action_dim], -0.003, 0.003))

# Two hidden layers with batch normalization and ReLU.
layer1 = tf.matmul(state_input, W1) + b1
layer1_bn = self.batch_norm_layer(layer1, training_phase=is_training, scope_bn='batch_norm_1', activation=tf.nn.relu)
layer2 = tf.matmul(layer1_bn, W2) + b2
layer2_bn = self.batch_norm_layer(layer2, training_phase=is_training, scope_bn='batch_norm_2', activation=tf.nn.relu)

# Linear output layer of shape (batch, 2); apply a different activation to each column.
action = tf.matmul(layer2_bn, W3) + b3
action_linear = tf.sigmoid(action[:, None, 0])    # first output  -> sigmoid, range (0, 1)
action_angular = tf.tanh(action[:, None, 1])      # second output -> tanh, range (-1, 1)
action = tf.concat([action_linear, action_angular], axis=-1)
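For reference, here is a minimal self-contained sketch of the same output-layer pattern, assuming TensorFlow 1.x (tf.compat.v1 under TensorFlow 2). The hidden layers are omitted, and the state_input placeholder here is illustrative rather than taken from the original class; the slicing uses [:, 0:1] / [:, 1:2], which keeps the column dimension just like the [:, None, i] indexing above.

import tensorflow as tf

state_dim, action_dim = 13, 2
state_input = tf.placeholder(tf.float32, [None, state_dim])

# Single linear output layer producing both action components.
W = tf.Variable(tf.random_uniform([state_dim, action_dim], -0.003, 0.003))
b = tf.Variable(tf.random_uniform([action_dim], -0.003, 0.003))
logits = tf.matmul(state_input, W) + b                         # shape (batch, 2)

# Different activation per output unit, then re-join the columns.
action_linear = tf.sigmoid(logits[:, 0:1])                     # first unit  -> (0, 1)
action_angular = tf.tanh(logits[:, 1:2])                       # second unit -> (-1, 1)
action = tf.concat([action_linear, action_angular], axis=-1)   # shape (batch, 2)

An equivalent alternative is to split the output with tf.split(logits, 2, axis=-1) before applying the activations; either way, gradients flow through both branches independently.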