I tried far too long to solve this problem and did not find anything useful on the Internet, so I have to ask:

Given a tensor T, let's say T = tf.random_normal([100]), I want to apply softmax() only to the positive elements of the tensor. Something like T = tf.nn.softmax(T[T>0]), which of course does not work in TensorFlow.

In short: I want softmax computed over, and applied only to, the elements where T > 0.

How can I do that in TensorFlow?
I am new to TensorFlow, but this is my try, based on the math formula:
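The original snippet is not included here; a hypothetical reconstruction of such a formula-based attempt (the use of tf.boolean_mask and tf.scatter_nd is an assumption, and the code is written in TF 2.x eager style) might look like:

```python
import tensorflow as tf

# Hypothetical reconstruction of a formula-based attempt; the original
# code was not shown. Sketched with TF 2.x eager execution
# (tf.random_normal in TF 1.x is tf.random.normal in TF 2.x).
T = tf.random.normal([100])

mask = T > 0.0

# exp(T_i) / sum_j exp(T_j), computed over the positive elements only.
exp_pos = tf.exp(tf.boolean_mask(T, mask))
softmax_pos = exp_pos / tf.reduce_sum(exp_pos)

# Scatter the softmax values back to their original positions,
# leaving zeros where T <= 0.
indices = tf.where(mask)  # positions of positive elements, shape [k, 1]
result = tf.scatter_nd(indices, softmax_pos, tf.shape(T, out_type=tf.int64))
```

Note that this zeroes out the non-positive entries rather than keeping their original values; whether that is desired depends on the use case, which is exactly the ambiguity the answer below addresses.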
Updated version (using @Aldream's suggestion):
If you want softmax computed + applied only to elements T > 0:

An idea could be to create 2 partitions based on your condition (T > 0), apply the operation (softmax) to the target partition, then stitch them back together. Something like this, using tf.dynamic_partition and tf.dynamic_stitch:
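A minimal sketch of the partition-and-stitch idea, written in TF 2.x eager style (the question itself uses the TF 1.x API, where tf.random.normal is tf.random_normal):

```python
import tensorflow as tf

T = tf.random.normal([100])

# Partition labels: 0 for non-positive elements, 1 for positive ones.
condition_mask = tf.cast(T > 0.0, tf.int32)

# Split T into the two partitions according to the mask.
non_pos, pos = tf.dynamic_partition(T, condition_mask, 2)

# Apply softmax only to the positive partition.
pos = tf.nn.softmax(pos)

# Partition the original flat indices the same way, so we know
# where each element came from.
idx = tf.range(tf.size(T))
idx_non_pos, idx_pos = tf.dynamic_partition(idx, condition_mask, 2)

# Stitch both partitions back together in the original order.
result = tf.dynamic_stitch([idx_non_pos, idx_pos], [non_pos, pos])
```

The positive entries of result now sum to 1 (the softmax was computed over them alone), while the non-positive entries keep their original values.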
Previous answer

This answer is valid only if you want softmax to be computed over all elements of T but applied only to those greater than 0. Using tf.where():
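A sketch of that variant, again in TF 2.x eager style: the softmax is taken over the whole tensor, and tf.where() then keeps the softmax value only where T > 0, falling back to the original value elsewhere.

```python
import tensorflow as tf

T = tf.random.normal([100])

# Softmax over ALL elements of T...
soft_all = tf.nn.softmax(T)

# ...but only applied where T > 0; non-positive entries keep their value.
result = tf.where(T > 0.0, soft_all, T)
```

Here the selected entries generally do not sum to 1, since the normalization in the softmax included the non-positive elements as well.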