Tensorflow embedding lookup with unequal-sized lists

Posted 2019-04-10 13:22

Question:

Hey guys, I'm trying to project multi-labeled categorical data into a dense space using embeddings.

Here's a toy example. Let's say I have four categories and want to project them into a 2D space. Furthermore, I have two instances, the first belonging to category 0 and the second to category 1.

The code will look something like this:

import tensorflow as tf

sess = tf.InteractiveSession()
# Four categories, each embedded into a 2-D space.
embeddings = tf.Variable(tf.random_uniform([4, 2], -1.0, 1.0))
sess.run(tf.global_variables_initializer())
# Look up the embedding vectors for categories 0 and 1.
y = tf.nn.embedding_lookup(embeddings, [0, 1])
y.eval()

and returns something like this:

array([[ 0.93999457, -0.83051205],
       [-0.1699729 ,  0.73936272]], dtype=float32)

So far, so good. Now imagine an instance belongs to two categories. The embedding lookup will return two vectors, which I can reduce, for example by taking the mean:

# Each instance now belongs to two categories.
y = tf.nn.embedding_lookup(embeddings, [[0, 1], [1, 2]])
# Average the category embeddings per instance.
y_ = tf.reduce_mean(y, axis=1)
y_.eval()

This also works just as I expect. My problem arises when the instances in my batch do not belong to the same number of categories, e.g.:

y = tf.nn.embedding_lookup(embeddings, [[0,1],[1,2,3]]) # unequal sized lists
y_ = tf.reduce_mean(y, axis=1)
y_.eval()

ValueError: Argument must be a dense tensor: [[0, 1], [1, 2, 3]] - got shape [2], but wanted [2, 2].

I understand that the ragged list [[0, 1], [1, 2, 3]] can't be converted into a dense tensor, but any idea how to get around this problem?
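
For reference, here's a rough sketch of one direction I've been looking at, reusing the embeddings variable from above: encoding the ragged category lists as a tf.SparseTensor and letting tf.nn.embedding_lookup_sparse with combiner='mean' do the per-instance averaging. I'm not sure this is the idiomatic way to handle it, so corrections are welcome:

# Encode the ragged lists [[0, 1], [1, 2, 3]] as a 2 x 3 SparseTensor of ids.
sp_ids = tf.SparseTensor(
    indices=[[0, 0], [0, 1], [1, 0], [1, 1], [1, 2]],
    values=tf.constant([0, 1, 1, 2, 3], dtype=tf.int64),
    dense_shape=[2, 3])

# combiner='mean' averages each row's embeddings, like reduce_mean above.
y_ = tf.nn.embedding_lookup_sparse(embeddings, sp_ids, None, combiner='mean')
y_.eval()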