TensorFlow op in Keras model


I'm trying to use a TensorFlow op inside a Keras model. I previously tried to wrap it with a Lambda layer, but I believe this disables backpropagation through that layer.

More specifically, I'm trying to use the layers from here in a Keras model without porting them to Keras layers (I hope to deploy to TensorFlow later on). I can compile these layers into a shared library and load them into Python. This gives me TensorFlow ops, but I don't know how to combine them into a Keras model.
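For concreteness, this is roughly where I am at the moment (the library and op names below are placeholders, not the real ones):

    import tensorflow as tf
    from tensorflow.keras import layers

    # Placeholder names: the real .so and op come from the compiled layers above.
    my_module = tf.load_op_library('./my_layers.so')

    # My earlier attempt: wrap the loaded op in a Lambda layer. This builds,
    # but I am unsure whether gradients flow through it during training.
    wrapped = layers.Lambda(lambda x: my_module.my_op(x))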

A simple example of a Keras MNIST model in which, say, one Conv2D layer is replaced by a tf.nn.conv2d op would be exactly what I'm looking for.
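Something along these lines is what I'm picturing (purely illustrative; the filter handling is probably not quite right):

    import tensorflow as tf
    from tensorflow.keras import layers, models

    # Illustrative only: the filter shapes are made up, and I'm not sure whether
    # a variable captured like this by a Lambda layer would actually be trained.
    conv_filters = tf.Variable(tf.random.normal([3, 3, 1, 32]))

    def raw_conv(x):
        y = tf.nn.conv2d(x, conv_filters, strides=[1, 1, 1, 1], padding='SAME')
        return tf.nn.relu(y)

    model = models.Sequential([
        layers.InputLayer(input_shape=(28, 28, 1)),
        layers.Lambda(raw_conv),        # tf.nn.conv2d instead of layers.Conv2D
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(10, activation='softmax'),
    ])
    model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')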

I've seen this tutorial, but it appears to do the opposite of what I'm after: it inserts Keras layers into a TensorFlow graph, whereas I want to insert TensorFlow ops into a Keras model.

Best regards, Hans

1 Answer

Roughly two weeks have passed and it seems I am able to answer my own question now.

It seems that TensorFlow can look up gradients for custom ops if you register them using this decorator. As of writing, this functionality is not (yet) available in C++, which is what I was looking for. A workaround is to define a normal op in C++ and wrap it in a Python method using the mentioned decorator. Once these functions and their corresponding gradients are registered with TensorFlow, backpropagation happens 'automagically'.
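For anyone landing here later, a minimal sketch of what I mean, assuming the decorator in question is tf.custom_gradient (tf.RegisterGradient is the closely related alternative that registers a gradient per op name); the library and op names are placeholders for the real compiled ops:

    import tensorflow as tf
    from tensorflow.keras import layers, models

    # Sketch only: 'my_layers.so', 'my_op' and 'my_op_grad' are placeholder
    # names for the compiled C++ ops.
    my_module = tf.load_op_library('./my_layers.so')

    @tf.custom_gradient
    def my_op_with_grad(x):
        y = my_module.my_op(x)              # forward pass: the C++ op

        def grad(dy):
            # The gradient can be another op from the same library,
            # or be built out of ordinary TensorFlow ops.
            return my_module.my_op_grad(x, dy)

        return y, grad

    # With the gradient wired up, the wrapped op can be dropped into a Keras
    # model (here via a Lambda layer) and backpropagation flows through it.
    model = models.Sequential([
        layers.InputLayer(input_shape=(28, 28, 1)),
        layers.Lambda(my_op_with_grad),
        layers.Flatten(),
        layers.Dense(10, activation='softmax'),
    ])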
