Has anyone been able to mix feedforward layers and recurrent layers in TensorFlow?
For example: input->conv->GRU->linear->output
I imagine one could define a custom cell containing feedforward layers and no state, which could then be stacked using the MultiRNNCell function, something like:

```python
cell = tf.nn.rnn_cell.MultiRNNCell([conv_cell, GRU_cell, linear_cell])
```
This would make life a whole lot easier...
This tutorial gives an example of how to use convolutional layers together with recurrent ones. For example, suppose the last convolutional layers look like this:
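A sketch of what that could look like (TF 1.x APIs; `images` and the layer sizes are placeholders, not taken from the tutorial):

```python
import tensorflow as tf

# images: a batch of inputs, [batch_size, height, width, channels]
conv = tf.layers.conv2d(images, filters=32, kernel_size=5,
                        padding='same', activation=tf.nn.relu)
pool = tf.layers.max_pooling2d(conv, pool_size=2, strides=2)
# flatten the feature maps into one vector per example
cnn_output = tf.layers.flatten(pool)
```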
and an RNN cell defined along these lines:
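For instance (again a sketch; `sequences` and the unit count are placeholders):

```python
cell = tf.nn.rnn_cell.GRUCell(num_units=128)
# sequences: [batch_size, time_steps, feature_dim]
rnn_outputs, rnn_state = tf.nn.dynamic_rnn(cell, sequences, dtype=tf.float32)
```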
You can concatenate both outputs and use the result as the input to the next layer:
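For example, assuming both tensors are rank-2 `[batch_size, features]` (`num_classes` is a placeholder):

```python
combined = tf.concat([cnn_output, rnn_state], axis=1)
logits = tf.layers.dense(combined, units=num_classes)
```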
Or you can just use the output of the CNN layer as the input to the RNN cell:
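For example, by reshaping the flattened features into a sequence (a sketch; `batch_size`, `time_steps`, and `feature_dim` are assumptions about your data layout):

```python
rnn_inputs = tf.reshape(cnn_output, [batch_size, time_steps, feature_dim])
rnn_outputs, rnn_state = tf.nn.dynamic_rnn(cell, rnn_inputs, dtype=tf.float32)
```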
Can't you just do the following:
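Something along these lines, presumably (a sketch; shapes and sizes are placeholders):

```python
import tensorflow as tf

# inputs: per-frame images, [batch_size * time_steps, height, width, channels]
x = tf.layers.conv2d(inputs, filters=32, kernel_size=3,
                     padding='same', activation=tf.nn.relu)
x = tf.layers.flatten(x)
# regroup the per-frame feature vectors into sequences
x = tf.reshape(x, [batch_size, time_steps, -1])
x, _ = tf.nn.dynamic_rnn(tf.nn.rnn_cell.GRUCell(64), x, dtype=tf.float32)
x = tf.layers.dense(x, units=num_classes)  # linear output layer
```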
etc.
This is what I have so far; improvements welcome:
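Roughly, a wrapper that presents a stateless layer as an RNN cell with a dummy state (a sketch; the class name and the explicit `output_size` argument are arbitrary choices, not a TensorFlow API):

```python
import tensorflow as tf

class FeedforwardWrapper(tf.nn.rnn_cell.RNNCell):
    """Wraps a stateless (feedforward) layer so it can be stacked
    in a MultiRNNCell next to real recurrent cells."""

    def __init__(self, layer, output_size):
        super(FeedforwardWrapper, self).__init__()
        self._layer = layer          # e.g. a tf.layers.Dense instance
        self._output_size = output_size

    @property
    def state_size(self):
        return 1  # dummy state; the wrapped layer keeps none

    @property
    def output_size(self):
        return self._output_size

    def call(self, inputs, state):
        # apply the feedforward layer and pass the dummy state through
        return self._layer(inputs), state
```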
(Naturally, don't wrap recurrent layers with it, since their state-keeping would be destroyed.)
Seems to work with: tf.layers.Conv2D, tf.keras.layers.Conv2D, tf.keras.layers.Activation, tf.layers.BatchNormalization
Does NOT work with: tf.keras.layers.BatchNormalization. At least it failed for me when using it in a tf.while loop, complaining about combining variables from different frames, similar to here. Maybe Keras uses tf.Variable() instead of tf.get_variable()...?
Usage:
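For example, a dense -> GRU -> dense stack (sizes are placeholders; `inputs` is `[batch_size, time_steps, feature_dim]`):

```python
cells = [
    FeedforwardWrapper(tf.layers.Dense(128, activation=tf.nn.relu), output_size=128),
    tf.nn.rnn_cell.GRUCell(num_units=64),
    FeedforwardWrapper(tf.layers.Dense(10), output_size=10),
]
stack = tf.nn.rnn_cell.MultiRNNCell(cells)
outputs, state = tf.nn.dynamic_rnn(stack, inputs, dtype=tf.float32)
```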