I want to build a neural network where the first two layers are feedforward and the last one is recurrent. Here is my code:
from keras.models import Sequential
from keras.layers import Dense, SimpleRNN
from keras import optimizers as OP

model = Sequential()
model.add(Dense(150, input_dim=23, init='normal', activation='relu'))
model.add(Dense(80, activation='relu', init='normal'))
model.add(SimpleRNN(2, init='normal'))
adam = OP.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08)
model.compile(loss="mean_squared_error", optimizer="rmsprop")
and I get this error:
Exception: Input 0 is incompatible with layer simplernn_11: expected ndim=3, found ndim=2.
It is correct that in Keras a recurrent layer expects its input as (nb_samples, time_steps, input_dim). However, you can still add an RNN layer after a Dense layer if you reshape the Dense output into that 3-D form first. Reshape can be used both as the first layer and as an intermediate layer in a Sequential model. Examples are given below.

Reshape as first layer in a Sequential model
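A minimal sketch of Reshape used as the first layer; the target shape (3, 4) and input_shape=(12,) are just illustrative values:

from keras.models import Sequential
from keras.layers import Reshape

model = Sequential()
# Reshape as the first layer: input_shape describes the flat input,
# and the layer emits the target shape (the batch dimension stays implicit)
model.add(Reshape((3, 4), input_shape=(12,)))
# now model.output_shape == (None, 3, 4), where None is the batch dimension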
Reshape as an intermediate layer in a Sequential model
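And a sketch of Reshape used in the middle of a model; again the layer sizes are arbitrary examples:

from keras.models import Sequential
from keras.layers import Dense, Reshape

model = Sequential()
model.add(Dense(12, input_dim=23))
# Reshape as an intermediate layer: turn the 2-D Dense output (None, 12)
# into a 3-D tensor suitable for a recurrent layer
model.add(Reshape((3, 4)))
# now model.output_shape == (None, 3, 4)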
For example, if you change your code in the following way, there will be no error. I checked it and the model compiled without any error being reported. You can change the dimensions to suit your needs.
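A sketch of the corrected model, keeping the Keras 1 style API from the question; the key change is the Reshape layer, which turns the 2-D Dense output into a 3-D tensor with a single time step (the (1, 80) target shape is one reasonable choice, not the only one):

from keras.models import Sequential
from keras.layers import Dense, SimpleRNN, Reshape
from keras import optimizers as OP

model = Sequential()
model.add(Dense(150, input_dim=23, init='normal', activation='relu'))
model.add(Dense(80, activation='relu', init='normal'))
# reshape the (nb_samples, 80) Dense output to (nb_samples, 1 time step, 80 features)
model.add(Reshape((1, 80)))
model.add(SimpleRNN(2, init='normal'))
adam = OP.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08)
model.compile(loss='mean_squared_error', optimizer=adam)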
In Keras, you cannot put a recurrent layer directly after a Dense layer, because a Dense layer produces a 2-D output of shape (nb_samples, output_dim), while a recurrent layer expects a 3-D input of shape (nb_samples, time_steps, input_dim). You can, however, do the reverse, i.e. put a Dense layer after a recurrent layer.
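As an illustration of the reverse ordering, here is a minimal sketch (the unit counts and the 10-step, 23-feature input shape are arbitrary choices, not taken from the question):

from keras.models import Sequential
from keras.layers import SimpleRNN, Dense

model = Sequential()
# the recurrent layer consumes a 3-D input: (nb_samples, 10 time steps, 23 features)
model.add(SimpleRNN(32, input_shape=(10, 23), init='normal'))
# its 2-D output (nb_samples, 32) feeds a Dense layer with no reshaping needed
model.add(Dense(2, init='normal'))
model.compile(loss='mean_squared_error', optimizer='rmsprop')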