I got the following error when I tried to train an MLP model in Keras (I am using Keras version 1.2.2):
Error when checking model input: the list of Numpy arrays that you are passing to your model is not the size the model expected. Expected to see 1 arrays but instead got the following list of 12859 arrays:
This is the summary of the model:
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to
====================================================================================================
dense_1 (Dense)                  (None, 20)            4020        dense_input_1[0][0]
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 2)             42          dense_1[0][0]
====================================================================================================
Total params: 4,062
Trainable params: 4,062
Non-trainable params: 0
____________________________________________________________________________________________________
None
This is the first layer of the model:
model.add(Dense(20, input_shape=(200,), init='lecun_uniform', activation='tanh'))
For training:
model.fit(X,Y,nb_epoch=100,verbose=1)
where X is a list of elements, each of which is in turn a list of 200 values.
Edit:
I also tried
model.add(Dense(20, input_shape=(12859,200), init='lecun_uniform', activation='tanh'))
but I am getting the same error.
Your error comes from the fact that your X for some reason wasn't transformed to a numpy.array. In this case your X is treated as a list of rows, and that is the reason behind your error message (the model expected one input array but instead got a list with one element per row). The transformation is shown below. I would also check the data loading process, because something might be going wrong there.
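The conversion itself is a one-liner. A minimal sketch, assuming X and Y are the Python lists described in the question:

import numpy as np

# Convert the nested lists to numpy arrays before calling fit.
X = np.array(X)  # shape (12859, 200), per the question
Y = np.array(Y)  # assumed to match the model's (None, 2) output
model.fit(X, Y, nb_epoch=100, verbose=1)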
UPDATE:
As was mentioned in a comment, input_shape needs to be changed to input_dim.
UPDATE 2:
In order to keep input_shape, one should change it to input_shape=(200,).
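For concreteness, a sketch of the two equivalent first layers under the Keras 1.x API (parameter values taken from the question):

model.add(Dense(20, input_dim=200, init='lecun_uniform', activation='tanh'))
# or, equivalently, keeping input_shape:
model.add(Dense(20, input_shape=(200,), init='lecun_uniform', activation='tanh'))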
I fixed mine by applying numpy.array() to train_X, train_Y, valid_X and valid_Y.
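For example, a minimal sketch assuming these variables are plain Python lists:

import numpy as np

train_X = np.array(train_X)
train_Y = np.array(train_Y)
valid_X = np.array(valid_X)
valid_Y = np.array(valid_Y)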
I got the help from here. This approach is likely to run slowly, because all of the data has to be converted to numpy arrays, and that can be a lot of work for your system.