I was running sketch_rnn.ipynb in my Jupyter notebook. When loading the environment to load the trained dataset, it returned the error 'Object arrays cannot be loaded when allow_pickle=False'.
This is the code already used by the Google developers of the sketch_rnn algorithm, and it runs in Google Colab. In the past I have run it myself on Google Colab and it worked, but it does not seem to work in my own Jupyter notebook.
from magenta.models.sketch_rnn.sketch_rnn_train import *
from magenta.models.sketch_rnn.model import *
from magenta.models.sketch_rnn.utils import *
from magenta.models.sketch_rnn.rnn import *

# The lines below are pasted from inside one of the notebook's helper functions
# (the function definition and the earlier part of its body are not shown here):
model_params.batch_size = 1  # only sample one sketch at a time
eval_model_params = sketch_rnn_model.copy_hparams(model_params)
eval_model_params.use_input_dropout = 0
eval_model_params.use_recurrent_dropout = 0
eval_model_params.use_output_dropout = 0
eval_model_params.is_training = 0
sample_model_params = sketch_rnn_model.copy_hparams(eval_model_params)
sample_model_params.max_seq_len = 1  # sample one point at a time
return [model_params, eval_model_params, sample_model_params]

# Loading the environment is the step that raises the error:
[train_set, valid_set, test_set, hps_model, eval_hps_model,
 sample_hps_model] = load_env_compatible(data_dir, model_dir)
I expected the output to be:
INFO:tensorflow:Downloading http://github.com/hardmaru/sketch-rnn-datasets/raw/master/aaron_sheep/aaron_sheep.npz
INFO:tensorflow:Loaded 7400/300/300 from aaron_sheep.npz
INFO:tensorflow:Dataset combined: 8000 (7400/300/300), avg len 125
INFO:tensorflow:model_params.max_seq_len 250.
total images <= max_seq_len is 7400
total images <= max_seq_len is 300
total images <= max_seq_len is 300
INFO:tensorflow:normalizing_scale_factor 18.5198.
But instead it gave me:
ValueError: Object arrays cannot be loaded when allow_pickle=False
Use allow_pickle=True as one of the arguments to np.load().
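For context, this ValueError comes from numpy itself: since numpy 1.16.3, np.load() defaults allow_pickle to False, and object-dtype arrays (which is how variable-length stroke sequences like the ones in the .npz are typically stored) can only be loaded with pickling enabled. A minimal sketch of that behaviour, using a throwaway file name rather than the sketch-rnn dataset:

import numpy as np

# Build and save an object-dtype array (variable-length sequences force dtype=object).
arr = np.empty(2, dtype=object)
arr[0], arr[1] = [1, 2, 3], [4, 5]
np.save('demo_strokes.npy', arr)  # 'demo_strokes.npy' is just an illustrative file name

try:
    np.load('demo_strokes.npy')  # numpy >= 1.16.3: allow_pickle defaults to False
except ValueError as err:
    print(err)  # "Object arrays cannot be loaded when allow_pickle=False"

loaded = np.load('demo_strokes.npy', allow_pickle=True)  # opting in loads it fine
print(loaded)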
So I believe this has just surfaced due to a change in numpy's load(). If you look at the line where the error occurs, it references something like np.load(path), but the Keras source code, for example here at line 58: https://github.com/keras-team/keras/blob/master/keras/datasets/imdb.py, now passes the extra argument explicitly, i.e. np.load(path) becomes np.load(path, boolean). From brief reading, this restriction on pickles has to do with security, since pickles can contain arbitrary Python code that would be executed when the file is loaded (possibly similar to the way SQL injections are performed). After updating np.load with the new parameter list, it is working for my project.
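If you cannot (or would rather not) edit the library code that performs the load, a workaround often used with this notebook is to temporarily wrap np.load so it passes allow_pickle=True to every call, run the failing cell, and then restore it. This is only a sketch of that idea; it assumes the library call sites do not already pass allow_pickle themselves, and it should only be used for files you trust, since enabling pickles lets arbitrary Python objects be deserialized.

import numpy as np

# Keep a reference to the original loader, then shadow it with a wrapper
# that forces allow_pickle=True for calls made inside the library code.
np_load_old = np.load
np.load = lambda *args, **kwargs: np_load_old(*args, allow_pickle=True, **kwargs)

# Re-run the cell that failed, e.g.:
[train_set, valid_set, test_set, hps_model, eval_hps_model,
 sample_hps_model] = load_env_compatible(data_dir, model_dir)

np.load = np_load_old  # restore numpy's default behaviour afterwards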
On my side the problem was solved by simply downgrading numpy: the error comes from a change inside numpy itself (np.load() no longer loads pickled object arrays by default), rather than from the notebook code.
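For reference, "downgrading" here usually means pinning numpy to a release from before 1.16.3 (the version in which allow_pickle started defaulting to False); 1.16.1 is a commonly chosen pin. From a Jupyter notebook this can be done in a cell, for example:

!pip install numpy==1.16.1
# restart the kernel afterwards so the downgraded numpy is the one that gets imported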