Checkpoints in Google Colab

Posted 2019-06-08 05:09

How do I save my trained model on Google Colab and later retrieve it to my local disk? Will checkpoints work? How do I save them and retrieve them after some time? Example code would be great.

2 Answers
虎瘦雄心在
#2 · 2019-06-08 05:48

Google Colab instances are created when you open the notebook and are deleted later on, so data does not persist across runs. If you want to download the trained model to your local machine, you can use:

from google.colab import files
files.download(<filename>)
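
For example, a minimal sketch assuming a trained Keras model named myModel and the filename my_model.h5 (both names are placeholders, not from the answer):

from google.colab import files

myModel.save('my_model.h5')      # save the trained Keras model to a single HDF5 file
files.download('my_model.h5')    # trigger a browser download to the local disk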

Similarly, if you want to upload the model from your local machine, you can do:

from google.colab import files
uploaded = files.upload()  # opens a file picker; note that upload() takes no filename argument
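
A minimal sketch of what to do with the returned dict (the filename my_model.h5 is an assumption): it maps each uploaded filename to the file's bytes, which can be written to the Colab filesystem before loading the model.

import tensorflow as tf

# 'uploaded' is the dict returned by files.upload() above; it maps filenames to bytes
for name, data in uploaded.items():
    with open(name, 'wb') as f:   # write each uploaded file to the Colab VM's disk
        f.write(data)

myModel = tf.keras.models.load_model('my_model.h5')  # assumed filename and Keras model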

Another possible (and, in my opinion, better) solution is to use a GitHub repo to store your models: simply commit and push the models, then clone the repo later to get them back.
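
For example, from a Colab cell (the repository URL and file names below are placeholders, not real values); note that pushing from Colab requires Git credentials, e.g. a personal access token:

# Lines starting with '!' run as shell commands in a Colab cell.
!git clone https://github.com/<user>/<repo>.git
!cp my_model.h5 <repo>/
!cd <repo> && git add my_model.h5 && git commit -m "Add trained model" && git push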

干净又极端
#3 · 2019-06-08 06:14

OK, this works for me:

import os
from tensorflow.keras.callbacks import ModelCheckpoint

checkpoint_path = "training_1/cp.ckpt"
checkpoint_dir = os.path.dirname(checkpoint_path)

# Create a checkpoint callback (use 'val_accuracy' instead of 'val_acc' on newer TF versions)
cp_callback = ModelCheckpoint(checkpoint_path,
                              monitor='val_acc',
                              save_best_only=True,
                              save_weights_only=True,
                              verbose=1)

network_fit = myModel.fit(x, y, batch_size=25, epochs=20,
                          validation_split=0.2,   # validation data is needed for val_acc
                          callbacks=[cp_callback])

This callback monitors val_acc and saves the weights whenever it improves (with save_best_only=True, worse epochs are skipped). You can then load those weights back into the model with:

myModel.load_weights(checkpoint_path)
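
As a further sketch (not part of the answer above): if you let the callback write one checkpoint per epoch, e.g. checkpoint_path = "training_1/cp-{epoch:04d}.ckpt", the most recent one can be located with tf.train.latest_checkpoint:

import tensorflow as tf

latest = tf.train.latest_checkpoint(checkpoint_dir)   # newest checkpoint in the directory
if latest is not None:
    myModel.load_weights(latest)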

You can check how to use it here https://colab.research.google.com/github/tensorflow/models/blob/master/samples/core/tutorials/keras/save_and_restore_models.ipynb#scrollTo=gXG5FVKFOVQ3
