I would like to train a CNN on a large dataset. Currently I load all the data into a tf.constant and then loop through it with a small batch size inside a tf.Session(). That works fine for a small fraction of the dataset, but when I increase the input size I get the error:
ValueError: Cannot create a tensor proto whose content is larger than 2GB.
How can I avoid that?
Do not load the data into a constant; it will become part of your computational graph.
You should instead:
- create an op that loads the data on demand, in a streaming fashion (for example an input pipeline such as the tf.data API), or
- keep the data in Python and pass each small batch into the graph through a tf.placeholder and feed_dict (see the sketch below).
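As a minimal sketch of the second option: only one batch at a time is fed into placeholders, so the dataset itself never enters the graph. The arrays, shapes, and the single dense layer here are made-up stand-ins; replace them with your actual data and CNN.

```python
import numpy as np
import tensorflow as tf  # TensorFlow 1.x

# Hypothetical arrays standing in for the real dataset.
X_np = np.random.rand(10000, 32).astype(np.float32)
y_np = np.random.rand(10000, 1).astype(np.float32)

batch_size = 64
x = tf.placeholder(tf.float32, shape=(None, 32))
y = tf.placeholder(tf.float32, shape=(None, 1))

# Toy model so the loop is runnable; substitute your CNN here.
pred = tf.layers.dense(x, 1)
loss = tf.losses.mean_squared_error(y, pred)
train_op = tf.train.AdamOptimizer().minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Only one small batch is materialized per step.
    for start in range(0, len(X_np), batch_size):
        sess.run(train_op, feed_dict={
            x: X_np[start:start + batch_size],
            y: y_np[start:start + batch_size],
        })
```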
For TensorFlow 1.x and Python 3, here is a simple solution: initialize a tf.Variable from a tf.placeholder, so the array is fed in at run time instead of being serialized into the graph definition.
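A minimal sketch of that approach; X_np is a stand-in for your actual NumPy array:

```python
import numpy as np
import tensorflow as tf  # TensorFlow 1.x

X_np = np.random.rand(100000, 32).astype(np.float32)  # stand-in for the real data

# Initialize a Variable from a placeholder: the array is passed via
# feed_dict at initialization time, so it is never embedded in the
# graph proto and the 2GB limit does not apply.
X_init = tf.placeholder(tf.float32, shape=X_np.shape)
X = tf.Variable(X_init)

sess = tf.Session()
sess.run(X.initializer, feed_dict={X_init: X_np})
```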
In practice, you will usually create an explicit Graph and Session for repeated computation; the following code should help:
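A sketch of the same idea with an explicit graph, again using a small random array in place of the real dataset:

```python
import numpy as np
import tensorflow as tf  # TensorFlow 1.x

X_np = np.random.rand(4, 3).astype(np.float32)  # stand-in for the real dataset

graph = tf.Graph()
with graph.as_default():
    # Build the placeholder and Variable inside the explicit graph.
    X_init = tf.placeholder(tf.float32, shape=X_np.shape)
    X = tf.Variable(X_init)

# Bind the session to that graph, then initialize the Variable by
# feeding the array in, keeping it out of the serialized graph.
sess = tf.Session(graph=graph)
sess.run(X.initializer, feed_dict={X_init: X_np})
```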