I'm using TensorFlow to preprocess some large images. I was having a problem where memory usage grew rapidly, so I turned to Python's multiprocessing so that the memory would be freed entirely whenever I want.
The thing is, I'm using Python's multiprocessing queues, and for some unknown reason I can't pass my TensorFlow session from the parent process to the children. Using some advanced debugging techniques (i.e. printing something every few lines) I noticed that Python just goes idle on the line where I actually use the session; it doesn't throw an error message.
My code looks something like this:
import tensorflow as tf
from multiprocessing import Process, Queue

def subprocess(some_image, sess, q):
    with sess.as_default():
        # ... use sess and q ...
        print "All good and well"        # This is printed
        some_image.eval()                # Nothing happens here in console
        print "Still all good and well"  # This is not printed

if __name__ == '__main__':
    # ... some initial operations ...
    some_image = read_some_image()
    sess = tf.Session()
    q = Queue()
    q.put(something)
    p = Process(target=subprocess, args=(some_image, sess, q))
    p.start()
    p.join()
What could be the problem? Many thanks!
All you need is distributed TensorFlow.
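Here is a minimal sketch of that idea, assuming the TF 1.x API: the parent starts an in-process server with tf.train.Server.create_local_server() and passes its target to the child. The target is a plain string, so it pickles fine, and the child opens its own session against it. The toy array and reduce_mean computation are hypothetical stand-ins for your real preprocessing:

    import multiprocessing
    import tensorflow as tf

    def worker(server_target, q):
        # The child builds its own graph and its own session; the
        # session connects to the shared in-process server via the
        # target string received from the parent.
        image = tf.constant(q.get())
        mean = tf.reduce_mean(image)
        with tf.Session(server_target) as sess:
            print(sess.run(mean))

    if __name__ == '__main__':
        # An in-process server whose target string can be sent to children.
        server = tf.train.Server.create_local_server()
        q = multiprocessing.Queue()
        q.put([[1.0, 2.0], [3.0, 4.0]])
        p = multiprocessing.Process(target=worker, args=(server.target, q))
        p.start()
        p.join()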
I don't think you can share "state" such as a tf.Session() between processes like that. I would think that each process needs its own session.
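A minimal sketch along those lines (with a hypothetical toy array standing in for the real image data): only plain, picklable data goes through the queue, and the graph and session are built entirely inside the child process.

    import multiprocessing
    import tensorflow as tf

    def worker(q):
        # Build the graph and the session inside the child process;
        # nothing TensorFlow-related ever crosses the process boundary.
        image = tf.constant(q.get())
        mean = tf.reduce_mean(image)
        with tf.Session() as sess:
            print(sess.run(mean))

    if __name__ == '__main__':
        q = multiprocessing.Queue()
        q.put([[1.0, 2.0], [3.0, 4.0]])
        p = multiprocessing.Process(target=worker, args=(q,))
        p.start()
        p.join()

When the child exits, all of its TensorFlow memory is released with it, which is exactly the behavior the question is after.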