Neither TensorFlow nor Theano seems to support cyclic computational graphs: cyclic elements are implemented as recurrent cells with a buffer and unrolling (RNN / LSTM cells), but this limitation is mostly related to the computation of back-propagation. I don't have a particular need to compute back-propagation, just the forward propagation.
Is there a way to work around this limitation, or perhaps to break arbitrary computational graphs down into acyclic components?
TensorFlow does support cyclic computation graphs. The tf.while_loop() function allows you to specify a while loop with arbitrary subgraphs for the condition and the body of the loop, and the runtime can execute independent loop iterations in parallel.
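For a forward-only computation like yours, here is a minimal sketch of tf.while_loop() (assuming TensorFlow 2.x with eager execution; the values are illustrative):

    import tensorflow as tf

    # Repeatedly halve x until it drops below 1, counting the iterations.
    # The condition and body can be arbitrary subgraphs over the loop variables.
    def cond(i, x):
        return x > 1.0

    def body(i, x):
        return [i + 1, x / 2.0]

    i, x = tf.while_loop(cond, body, [tf.constant(0), tf.constant(100.0)])
    print(i.numpy(), x.numpy())  # 7 0.78125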
The tf.scan() function is a higher-level API that is similar to Theano's theano.scan() function. Both allow you to loop over tensors of dynamic size.
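And a sketch of tf.scan() computing a running sum (again with illustrative values):

    import tensorflow as tf

    elems = tf.constant([1.0, 2.0, 3.0, 4.0])
    # The callable receives (accumulator, current element) and returns the new
    # accumulator; tf.scan stacks the accumulator value from each step.
    running_sum = tf.scan(lambda acc, x: acc + x, elems,
                          initializer=tf.constant(0.0))
    print(running_sum.numpy())  # [ 1.  3.  6. 10.]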