Neither TensorFlow nor Theano seems to support cyclic computational graphs; cyclic elements are instead implemented as recurrent cells with a state buffer and unrolling (RNN/LSTM cells). This limitation appears to be mostly related to the computation of back-propagation. I have no need for back-propagation here, only forward propagation.
Is there a way to work around this limitation, or perhaps to break an arbitrary computational graph down into acyclic components?
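To illustrate the workaround I have in mind, here is a minimal forward-only sketch in plain NumPy (the function `f` and its parameters are purely hypothetical): the cycle `x -> f(x) -> x` is broken by a buffer holding the previous step's value, so each individual step is an acyclic graph that is simply evaluated repeatedly.

```python
import numpy as np

def f(x, w):
    # Illustrative per-step computation; any acyclic graph would do here.
    return np.tanh(w * x)

state = np.array(0.5)   # buffer carrying the value fed back around the cycle
w = 1.2
for _ in range(10):     # "unrolling": forward-only repeated evaluation
    state = f(state, w)
```

This is essentially what RNN cells do, minus the gradient machinery; the question is whether the frameworks let you express it for an arbitrary graph rather than only for their built-in recurrent cells.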