I was working on hyperparameter optimization for a neural network, running the model for 20 epochs. After finding the best hyperparameters, I ran the same model again on its own (with no hyperparameter optimization), but I got different results. Moreover, I noticed that the accuracy reached during hyperparameter optimization occurred at the last (20th) epoch, whereas when I ran the same model again, that accuracy was not reached until around 200 epochs, and even then the value was slightly lower. Below is the figure:
Therefore, I would like to know what random seed TensorFlow chose at that moment. To be clear, I am not interested in setting the random seed to a certain constant; I would like to see what was chosen by TensorFlow.
Your help is much appreciated!!
This question is very similar, but it does not have an answer, see the comments thread. In general, you cannot "extract the seed" at any given time, because there is no seed once the RNG has started working.
If you just want to see the initial seed, you need to understand that there are graph-level and op-level seeds (see tf.set_random_seed, and the implementation in random_seed.py):
- If both are set then both are combined to produce the actual seed.
- If the graph seed is set but the op seed is not, the seed is determined deterministically from the graph seed and the "op id".
- If the op seed is set but the graph seed is not, then a default graph seed is used.
- If neither is set, then a random seed is produced. To see where this comes from, you would look at GuardedPhiloxRandom, which provides the two numbers that are finally used by PhiloxRandom. When no seed at all is provided, it picks two random values read from /dev/urandom, as seen in random.cc.
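The resolution rules above can be sketched in plain Python. This is a simplified model of the get_seed logic in random_seed.py, not the real function; the default graph seed 87654321 is the constant TensorFlow uses, and op_id stands in for the graph's internal operation counter:

```python
# Simplified model of TensorFlow's seed resolution (random_seed.py).
DEFAULT_GRAPH_SEED = 87654321  # TensorFlow's hard-coded default

def get_seed(op_seed, graph_seed, op_id):
    """Return the (seed, seed2) pair a random op would receive."""
    if graph_seed is not None:
        if op_seed is not None:
            return graph_seed, op_seed       # both set: combine both
        return graph_seed, op_id             # graph seed only: op id fills in
    if op_seed is not None:
        return DEFAULT_GRAPH_SEED, op_seed   # op seed only: default graph seed
    return None, None                        # neither: drawn from /dev/urandom

print(get_seed(100, None, op_id=1))   # (87654321, 100)
print(get_seed(None, 200, op_id=15))  # (200, 15)
print(get_seed(300, 200, op_id=16))   # (200, 300)
```

The printed pairs match the seed/seed2 attribute values shown in the snippet further down.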
You can actually see these seeds, by the way, when they are set. You just need to access the specific random operation that you are interested in and read its seed and seed2 attributes. Note that TensorFlow's public functions return the result of a few extra operations (scaling, shifting), so you have to "climb up" the graph a bit to get to the interesting one:
import tensorflow as tf

def print_seeds(random_normal):
    # Climb from the public output to the underlying RandomStandardNormal op
    # and print its seed attributes
    random_op = random_normal.op.inputs[0].op.inputs[0].op
    print(random_op.get_attr('seed'), random_op.get_attr('seed2'))

print_seeds(tf.random_normal(()))
# 0 0
print_seeds(tf.random_normal((), seed=100))
# 87654321 100
tf.set_random_seed(200)
print_seeds(tf.random_normal(()))
# 200 15
print_seeds(tf.random_normal((), seed=300))
# 200 300
Unfortunately, when the seed is unspecified there is no way to retrieve the random values generated by TensorFlow. The two random numbers are passed to PhiloxRandom, which uses them to initialize its internal key_ and counter_ variables, which cannot be read out in any way.
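For completeness: Philox is a counter-based generator, so every output word is a pure function of (key, counter). That is exactly why recovering those two hidden values would be enough to replay the whole stream, and why nothing else about the state needs to be stored. A rough sketch of Philox-4x32-10 below uses the published Random123 round constants; it is an illustration of the idea, not TensorFlow's production code:

```python
# Sketch of Philox-4x32-10, the counter-based RNG behind PhiloxRandom.
# Multiplier and Weyl constants are the published Random123 values.
M0, M1 = 0xD2511F53, 0xCD9E8D57  # round multipliers
W0, W1 = 0x9E3779B9, 0xBB67AE85  # Weyl increments applied to the key
MASK = 0xFFFFFFFF                # 32-bit wrap-around

def philox_4x32_10(counter, key):
    """Map a 4-word counter and 2-word key to 4 pseudo-random 32-bit words."""
    x = list(counter)
    k0, k1 = key
    for _ in range(10):          # 10 mixing rounds
        p0 = M0 * x[0]           # 64-bit products, split into hi/lo halves
        p1 = M1 * x[2]
        x = [(p1 >> 32) ^ x[1] ^ k0, p1 & MASK,
             (p0 >> 32) ^ x[3] ^ k1, p0 & MASK]
        k0, k1 = (k0 + W0) & MASK, (k1 + W1) & MASK
    return x

# The stream is fully determined by (key, counter): same inputs, same output,
# and with a fixed key the map over counters is a permutation.
a = philox_4x32_10((0, 0, 0, 0), (1, 2))
b = philox_4x32_10((0, 0, 0, 0), (1, 2))
c = philox_4x32_10((1, 0, 0, 0), (1, 2))  # bump the counter -> new block
print(a == b, a == c)  # True False
```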