I have a password generator:
import random, string

def gen_pass():
    # SystemRandom pulls its randomness from os.urandom (i.e. /dev/urandom)
    foo = random.SystemRandom()
    length = 64
    chars = string.letters + string.digits  # 62 possible characters
    return ''.join(foo.choice(chars) for _ in xrange(length))
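For reference, a quick back-of-the-envelope check (my own arithmetic, not anything from the docs) of how much entropy one of these passwords should contain if the underlying bits really are random:

import math

# 62 possible characters (letters + digits) and 64 positions:
# log2(62) is about 5.95 bits per character, so roughly 381 bits per password.
print 'bits per password: %.0f' % (math.log(62, 2) * 64)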
According to the docs, SystemRandom uses os.urandom, which reads /dev/urandom to produce random crypto bits. On Linux you can get random bits from /dev/urandom or /dev/random; they both use whatever entropy the kernel can get its hands on. The amount of entropy available can be checked with tail /proc/sys/kernel/random/entropy_avail, which will return a number like 129. The higher the number, the more entropy is available. The difference between /dev/urandom and /dev/random is that /dev/random will only spit out bits if entropy_avail is high enough (like at least 60), while /dev/urandom will always spit out bits. The docs say that /dev/urandom is good for crypto and you only have to use /dev/random for ssl certs and the like.
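For example, you can read the same number straight from Python instead of shelling out (Linux-only; just a convenience wrapper around the same /proc file):

def entropy_avail():
    # Kernel's estimate of available entropy, in bits.
    with open('/proc/sys/kernel/random/entropy_avail') as f:
        return int(f.read())

print entropy_avail()  # e.g. 129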
My question is: will gen_pass always be good for making strong, crypto-grade passwords? If I call this function as quickly as possible, will I stop getting strong crypto bits at some point because the entropy pool is depleted?
The question could also be phrased: why does /dev/urandom always produce strong crypto bits without caring about entropy_avail?
It is possible that /dev/urandom is designed so that its bandwidth is capped at a rate you could reasonably expect to be matched by incoming entropy, but this is speculation and I can't find an answer.
Also this is my first stackoverflow question, so please critique me. I am concerned that I gave too much background when someone who knows the answer probably already knows the background.
Thanks
update
I wrote some code to look at the entropy pool while /dev/urandom was being read from:
import subprocess
import time
from pygooglechart import Chart
from pygooglechart import SimpleLineChart
from pygooglechart import Axis

def check_entropy():
    # Read the kernel's current entropy estimate (in bits).
    arg = ['cat', '/proc/sys/kernel/random/entropy_avail']
    ps = subprocess.Popen(arg, stdout=subprocess.PIPE)
    return int(ps.communicate()[0])

def run(number_of_tests, resolution, entropy=[]):
    # Sample the entropy pool number_of_tests times, resolution seconds apart.
    i = 0
    while i < number_of_tests:
        time.sleep(resolution)
        entropy += [check_entropy()]
        i += 1
    graph(entropy, int(number_of_tests * resolution))

def graph(entropy, rng):
    max_y = 200
    chart = SimpleLineChart(600, 375, y_range=[0, max_y])
    chart.add_data(entropy)
    chart.set_colours(['0000FF'])
    left_axis = range(0, max_y + 1, 32)
    left_axis[0] = 'entropy'
    chart.set_axis_labels(Axis.LEFT, left_axis)
    chart.set_axis_labels(Axis.BOTTOM, ['time in seconds'] + get_x_axis(rng))
    chart.download('line-stripes.png')

def get_x_axis(rng):
    # Pick a tick spacing that keeps the x axis to at most 10 labels.
    global modnum
    if len(filter(lambda x: x % modnum == 0, range(rng + 1)[1:])) > 10:
        modnum += 1
        return get_x_axis(rng)
    return filter(lambda x: x % modnum == 0, range(rng + 1)[1:])

modnum = 1
run(500, .1)
If I run this and also run:

while 1 > 0:
    gen_pass()
Then I pretty reliably get a graph that looks like this:
Making the graph while running cat /dev/urandom looks similar, and with cat /dev/random the graph drops off to nothing and stays low very quickly (it also only reads out about a byte every 3 seconds or so).
update
If I run the same test but with six instances of gen_pass(), I get this:
So it looks like something is ensuring that I have enough entropy. I should measure the password generation rate and make sure it is actually being capped, because if it is not, then something fishy may be going on.
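A quick way to check that rate (my own sketch, not part of the test above): time how many passwords come out per second and see whether the number drops when the pool is low.

import time

def measure_rate(seconds=5):
    # Count how many passwords gen_pass() (from the top of the question)
    # produces in the given number of seconds.
    count = 0
    end = time.time() + seconds
    while time.time() < end:
        gen_pass()
        count += 1
    return count / float(seconds)

print 'passwords per second: %.1f' % measure_rate()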
update
I found this email chain
This says that urandom will stop pulling entropy once the pool only has 128 bits left in it. This is very consistent with the above results and means that in those tests I was often producing junk passwords.
My assumption before was that if entropy_avail was high enough (say above 64 bits) then the /dev/urandom output was good. That is not the case; it seems /dev/urandom was designed to leave extra entropy for /dev/random in case it needs it.
Now I need to find out how many true random bits a SystemRandom call needs.
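One way I can think of to estimate that is to count the bytes pulled from os.urandom while one password is generated. This is just a sketch and it leans on a CPython implementation detail (random.py imports os.urandom as the module-global _urandom), so treat the result as approximate:

import random, string

bytes_read = [0]
_real_urandom = random._urandom  # os.urandom as imported inside random.py

def counting_urandom(n):
    # Count every byte SystemRandom asks the OS for, then delegate.
    bytes_read[0] += n
    return _real_urandom(n)

random._urandom = counting_urandom

rng = random.SystemRandom()
chars = string.letters + string.digits
password = ''.join(rng.choice(chars) for _ in xrange(64))

random._urandom = _real_urandom
print '/dev/urandom bytes consumed for one password:', bytes_read[0]

On my reading of CPython 2.7's random.py, each choice() goes through SystemRandom.random(), which reads 7 bytes from os.urandom, but counting directly seems safer than trusting my reading.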