I want to know the theoretical time required to crack hashes built from different character sets.
For example, using only 7 lowercase US-ASCII letters, we know that there are 26^7 possible sequences. Knowing how many hashes a computer could generate each minute would give me an idea of how long it would take to generate all possible hashes and crack a given 7-character hash (birthday attacks aside).
Taking the number above, if a modern quad-core could generate 1 million hashes each minute, it would take 8,031,810,176 / 1,000,000 / 60 = 133.86 hours to find all possible hashes in that range.
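Just to make that arithmetic explicit, here is a tiny illustrative C snippet that reproduces the figure (the 1 million hashes per minute rate is only the assumed round number above, not a measurement):

    #include <stdio.h>

    int main(void)
    {
        double keyspace   = 8031810176.0;  /* 26^7: every 7-letter lowercase sequence */
        double per_minute = 1000000.0;     /* assumed rate: 1 million hashes/minute   */

        /* Time to enumerate the whole keyspace, in hours (~133.86). */
        printf("%.2f hours\n", keyspace / per_minute / 60.0);
        return 0;
    }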
Also, how do the new Sandy Bridge Intel chips with native AES instructions (AES-NI) play into this?
Remember that a GPU can hash 50x to 100x faster than a CPU. It's harder to program, but more efficient. See www.bitcointalk.com for numbers. I know I get 622 million SHA-256 hashes per second on a Radeon HD5830.
I wrote this test in C using the OpenSSL SHA256 implementation.
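What follows is a minimal sketch of such a test, assuming OpenSSL's one-shot SHA256() from <openssl/sha.h> (not the original listing): it simply hashes every 5-character lowercase string, i.e. 26^5 = 11,881,376 candidates, which is the count used in the timing below.

    #include <stdio.h>
    #include <openssl/sha.h>

    int main(void)
    {
        unsigned char digest[SHA256_DIGEST_LENGTH];
        char candidate[5];
        unsigned long count = 0;

        /* Enumerate and hash every 5-character lowercase string. */
        for (int a = 0; a < 26; a++)
            for (int b = 0; b < 26; b++)
                for (int c = 0; c < 26; c++)
                    for (int d = 0; d < 26; d++)
                        for (int e = 0; e < 26; e++) {
                            candidate[0] = 'a' + a;
                            candidate[1] = 'a' + b;
                            candidate[2] = 'a' + c;
                            candidate[3] = 'a' + d;
                            candidate[4] = 'a' + e;
                            SHA256((unsigned char *)candidate, 5, digest);
                            count++;
                        }

        printf("hashed %lu candidates\n", count);
        return 0;
    }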
Compile:
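Assuming the source file is named sha_test.c (a name picked here just for illustration), a typical command for code that links against OpenSSL's libcrypto is:

    gcc -O2 -o sha_test sha_test.c -lcrypto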
And results (I know, I'm not very creative with computer names): the run over all 26^5 = 11,881,376 five-character strings took about 182.4 seconds. So that's

11881376 / 182.4 = 65139

hashes per second. Then it's

26^7 / 65139 / 3600 = 34

hours to compute all the hashes. Please note, all of this was done on a Q6600 quad-core CPU in a single-threaded application, and it excluded writing the hashes to file.

EDIT: Whoops, I was calculating all the hashes of strings with N characters and below. Corrected and data updated.
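For comparison, here is a small illustrative C sketch of how that exhaustive-search time scales with the hash rate, using the single-threaded Q6600 figure above and the Radeon HD5830 figure quoted in the other answer (both numbers are taken from this page, not re-measured):

    #include <stdio.h>

    int main(void)
    {
        double keyspace = 8031810176.0;               /* 26^7 7-letter candidates */
        double rates[]  = { 65139.0, 622000000.0 };   /* hashes per second        */
        const char *who[] = { "Q6600, single thread", "Radeon HD5830 (claimed)" };

        /* Hours to enumerate the whole keyspace at each rate. */
        for (int i = 0; i < 2; i++)
            printf("%-24s %10.4f hours\n", who[i], keyspace / rates[i] / 3600.0);

        return 0;
    }

At the quoted GPU rate, the same 26^7 keyspace takes only about 13 seconds, which is why the GPU numbers matter so much here.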