Surely there must be a way to do this easily!
I've tried the Linux command-line apps such as sha1sum and md5sum, but they seem only to be able to compute hashes of individual files and output a list of hash values, one for each file.
I need to generate a single hash for the entire contents of a folder (not just the filenames).
I'd like to do something like
sha1sum /folder/of/stuff > singlehashvalue
Edit: To clarify, my files are at multiple levels in a directory tree; they're not all sitting in the same root folder.
Try doing it in two steps: create a file containing the hashes of every file in the folder, then hash that file.
Like so:
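A sketch of those two steps, assuming GNU find, sort and xargs (the folder path and temp file are just examples):

# step 1: hash every file, sorted so the order is stable, and save the list
find /folder/of/stuff -type f -print0 | sort -z | xargs -0 sha1sum > /tmp/hashes
# step 2: hash that list to get a single value
sha1sum /tmp/hashes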
Or do it all at once:
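The same idea as a single pipeline, again just a sketch:

# feed the sorted per-file hashes straight into a final sha1sum
find /folder/of/stuff -type f -print0 | sort -z | xargs -0 sha1sum | sha1sum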
If you just want to hash the contents of the files, ignoring the file names, then you can simply cat them together and hash the concatenation.
Make sure you have the files in the same order when computing the hash:
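A sketch of that, where FILES is a hypothetical space-separated list of file names (assumed to contain no spaces themselves):

# sort the list first so that a mere reordering of the files does not change the hash
cat $(echo $FILES | tr ' ' '\n' | sort) | sha1sum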
But you can't have directories in your list of files.
If you just want to check if something in the folder changed, I'd recommend this one:
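Something along these lines (a sketch; --full-time assumes GNU ls):

ls -alR --full-time /folder/of/stuff | sha1sum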
It will just give you a hash of the ls output, which contains the folders, sub-folders, their files, and their timestamps, sizes and permissions: pretty much everything you would need to determine whether something has changed.
Please note that this command will not generate a hash for each file, but that is also why it should be faster than using find.
Another tool to achieve this:
http://md5deep.sourceforge.net/
As it sounds: like md5sum but also recursive, plus other features.
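A minimal invocation might look like this (a sketch; check the options of your md5deep build):

# -r enables recursive mode: one hash line per file under the folder
md5deep -r /folder/of/stuff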
I would pipe the results for the individual files through sort (to prevent a mere reordering of the files from changing the hash) and then into md5sum or sha1sum, whichever you choose. You could sha1sum each file to generate the list of hash values and then sha1sum that list again; it depends on what exactly you want to accomplish.
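A sketch of that approach, assuming GNU find and xargs; sorting the per-file hash lines keeps the final value independent of the order in which find returns them:

find /folder/of/stuff -type f -print0 | xargs -0 sha1sum | sort | sha1sum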