Linux: compute a single hash for a given folder & contents?

Posted 2019-01-30 04:01

Surely there must be a way to do this easily!

I've tried the Linux command-line tools such as sha1sum and md5sum, but they only seem able to compute hashes of individual files and output a list of hash values, one for each file.

I need to generate a single hash for the entire contents of a folder (not just the filenames).

I'd like to do something like

sha1sum /folder/of/stuff > singlehashvalue

Edit: to clarify, my files are at multiple levels in a directory tree; they're not all sitting in the same root folder.

Tags: linux bash hash
14 answers
甜甜的少女心
#2 · 2019-01-30 04:12

Try doing it in two steps:

  1. create a file with hashes for all files in a folder
  2. hash this file

Like so:

# for FILE in `find /folder/of/stuff -type f | sort`; do sha1sum "$FILE" >> hashes; done
# sha1sum hashes

Or do it all at once:

# cat `find /folder/of/stuff -type f | sort` | sha1sum
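
If the folder contains filenames with spaces or newlines, a null-terminated variant of the same idea should be safer (a minimal sketch, assuming GNU find, sort and xargs):

find /folder/of/stuff -type f -print0 | sort -z | xargs -0 sha1sum | sha1sum

This hashes the sorted list of per-file hashes (filenames included), so both a content change and a rename will change the final result.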
姐就是有狂的资本
#3 · 2019-01-30 04:14

If you just want to hash the contents of the files, ignoring the filenames, you can use

cat $FILES | md5sum

Make sure you have the files in the same order when computing the hash:

cat $(printf '%s\n' $FILES | sort) | md5sum

But you can't have directories in your list of files.
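
One way around both caveats is to let find build the list, so only regular files are included and the order is fixed; a whitespace-safe sketch (again assuming GNU find and sort):

find /folder/of/stuff -type f -print0 | sort -z | xargs -0 cat | md5sum

Only the file contents go into md5sum, so the names themselves are not hashed (although a rename that changes the sort order will still affect the result).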

Melony?
#4 · 2019-01-30 04:17

If you just want to check if something in the folder changed, I'd recommend this one:

ls -alR --full-time /folder/of/stuff | sha1sum

It will just give you a hash of the ls output, which lists the folders, sub-folders and their files along with their timestamps, sizes and permissions: pretty much everything you would need to determine whether something has changed.

Please note that this command does not generate a hash for each file's contents, which is also why it should be faster than a find-based approach.
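
A small usage sketch for change detection: store a baseline once, then re-run the same pipeline later and compare (baseline.sha1 is just an arbitrary file name used here for illustration):

ls -alR --full-time /folder/of/stuff | sha1sum > baseline.sha1
ls -alR --full-time /folder/of/stuff | sha1sum | diff - baseline.sha1 && echo unchanged

diff exits with status 0 when the two hash lines match, so the echo only fires if nothing in the listing has changed.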

干净又极端
#5 · 2019-01-30 04:17

Another tool to achieve this:

http://md5deep.sourceforge.net/

As it sounds: like md5sum but also recursive, plus other features.
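
For example, something along these lines should give a single value for a whole tree (assuming md5deep's -r option for recursion and -l for relative paths, so the result doesn't depend on where the folder lives):

md5deep -r -l /folder/of/stuff | sort | md5sum

Sorting the per-file output first keeps the final hash independent of the order in which md5deep walks the tree.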

Fickle 薄情
#6 · 2019-01-30 04:18

I would pipe the per-file results through sort (so that a mere reordering of the files doesn't change the hash) and then into md5sum or sha1sum, whichever you choose.
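
A sketch of that pipeline; note that the sort is applied to the sha1sum output lines, so the order in which find visits the files doesn't matter:

find /folder/of/stuff -type f -exec sha1sum {} + | sort | sha1sum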

【Aperson】
#7 · 2019-01-30 04:18

You could use sha1sum to generate the list of hash values and then sha1sum that list again; it depends on what exactly you want to accomplish.
