I have a directory with thousands of files (100K for now). When I use `wc -l ./*`, I get:
c1 ./test1.txt
c2 ./test2.txt
...
cn ./testn.txt
c1+c2+...+cn total
Because there are a lot of files in the directory, I just want to see the total count and not the details. Is there any way to do so?
I tried several ways, and I got the following error:
Argument list too long
The command below will provide the total count of lines from all the files in the path.
Credit: this builds on @lifecrisis's answer, and extends it to handle large numbers of files:
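The command itself did not survive in this copy of the answer; going by the description below, it was presumably the classic find/cat/wc pipeline (treat the exact flags as an assumption):

```shell
# Count every line in every regular file in the current directory.
# -exec cat {} + batches file names into argument lists as large as
# the OS allows, so it never fails with "Argument list too long".
find . -maxdepth 1 -type f -exec cat {} + | wc -l
```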
`find` will find all of the files in the current directory, break them into groups as large as can be passed as arguments, and run `cat` on the groups.

This will give you the total count for all the files (including hidden files) in your current directory:
To count lines for files excluding hidden files, use:
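Again the command is not preserved; one way to express it (an assumed reconstruction) is to filter out dotfiles in `find`:

```shell
# ! -name '.*' excludes hidden files (names starting with a dot).
find . -maxdepth 1 -type f ! -name '.*' -exec cat {} + | wc -l
```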
If what you want is the total number of lines and nothing else, then I would suggest the following command:
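The suggested command is not shown in this copy; from the description that follows, it was presumably:

```shell
# Concatenate every (non-hidden) file in the current directory and
# count the lines of the combined stream.
cat * | wc -l
```

Note that with around 100K files the shell expansion of `*` can exceed the argument-length limit, which is what the UPDATE below addresses.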
This concatenates the contents of all of the files in the current working directory and pipes the resulting blob of text through `wc -l`.

I find this to be quite elegant. Note that the command produces no extraneous output.
UPDATE:
I didn't realize your directory contained so many files. In light of this information, you should try this command:
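The command is missing from this copy; given the for-loop explanation that follows, it was presumably something like:

```shell
# The loop runs cat once per file, so no single command line ever
# holds all 100K names; the loop's combined output is piped to wc.
for file in *; do cat "$file"; done | wc -l
```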
Most people don't know that you can pipe the output of a `for` loop directly into another command.

Beware that this could be very slow. If you have 100,000 or so files, my guess would be around 10 minutes. This is a wild guess, because it depends on several parameters that I'm not able to check.
If you need something faster, you should write your own utility in C. You could make it surprisingly fast if you use pthreads.
Hope that helps.
LAST NOTE:
If you're interested in building a custom utility, I could help you code one up. It would be a good exercise, and others might find it useful.
It would be an interesting comparison to find out how many lines don’t end with a newline.
Combining the awk and Gordon’s find solutions and avoiding the “.” files.
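The command under discussion is not preserved; a sketch of what combining `awk` with `find` over `./*` at depth 0 could look like (the exact form is an assumption):

```shell
# ./* expands only to non-hidden entries; -maxdepth 0 tests just
# those starting points. awk's NR also counts a final line that lacks
# a trailing newline, which wc -l would miss.
find ./* -maxdepth 0 -type f -exec cat {} + | awk 'END { print NR }'
```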
No idea if this is better or worse but it does give a more accurate count (for me) and does not count lines in “.” files. Using ./* is just a guess that appears to work.
You still need to limit the depth, and ./* requires a depth of “0”.
I did get the same result with the “cat” and “awk” solutions (using the same find), since “cat *” takes care of the newline issue. I don’t have a directory with enough files to measure time. Interesting, I’m liking the “cat” solution.
If you want to know only the total number of lines in the directory (just the total, without the per-file counts):
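The command is not preserved in this copy; one assumed reconstruction sums the per-file `wc -l` output with `awk` so only the grand total is printed:

```shell
# wc -l prints one line per file plus "total" subtotal lines; summing
# the per-file counts ourselves stays correct even when find has to
# run wc several times for a huge file list. find prefixes names with
# "./", so a file literally named "total" cannot be confused with the
# subtotal lines.
find . -maxdepth 1 -type f -exec wc -l {} + |
  awk '$2 != "total" { s += $1 } END { print s }'
```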
The previous comment gives the total count of lines, which includes only the counts of lines in all the files.