I have many files in my directory. It is very difficult to open them one by one to see how many lines or how many columns each one has. I want to know if there is an automatic way to do it.
As an example, I create a txt file on my desktop and call it myfile:
check myfile Myname
FALSE 0 Q9Y383
FALSE 1 Q9Y383
FALSE 2 Q9Y383
FALSE 3 Q15366-2
FALSE 6 Q15366-2
FALSE 7 Q15366-2
I paste this in there, so I am sure I have 3 columns and 7 rows (when I open it as an xls file).
I tried to do it for one single file like
wc -l mytextfile
but it shows 0.
This is only one file; what if I have 1000 files?
Your file has ‘mac’ line endings, that is, lines separated by carriage-return rather than newline (which are ‘unix’ line endings), and it appears that wc can recognise only the latter. You have two options: convert your input files to ‘unix’ line endings once, or convert them on the fly.
For example
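(a sketch of a one-off conversion with tr, which replaces every carriage return with a newline; the file name myfile.txt is just an example)
tr '\r' '\n' < myfile.txt > myfile-unix.txt
wc -l myfile-unix.txt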
or
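(the same conversion done on the fly, piped straight into wc)
tr '\r' '\n' < myfile.txt | wc -l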
If you have lots of these files, then you could do something like
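(a sketch assuming the converted files all end in .txt; wc prints a count per file plus a total)
wc -l *.txt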
(if you've pre-converted the input files as above), or
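(a sketch that instead converts each file on the fly inside a loop)
for f in *.txt; do
  printf '%s: ' "$f"
  tr '\r' '\n' < "$f" | wc -l   # line count after converting CR to LF
done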
...or something along those lines.
Just use a for statement, and add more commands to the loop when you have other things to repeat.
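(A minimal sketch of such a loop, assuming the files end in .txt and the columns are whitespace-separated:)
for f in *.txt; do
  echo "$f"
  wc -l < "$f"                  # number of lines
  awk '{print NF; exit}' "$f"   # number of columns in the first line
done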
wc -l file
will show you the number of lines; assuming comma-separated values and no literal commas in the header,
IFS=, read -r -a cols <file && echo "${#cols[@]}"
will give you the number of columns (in the first line). All of these will work with wildcards. If you have 1000 files, then you can run:
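(a sketch of a loop over all the files, assuming they end in .txt)
for f in *.txt; do
  echo "== $f =="
  wc -l "$f"                                                   # line count
  IFS=, read -r -a cols <"$f" && echo "${#cols[@]} columns"    # columns in the first line
done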
...or...
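(since wc accepts multiple file names itself, a sketch that lets the wildcard do the looping for the line counts)
wc -l *.txt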
Note that in at least one other question, you had a text file with CR newlines rather than LF or CRLF newlines. For those, you'll want to use
IFS=, read -r -d $'\r' -a cols
instead. Similarly, if your text file format prevents wc -l from working correctly for that same reason, you might need the following much-less-efficient alternative:
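(a sketch of a pure-bash counter that treats each carriage return as a line terminator; the file name is hypothetical)
count=0
# read CR-terminated records; the || handles a final record with no trailing CR
while IFS= read -r -d $'\r' line || [[ -n $line ]]; do
  count=$((count + 1))
done < myfile.txt
echo "$count lines"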
Given the example file above, for a single file you can use awk:
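(A sketch of such an awk one-liner; myfile.txt is a hypothetical name. NR==1 captures the column count from the first line, and the END block prints the file name, line count, and column count:)
awk 'NR==1{cols=NF} END{print FILENAME, NR, cols}' myfile.txt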
If you have gawk you can handle multiple (*.ext) files easily:
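(A sketch using gawk's ENDFILE block; the *.txt pattern is just an example:)
gawk 'FNR==1{cols=NF} ENDFILE{print FILENAME, FNR, cols}' *.txt
For the example file above, this would print a line like:
myfile.txt 7 3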
Edit
If you have ancient Mac files (where the newlines are not some form of \n) you can do:
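(a sketch that sets awk's record separator to a carriage return; myfile.txt is again hypothetical)
awk -v RS='\r' 'NR==1{cols=NF} END{print FILENAME, NR, cols}' myfile.txt
Or, convert the line endings first and reuse the single-file command from above:
tr '\r' '\n' < myfile.txt | awk 'NR==1{cols=NF} END{print NR, cols}'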