I am looking to get accurate measurements of sparse files in Java, i.e. the real size on disk and not the nominal size that includes all of the zeros.
In C++ on Windows one would use GetCompressedFileSize, but I have yet to come across how one would go about doing that in Java.
If there isn't a direct equivalent, how would I go about measuring the data within a sparse file, as opposed to the size including all of the zeros?
For clarification, I am looking to run the sparse file measurements both on Linux and on Windows; however, I don't mind coding two separate applications!
If you are doing it on Windows alone, you can write it with the Java Native Interface (JNI) and do the actual work in a C/C++ file:
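Something along these lines (a rough sketch; the Java class name SparseFileSize and the method name getSizeOnDisk are placeholders, not names from the original answer):

```cpp
// Hypothetical JNI implementation for a Java class declared roughly as:
//   public class SparseFileSize {
//       static { System.loadLibrary("sparsesize"); }
//       public static native long getSizeOnDisk(String path);
//   }
#include <jni.h>
#include <windows.h>
#include <string>

extern "C" JNIEXPORT jlong JNICALL
Java_SparseFileSize_getSizeOnDisk(JNIEnv* env, jclass, jstring path)
{
    // Copy the Java string into a wide C++ string for the Win32 API
    // (GetStringChars does not guarantee null termination).
    const jchar* chars = env->GetStringChars(path, nullptr);
    std::wstring wpath(reinterpret_cast<const wchar_t*>(chars),
                       env->GetStringLength(path));
    env->ReleaseStringChars(path, chars);

    // GetCompressedFileSize returns the bytes actually allocated on disk,
    // which for sparse (and compressed) files is less than the nominal size.
    DWORD high = 0;
    DWORD low = GetCompressedFileSizeW(wpath.c_str(), &high);
    if (low == INVALID_FILE_SIZE && GetLastError() != NO_ERROR) {
        return -1; // error
    }
    return (static_cast<jlong>(high) << 32) | low;
}
```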
Since an answer was given for Windows, I will try to supply one for Linux. I am not sure, but I think this will do the trick (C++):
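A sketch of the Linux side, using stat() and the st_blocks field (which counts 512-byte blocks), with the same placeholder class and method names as above:

```cpp
#include <jni.h>
#include <sys/stat.h>

extern "C" JNIEXPORT jlong JNICALL
Java_SparseFileSize_getSizeOnDisk(JNIEnv* env, jclass, jstring path)
{
    const char* cpath = env->GetStringUTFChars(path, nullptr);

    struct stat st;
    int rc = stat(cpath, &st);
    env->ReleaseStringUTFChars(path, cpath);

    if (rc != 0) {
        return -1; // stat failed
    }
    // st_blocks is the number of 512-byte blocks actually allocated,
    // so this is the size on disk rather than the apparent size (st_size).
    return static_cast<jlong>(st.st_blocks) * 512;
}
```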
This can be loaded in the same way as described in @Aniket's answer (JNI).
If you want a pure Java solution, you can try jnr-posix. Here's an example implementation:
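A minimal sketch of such an implementation (assuming the jnr-posix dependency is on the classpath; exact API details may vary by version):

```java
import jnr.posix.FileStat;
import jnr.posix.POSIX;
import jnr.posix.POSIXFactory;

public class SparseFileSize {

    /** Returns the size on disk of the given file, in bytes. */
    public static long getSizeOnDisk(String path) {
        POSIX posix = POSIXFactory.getPOSIX();
        FileStat stat = posix.stat(path);
        // blocks() reports the 512-byte blocks actually allocated to the file,
        // so multiplying by 512 gives the on-disk size rather than the
        // apparent size (stat.st_size()).
        return stat.blocks() * 512L;
    }

    public static void main(String[] args) {
        System.out.println(getSizeOnDisk(args[0]) + " bytes on disk");
    }
}
```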
However, this currently won't work on Windows. Until that's fixed you have to use platform-specific code like the sketch below.
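One way to handle the Windows side in Java (a sketch only, assuming the JNA library is available; this is not necessarily the approach the original answer used) is to map GetCompressedFileSize yourself:

```java
import com.sun.jna.Native;
import com.sun.jna.WString;
import com.sun.jna.ptr.IntByReference;
import com.sun.jna.win32.StdCallLibrary;

public class WindowsSizeOnDisk {

    /** Hand-rolled JNA binding for the Win32 call. */
    public interface Kernel32Ex extends StdCallLibrary {
        Kernel32Ex INSTANCE = Native.load("kernel32", Kernel32Ex.class);

        // DWORD GetCompressedFileSizeW(LPCWSTR lpFileName, LPDWORD lpFileSizeHigh)
        int GetCompressedFileSizeW(WString fileName, IntByReference fileSizeHigh);
    }

    public static long getSizeOnDisk(String path) {
        IntByReference high = new IntByReference();
        int low = Kernel32Ex.INSTANCE.GetCompressedFileSizeW(new WString(path), high);
        // A real implementation should also check for INVALID_FILE_SIZE
        // together with GetLastError() before trusting the result.
        // Combine the two unsigned 32-bit halves into a 64-bit size.
        return ((long) high.getValue() << 32) | (low & 0xFFFFFFFFL);
    }
}
```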
On Linux use one of the following:

- the `stat64` system call
- the `stat` command. The number of allocated blocks can be seen in the `Blocks` field, or printed with the `%b` format specifier
- the `du` command (without the `--apparent-size` option)

On Windows you can call the `GetCompressedFileSize` API.

Alternatively, you can also run `fsutil file layout` with admin rights to get detailed information about a file. Find the `$DATA` stream:

- If you see "Resident | No clusters allocated" in the flags, then it's a resident file and the size on disk would be 0.
- If you don't see the resident flag, then check the Allocated Size field; it's the file's size on disk.
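To illustrate the Linux commands above (assuming GNU coreutils):

```sh
# Create a 1 GiB sparse file: the apparent size is 1 GiB,
# but almost nothing is actually allocated on disk.
truncate -s 1G sparse.bin

# %s is the apparent size in bytes, %b the number of allocated 512-byte blocks.
stat --format='apparent: %s bytes, allocated blocks: %b' sparse.bin

# du reports the size on disk by default; --apparent-size shows the nominal size.
du -h sparse.bin
du -h --apparent-size sparse.bin
```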
For more information you can read the questions below.