I need to compress a large file of about 17-20 GB and split it into several files of around 1 GB each. I searched for a solution via Google and found ways using the split and cat commands, but they did not work for large files at all. Also, they won't work on Windows; I need to extract the file on a Windows machine.
Tested code that first creates a single archive file, then splits it:
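Something along these lines (the file and directory names are placeholders; the 1G size suffix assumes GNU split):

```bash
# 1. create a single compressed archive of the directory
tar czf archive.tar.gz BIG_DIRECTORY

# 2. cut it into 1 GiB pieces: archive.tar.gz.part_aa, _ab, ...
split -b 1G archive.tar.gz archive.tar.gz.part_

# later: put the pieces back together and extract
cat archive.tar.gz.part_* > archive.tar.gz
tar xzf archive.tar.gz
```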
This variant omits creating a single archive file and goes straight to creating parts:
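A sketch of that variant, assuming GNU dd and the same placeholder names; tar writes the archive to stdout and a small loop cuts the stream into $CHUNKSIZE-byte parts:

```bash
#!/bin/bash
CHUNKSIZE=$((1024 * 1024 * 1024))   # 1 GiB per part; placeholder value

tar czf - BIG_DIRECTORY |
(
    i=0
    while true; do
        i=$((i + 1))
        part=$(printf 'archive.tar.gz.part_%03d' "$i")
        # GNU dd: iflag=fullblock keeps reading until the chunk is full or the stream ends
        dd of="$part" bs=1M count=$((CHUNKSIZE / 1024 / 1024)) iflag=fullblock status=none
        # stop once a part comes out smaller than a full chunk
        [ "$(wc -c < "$part")" -eq "$CHUNKSIZE" ] || break
    done
)
```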
In this variant, if the archive's file size is divisible by $CHUNKSIZE, the last partial file will be 0 bytes.

You can use the split command with the -b option:
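For example (GNU split, placeholder names; the last argument is the explicit prefix mentioned in the edit below):

```bash
# cut the existing archive into 1 GiB pieces: archive.tar.gz.part_aa, _ab, ...
split -b 1G archive.tar.gz archive.tar.gz.part_

# on Linux, cat puts them back together
cat archive.tar.gz.part_* > archive.tar.gz
```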
It can be reassembled on a Windows machine using @Joshua's answer.
Edit: As @Charlie stated in the comment below, you might want to set a prefix explicitly because split will use x otherwise, which can be confusing.

Edit: Adding this here because the question is closed and the most effective solution is very close to the content of this answer:
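Presumably something along these lines (placeholder names, GNU split): compression and splitting run in one pipeline, reassembly and extraction in another:

```bash
# pack and split without writing an intermediate archive
tar czf - BIG_DIRECTORY | split -b 1G - archive.tar.gz.part_

# reassemble and extract, again without an intermediate file
cat archive.tar.gz.part_* | tar xzf -
```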
This solution avoids the need for an intermediate large file when compressing and decompressing. Use the tar -C option to extract the resulting files into a different directory. By the way, if the archive consists of only a single file, tar could be skipped and only gzip used:
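A sketch of the gzip-only variant for a single input file (names are placeholders):

```bash
# compress and split a single file, no tar involved
gzip -c BIG_FILE | split -b 1G - BIG_FILE.gz.part_

# reassemble and decompress
cat BIG_FILE.gz.part_* | gunzip -c > BIG_FILE
```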
For Windows, you can download ported versions of the same commands or use Cygwin.
If you are splitting from Linux, you can still reassemble in Windows.
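Presumably this means binary concatenation in a Command Prompt, something like the line below (part names are placeholders); any Windows tool that handles tar/gzip can then extract the result:

```
copy /b archive.tar.gz.part_aa + archive.tar.gz.part_ab + archive.tar.gz.part_ac archive.tar.gz
```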
Use tar to split into multiple archives:
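A sketch using GNU tar's multi-volume mode (placeholder names; note that multi-volume archives cannot be compressed with -z, so compress the parts separately if needed):

```bash
# -M: multi-volume, -L: volume length in units of 1024 bytes (1048576 KiB = 1 GiB)
# tar prompts for another volume name if the listed files fill up
tar -cM -L 1048576 -f part1.tar -f part2.tar -f part3.tar BIG_DIRECTORY

# extract by listing the volumes in the same order
tar -xM -f part1.tar -f part2.tar -f part3.tar
```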
There are plenty of programs that will work with tar files on Windows, including Cygwin.