I'm using wget to download website content, but wget downloads the files one by one.
How can I make wget download using 4 simultaneous connections?
Use xargs to make wget work on multiple files in parallel.
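A minimal sketch of this approach, assuming the URLs are listed one per line in a file named urls.txt (the filename and the process count of 4 are illustrative):

```shell
# Run up to 4 wget processes at once, each fetching one URL from urls.txt.
xargs -n 1 -P 4 wget -q < urls.txt
```

-P 4 caps the number of parallel wget processes, and -n 1 passes one URL per invocation.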
Aria2 options: the right way to work with files smaller than 20 MB.
-k 2M splits the file into 2 MB chunks.

-k (or --min-split-size) has a default value of 20 MB. If you do not set this option and the file is under 20 MB, aria2 will use only a single connection, no matter what value you give -x or -s.
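Putting these options together, a sketch (the URL is a placeholder):

```shell
# 4 connections (-x) and 4 splits (-s), with a 2 MB minimum split size (-k)
# so that files smaller than 20 MB are still split across connections.
aria2c -x 4 -s 4 -k 2M http://example.com/file.zip
```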
Wget does not support multiple socket connections to speed up downloads of files.

I think we can do a bit better than gmarian's answer. The correct way is to use aria2.

Since GNU parallel was not mentioned yet, let me give another way:
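For example, assuming urls.txt holds one URL per line (the filename and the job count of 4 are illustrative):

```shell
# Run at most 4 wget jobs simultaneously, one URL each.
cat urls.txt | parallel -j 4 wget -q {}
```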
A new (but not yet released) tool is Mget. It already has many options known from Wget and comes with a library that allows you to easily embed (recursive) downloading into your own application.
To answer your question:
mget --num-threads=4 [url]
UPDATE
Mget is now developed as Wget2 with many bugs fixed and more features (e.g. HTTP/2 support).
--num-threads is now --max-threads.
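So the Wget2 equivalent of the command above would be (the URL is a placeholder):

```shell
# Wget2 with 4 download threads
wget2 --max-threads=4 https://example.com/
```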
As other posters have mentioned, I'd suggest you have a look at aria2. From the Ubuntu man page for version 1.16.1:
You can use the -x flag to specify the maximum number of connections per server (default: 1).

If the same file is available from multiple locations, you can choose to download from all of them. Use the -j flag to specify the maximum number of parallel downloads for every static URI (default: 5).

Have a look at http://aria2.sourceforge.net/ for more information. For usage information, the man page is really descriptive and has a section at the bottom with usage examples. An online version can be found at http://aria2.sourceforge.net/manual/en/html/README.html.
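For example (the URLs and filename are placeholders), -x raises the per-server connection count for a single file, while -i with -j downloads several files in parallel:

```shell
# 16 connections to one server for a single file:
aria2c -x 16 http://example.com/big.iso

# Download the files listed in urls.txt, at most 3 at a time:
aria2c -j 3 -i urls.txt
```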
Another program that can do this is axel (see its Ubuntu man page).
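For example (the URL is a placeholder), axel's -n flag sets the number of simultaneous connections:

```shell
# Download with 4 simultaneous connections
axel -n 4 http://example.com/file.iso
```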