My internet speed to GitHub has never been fast, and it is currently lingering at 50 KB/s (my connection is 20 Mbit/s, which is not very fast, but still much faster than this). The repository is multi-GB by my estimate, so it will take a very long time.
Does git support downloading the objects using multiple threads so I can max out my internet speed?
I have seen similar behavior under Windows, where the cause was an anti-virus configured very aggressively while git was transferring a lot of small files. Dual-booting into Linux gave full speed for git clones.
If you have the resources, consider creating a test Windows installation (for instance virtually, using VirtualBox or VMware Player) where you install a pristine Windows from scratch and add only the necessary git software. You can then see whether the problem is "inside" or "outside" git. My personal guess is that this test installation will be fast.
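One way to compare the two environments is simply to time a fresh clone in each, using a shell that provides time (e.g. Git Bash on Windows); the repository URL below is a placeholder:

    # run in both environments and compare wall-clock times
    time git clone https://github.com/user/repo.git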
git clone --jobs
This might help if you have multiple submodules:
Added in v2.9.0 (March 2016) with commit 72290d6.
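A hypothetical invocation (the repository URL and job count are placeholders):

    # clone the superproject and its submodules,
    # fetching up to 4 submodules in parallel
    git clone --recurse-submodules --jobs 4 https://github.com/user/repo.git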
I wonder if this will also help if you have two submodules on a single server. TODO: benchmark on GitHub.
It definitely should help if you have submodules on different servers.
I wonder if Git is smart enough to spread the fetches across different servers as much as possible at any given time, instead of possibly assigning all jobs to a single server at once.
You can at least try to mitigate the issue with a shallow clone (meaning not cloning the full history):
Make sure to have Git 1.9+, as I explained in "Is git clone --depth 1 (shallow clone) more useful than it makes out?".

Note: Git 2.5 (Q2 2015) even supports fetching a single commit!
See "Pull a specific commit from a remote git repository".
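A minimal sketch of the shallow approach; the repository URL is a placeholder:

    # clone only the most recent commit instead of the full history
    git clone --depth 1 https://github.com/user/repo.git
    cd repo

    # later, if the full history turns out to be needed after all
    git fetch --unshallow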
Try raising the buffer git's smart HTTP transport uses when POSTing data:

    git config --global http.postBuffer 524288000

(524288000 bytes is 500 MiB.)
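To read the setting back and confirm it was written:

    git config --global --get http.postBuffer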