I have a large Subversion repository with nearly 15 GB of data spread across ~500,000 files. I now need to check out this repository to a remote host, which would take days to complete.
The host I'm checking out to already has a complete copy of the data in the repository. But seeing as the files weren't checked out directly from the repository, they do not constitute a working copy (no ".svn" folders).
I'd like to avoid copying all this data across the network, especially when it already exists on the target host. Is there a trick I can use that will turn a preexisting directory into a working copy without replacing the local files with identical copies from the repository?
svn co --force https://PATH/TO/REPO/ .
where the "." at the end assumes you are already inside the directory which you want to turn into a working SVN copy. For instance, if you wanted to make your "public_html" directory a working SVN copy of a repository:

cd /home/username/public_html
svn co --force https://PATH/TO/REPO/ .
I had a working copy on my local computer that got all of its .svn folders deleted when Eclipse crashed.
The only way I was able to reconnect it to the remote SVN repository was to follow the steps from a blog I found (Recovering a broken Subversion working copy).
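The usual recovery technique boils down to checking out a pristine working copy somewhere else and copying the .svn metadata back into the broken tree. A rough sketch (not necessarily the blog's exact steps; it assumes Subversion 1.7+ with a single top-level .svn folder, and the paths are made up):

svn co https://PATH/TO/REPO/ /tmp/fresh-wc          # pristine checkout in a temporary location
cp -R /tmp/fresh-wc/.svn /path/to/broken-wc/        # put the metadata folder back into the broken tree
cd /path/to/broken-wc && svn status                 # files that match the repository should show as unmodified

With pre-1.7 clients every subdirectory has its own .svn folder, so you would have to copy each of them back instead of just the top-level one.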
None of the native svn commands will checksum existing files and skip downloading the ones that already match.
What protocol are you using to access the repository? If it is https, that may be your problem. Try the native svn protocol (using svnserve), or svn+ssh. Or perhaps even do a checkout via a file:// URL on the server hosting the svn repo, and then use rsync to transfer across the network.
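Roughly like this (a sketch; the hostname and paths are made up, and it assumes you have shell access on the server hosting the repository):

# on the server hosting the repository: checkout locally via file://
svn co file:///var/svn/repo /tmp/wc-staging
# then push the finished working copy (including the .svn metadata) to the remote host
rsync -a /tmp/wc-staging/ remotehost:/path/to/working-copy/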
So long as you are not paying for the bandwidth per byte, it might just make sense to let "svn co --force" run under nice (or START /LOW on Windows) and not waste your own time. It will not make anything on the local filesystem unavailable during the checkout process.
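For example, on a Unix-like host (the URL is a placeholder):

nice -n 19 svn co --force https://PATH/TO/REPO/ .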
Finally, I can't figure out why your checkout is so slow... we have repositories with 500K files that check out in ~6 minutes over https on a gigabit LAN. Granted, all the files are much smaller (1 GB total). How far away from the server are you in terms of latency?
There is the relocate command: http://svnbook.red-bean.com/en/1.1/re27.html
EDIT: If the local files aren't linked to a repository, then you could create a local repository, import the files into it, and then use the relocate command.
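For what it's worth, here is roughly what that would involve (a sketch with made-up paths; pre-1.7 clients spell the command "svn switch --relocate", 1.7+ has "svn relocate"). Note the caveat in the comments: relocate only rewrites the URL stored in the working-copy metadata and checks that the repository UUIDs match, so in practice this only works if the local repository is a real mirror of the remote one.

svnadmin create /path/to/local-repo                         # brand-new local repository
svn import . file:///path/to/local-repo -m "initial import"
svn co --force file:///path/to/local-repo .                 # turns this directory into a working copy
# relocate will refuse to run if the repository UUIDs differ,
# so the local repository would need to be a mirror of the remote one
svn switch --relocate file:///path/to/local-repo https://PATH/TO/REPO/ .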
Alternatively, if you have physical access to both machines, you can check out the repository locally and then copy the files to the remote machine via an external HD.
I don't believe there's a solution without transferring those 15 GB to the destination. It would probably be faster and easier to copy the repository there and do a local checkout.
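If you go that route, it might look something like this (a sketch; paths are placeholders, and it assumes you have admin access to the repository; "svnadmin hotcopy" or svnsync would work as well):

# on the server: dump the repository
svnadmin dump /var/svn/repo > repo.dump
# transfer repo.dump to the remote host, then there:
svnadmin create /path/to/local-mirror
svnadmin load /path/to/local-mirror < repo.dump
svn co file:///path/to/local-mirror .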
You could try using rsync if you already have a working copy under svn control somewhere else on the network.
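Something like this (a sketch; the host and paths are hypothetical, the trailing slashes matter, and the .svn metadata gets copied along with the files):

rsync -a otherhost:/path/to/existing-wc/ /path/to/new-wc/
cd /path/to/new-wc && svn status    # should now behave as a normal working copy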