I absolutely love the Keep Remote Directory Up-to-date feature in WinSCP. Unfortunately, I can't find anything as simple to use in OS X or Linux. I know the same thing can theoretically be accomplished using changedfiles or rsync, but I've always found the tutorials for both tools to be lacking and/or contradictory.
I basically just need a tool that works in OS X or Linux and keeps a remote directory in sync (mirrored) with a local directory while I make changes to the local directory.
Update
Looking through the solutions, I see a couple which solve the general problem of keeping a remote directory in sync with a local directory manually. I know that I can set a cron task to run rsync every minute, and this should be fairly close to real time.
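Something like this, just as a sketch (the paths and host below are placeholders):

    # crontab -e entry: run rsync every minute, mirroring the local directory to the server
    * * * * * rsync -az --delete /path/to/local/dir/ user@remote.example.com:/path/to/remote/dir/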
This is not the exact solution I was looking for, as WinSCP does this and more: it detects file changes in a directory (while I work on them) and then automatically pushes the changes to the remote server. I know this is not the best solution (no code repository), but it allows me to very quickly test code on a server while I develop it. Does anyone know how to combine rsync with any other commands to get this functionality?
I'm using a little Ruby script for this. Adapt it to your needs!
Building off of icco's suggestion of SVN, I'd actually suggest that if you are using Subversion or similar for source control (and if you aren't, you should probably start), you can keep the production environment up to date by putting the command to update the deployed working copy into the post-commit hook.
There are a lot of variables in how you'd want to do that, but what I've seen work is to have the development or live site be a working copy, and then have the post-commit hook use an SSH key with a forced command to log into the remote site and trigger an svn up on that working copy. Alternatively, in the post-commit hook you could trigger an svn export on the remote machine, or a local (to the svn repository) svn export followed by an rsync to the remote machine.
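A rough sketch of the first variant (the hook path, key, user, host and working-copy location are all placeholders):

    #!/bin/sh
    # Subversion post-commit hook (REPO/hooks/post-commit)
    REPOS="$1"   # repository path, passed in by Subversion
    REV="$2"     # committed revision, passed in by Subversion

    # Log into the live server with a dedicated key and update its working copy.
    # The matching authorized_keys entry on the server can force exactly this command.
    ssh -i /path/to/deploy_key deploy@live.example.com "svn up /var/www/site"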
I would be worried about things that detect changes and push them, and I'd even be worried about things that ran every minute, just because of race conditions. How do you know it's not going to transfer the file at the very same instant it's being written to? Stumble across that once or twice and you'll lose all of the time-saving advantage you had by constantly rsyncing or similar.
Great question, I have searched for an answer for hours!
I have tested lsyncd, and the problem is that the default delay is far too long and no example command line shows the -delay option. The other problem is that by default rsync asks for the password each time!
Solution with lsyncd:
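For example, something like this (a sketch; the source directory, host and target directory are placeholders, and the option syntax can vary between lsyncd versions):

    # Watch /path/to/local/dir and push changes over SSH with a 1-second delay
    lsyncd -nodaemon -delay 1 -rsyncssh /path/to/local/dir remote.example.com /path/to/remote/dir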
The other way is to use inotifywait in a script:
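Something along these lines (again a sketch with placeholder paths and host):

    #!/bin/sh
    # Block until something changes under the local directory, then mirror it to the server
    while inotifywait -r -e modify,create,delete,move /path/to/local/dir; do
        rsync -az --delete /path/to/local/dir/ user@remote.example.com:/path/to/remote/dir/
    done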
For this second solution you will have to install the inotify-tools package.
To avoid having to enter the password on every change, simply set up key-based SSH authentication with ssh-keygen (see https://superuser.com/a/555800/510714).
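In short (user and host are placeholders; ssh-copy-id may be missing on older OS X, in which case append the public key to ~/.ssh/authorized_keys on the server by hand):

    ssh-keygen                            # generate a key pair; an empty passphrase allows unattended syncs
    ssh-copy-id user@remote.example.com   # install the public key on the server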
If you are developing Python on a remote server, PyCharm may be a good choice for you. You can synchronize your remote files with your local files using PyCharm's remote development feature. The guide is here: https://www.jetbrains.com/help/pycharm/creating-a-remote-server-configuration.html
Use watcher.py and rsync to automate this. Read the step-by-step instructions here:
http://kushellig.de/linux-file-auto-sync-directories/
Will Dropbox (http://www.getdropbox.com/) do what you want?