FTP - Only want to keep latest 10 files - delete the oldest

Posted 2019-08-25 02:06

Question:

I have created a shell script to back up my web files plus a database dump, put them into a tar archive, and FTP it offsite. I'd like to run it X times per week, but I only want to keep the latest 10 backups on the FTP site.
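For reference, here is a minimal sketch of the kind of script in question (the database name, paths, host, and credentials below are all hypothetical placeholders):

#!/bin/sh
# Hypothetical sketch: dump the database, tar it up with the web files,
# then push the archive offsite using curl's FTP upload support.
stamp=$(date +%Y%m%d-%H%M)
mysqldump mydb > /tmp/mydb.sql
tar czf "/tmp/backup-$stamp.tgz" /var/www /tmp/mydb.sql
curl -T "/tmp/backup-$stamp.tgz" "ftp://user:pass@ftp.example.com/backups/"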

How can I best do this? Should I be doing this work on the shell-script side, or is there an FTP command to check last-modified times and manage things that way?

Any advice would be appreciated.

Thanks,

Answer 1:

One way to do something like this would be to use the day of the week in the filename:

  • backup-mon.tgz
  • backup-tue.tgz
  • etc.

Then, when you back up, you would delete or overwrite the backup file for the current day of the week.

(Of course, this way you only keep the latest 7 files, but it's a pretty simple method.)
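A minimal sketch of this scheme, assuming curl for the upload and placeholder paths and credentials; re-uploading under the same name each week overwrites last week's copy:

# Hypothetical: name the archive after the current weekday, e.g. backup-mon.tgz
day=$(date +%a | tr '[:upper:]' '[:lower:]')
cp /tmp/backup.tgz "backup-$day.tgz"
curl -T "backup-$day.tgz" "ftp://user:pass@ftp.example.com/"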



Answer 2:

Do you have shell access to the FTP server? If so, you could write a script to do this and schedule a cron job to do the periodic clean-up.

Here's something that ought to work:

num_files_to_keep=10
i=0

# List files newest first; delete everything past the first $num_files_to_keep.
for file in $(ls -t); do
    if [ "$i" -ge "$num_files_to_keep" ]; then
        rm -- "$file"
    fi
    i=$((i + 1))
done
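If cron is available on the server, the clean-up could then be scheduled along these lines (the backup directory and script path are hypothetical placeholders):

# Run the clean-up nightly at 03:15 from the backup directory (add via crontab -e).
15 3 * * * cd /srv/backups && /usr/local/bin/cleanup-backups.sh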


Answer 3:

find . \! -newer `ls -t|head -n 10|tail -n 1`

Unfortunately, if this is executed when there are 10 or fewer files, it deletes the oldest file on every execution (because ! -newer tests for "older or equal" instead of "strictly older"). This can be remedied by checking the file count first:

[ `ls|wc -l` -gt 10 ] && find . \! -newer `ls -t|head -n 10|tail -n 1`
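For completeness, here is a sketch that actually deletes the matches rather than just listing them. Since ! -newer also matches the reference file itself, referencing the 11th-newest file keeps exactly the latest 10; -maxdepth 1, -type f, and -delete (GNU find) are assumptions added here, and the directory is assumed to contain only the backups:

# Sketch: delete the 11th-newest file and everything older, only when
# more than 10 files exist (so the 11th-newest is guaranteed to exist).
if [ "$(ls | wc -l)" -gt 10 ]; then
    find . -maxdepth 1 -type f \! -newer "$(ls -t | head -n 11 | tail -n 1)" -delete
fi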


Answer 4:

If you're going down the shell path, try this:

find /your-path/ -mtime +7 -type f -exec rm -f {} \;

This would delete everything older than a certain age (in this case, 7 days). Whether that suits you depends on whether you need to keep multiple backups from a single day; e.g., yesterday I made ten revisions of the same website.
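If backups run once per day, an age cutoff roughly approximates a count cutoff: files more than 9 days old are approximately the 11th-newest and beyond, so something like this keeps about the latest 10 (same placeholder path as above; -maxdepth 1 is an added assumption):

# Assuming one backup per day, keep roughly the 10 newest archives.
find /your-path/ -maxdepth 1 -type f -mtime +9 -exec rm -f {} \;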