I am working on a script that creates files (basically replicas of some tables) and FTPs those files to a remote machine. There is one more requirement: delete the files that are 3 days old on the remote machine before dumping the new files there.
I need help writing the ksh for deleting 3-day-old files on a remote machine using ftp.
Normally, you would use:
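Something along these lines (the directory is a placeholder; as the update below urges, dry-run with `echo` in place of `rm` first):

```shell
# Remove regular files older than 3 days under $dir.
# $dir is a placeholder -- point it at your real directory.
# Dry-run first: replace "rm -f" with "echo" to see what would be deleted.
dir=/path/to/files
find "$dir" -type f -mtime +3 -exec rm -f {} \;
```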
or something similar (i.e., there may be other limiting clauses like `-type f` for regular files, or `-maxdepth 0` to do the current directory only, no subdirectories). The `-mtime +3` only gets those files whose modification date is 3 days ago or more. Execute `man find` on your system for full details. Whether Solaris has the same features as GNU `find`, I don't know; it may be more limited (or better).

Update: please, in the name of whatever gods you worship, test the command first with `echo` instead of `rm`. I take no responsibility for the destruction of your files if you trust the advice of "some random guy on the net who may or may not have your best interests at heart" :-)

And, before anyone jumps in and berates me for not using `xargs` (or, better yet, `find -print0` with `xargs -0` where available): I know, but it's not relevant to the specific question at hand. The OP can ask another question if and when the performance of `find -exec` becomes a problem.

If you have a specific file format with the date in it (as you indicate in your comment), you can actually use `mdel` under `ftp`. Other than a slight annoyance where you may have up to six days of files left there on month boundaries, this approach does what you need; you could of course make the month-boundary handling a little more intelligent if that's really important. Consider the following script:
Just run this script on your box each day and it will clear out files on the target box using standard `ftp` tooling. I still think it's easier to run a `find` on the server box, but I'll present this option in case that avenue is not available.