I have some functions that use curl to pull information from a couple of sites and insert it into my database. I was just wondering what is the best way to go about executing this task every 24 hours?
I am running on Windows now, but will probably switch to Linux once I go live (if that makes a difference). I am working inside the Symfony framework.
I hear cron jobs can do this... but looking at one site it seems to work remotely, and I would rather just keep things in house. Can I just "run a service" on my computer? Whatever that means ;) (I have heard the term used.)
thanks for any help,
Andrew
This is exactly what cron (Linux) or Scheduled Tasks (Windows) are for.
You can run them on your application server to keep everything in one place.
For example, I have a cron job running on my home server to back up its MySQL databases every day. Only one system is involved in this process.
Adding

```
0 0 * * * php /path/to/your/cronjob.php
```

to your crontab should accomplish this. (The five fields are minute, hour, day of month, month, and day of week, so `0 0 * * *` fires at midnight every day.)
You can set up a scheduled task in cron (or in the Windows Task Scheduler). The easiest way is to create a shell script (a batch script on Windows) that executes the PHP script from the command line; this way you don't tie up web server resources. Of course, you run the script on the target machine.
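A minimal sketch of such a wrapper script. The paths (`PHP_BIN`, `SCRIPT`, `LOG`) are placeholder assumptions, not anything from your setup — edit them, or override them via the environment:

```shell
#!/bin/sh
# Wrapper that cron (or, as a .bat equivalent, Task Scheduler) can call.
# PHP_BIN, SCRIPT, and LOG are placeholders; adjust for your setup.
PHP_BIN="${PHP_BIN:-php}"                # the PHP CLI binary
SCRIPT="${SCRIPT:-/path/to/fetch.php}"   # your curl-and-insert script
LOG="${LOG:-fetch.log}"

# Timestamp every run so failures are easy to spot in the log.
echo "--- run at $(date) ---" >> "$LOG"

if command -v "$PHP_BIN" >/dev/null 2>&1; then
    "$PHP_BIN" "$SCRIPT" >> "$LOG" 2>&1 \
        || echo "run failed with status $?" >> "$LOG"
else
    echo "php binary not found: $PHP_BIN" >> "$LOG"
fi
```

You would then point a crontab entry at the wrapper, e.g. `0 0 * * * /path/to/wrapper.sh`.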
If for whatever reason you decide that cron or the Windows scheduler is not appropriate, I've sometimes found it handy to write a quick Java app that does the same thing: you can call Runtime.getRuntime().exec("cmd line stuff here"), wrap that operation in a TimerTask, and finally fire up a Timer object by adding TimerTasks and specifying the times and frequency.
This is clearly more complicated than the examples mentioned above, but I like it because you can throw in some intelligent error handling and send yourself email alerts or the like when something screws up.
Probably overkill, but possibly worth looking at if you ever have several such operations to deal with.
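A rough sketch of that Timer/TimerTask approach. The class name, the `start` helper, and the `php /path/to/cronjob.php` command are all illustrative placeholders, not a definitive implementation:

```java
import java.io.IOException;
import java.util.Timer;
import java.util.TimerTask;

// Sketch of the Timer/TimerTask approach described above.
public class DailyFetch {

    static final long ONE_DAY_MS = 24L * 60 * 60 * 1000;

    // Exit status of the most recent run (useful for alerting); volatile
    // because the timer fires tasks on its own background thread.
    static volatile int lastStatus = Integer.MIN_VALUE;

    /** Run `command` now, then again every periodMs milliseconds. */
    public static Timer start(final String[] command, long periodMs) {
        Timer timer = new Timer("daily-fetch", true); // daemon thread
        timer.scheduleAtFixedRate(new TimerTask() {
            @Override public void run() {
                try {
                    // Note: it's Runtime.getRuntime().exec(...).
                    Process p = Runtime.getRuntime().exec(command);
                    lastStatus = p.waitFor();
                    if (lastStatus != 0) {
                        // This is where you could email yourself an alert.
                        System.err.println("job exited with " + lastStatus);
                    }
                } catch (IOException | InterruptedException e) {
                    System.err.println("job failed: " + e);
                }
            }
        }, 0, periodMs);
        return timer;
    }
}
```

In your entry point you would call something like `DailyFetch.start(new String[] { "php", "/path/to/cronjob.php" }, DailyFetch.ONE_DAY_MS)` (placeholder command) and keep the JVM alive so the timer keeps firing.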
sweeney