I have a batch Flash upload script that uploads video files to a directory. Simple. After the upload completes, it creates a MySQL record for that file and moves on to the next file in the queue.

Just before it does that, I want it to invoke a background process that converts the uploaded .avi file to an iPod-compatible .mp4 and generates some preview thumbnails. As you can imagine, that takes some time. I could simply put the conversion code in the file uploader, but then it would hang for a good 10-20 minutes on every file, which is a no-no (even though it's an admin-only function).
So I want to fork the conversion process into the background and move on to the next upload while the file converts.

Would something like this do the job, or will I actually have to use the PHP fork functions?
exec("/usr/bin/php ./convert.php?id=123 > /dev/null 2>&1 &");
Architecturally, the best way to implement this is a work queue, with your PHP front end feeding files to a back-end daemon that does the converting. Decouple PHP from the heavy work, and your UI will always remain responsive.
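To make the decoupling concrete, here is a sketch of the enqueue half, assuming a hypothetical conversion_jobs table that is not in the original question. The upload handler records a pending job and returns immediately; the daemon picks it up later:

    <?php
    // Enqueue step for the upload handler: instead of converting inline,
    // record a pending job. Table and column names are illustrative.
    function enqueue_conversion(PDO $pdo, $fileId)
    {
        $stmt = $pdo->prepare(
            "INSERT INTO conversion_jobs (file_id, status, queued_at)
             VALUES (?, 'pending', NOW())"
        );
        $stmt->execute(array((int) $fileId));
    }

A status column also gives you an audit trail: a crashed conversion leaves a row you can find and retry, instead of a job that silently disappears.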
I haven't written PHP in a long time, but it's my understanding that any process started this way falls under the maximum execution timeout and is run under the Web server. You don't want that to be the case, and you certainly don't want a Web request to be capable of starting additional processes.

Write a simple daemon that runs outside the Web server and watches a folder for uploaded files. Your Web front end dumps the files there, and the daemon spawns a worker for each conversion, up to the number of cores you have. Architecturally, this is the wiser choice.
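A minimal sketch of such a daemon, with the caveat that PHP has no native threads, so this forks worker processes with pcntl_fork() (requires the pcntl extension, CLI only). The spool path, worker cap, and ffmpeg invocation are all assumptions:

    <?php
    // daemon.php -- run from the CLI, outside the web server.
    // Watches a drop folder and forks one worker per file, capped
    // at $maxProcs.
    $spool    = '/var/spool/video/incoming';
    $maxProcs = 4; // roughly one worker per core
    $children = 0;

    while (true) {
        // Reap any finished workers without blocking.
        while ($children > 0 && pcntl_waitpid(-1, $status, WNOHANG) > 0) {
            $children--;
        }

        foreach (glob("$spool/*.avi") as $file) {
            if ($children >= $maxProcs) {
                break; // all workers busy; try again next pass
            }

            // Claim the file so the next pass doesn't grab it too.
            $work = $file . '.working';
            if (!rename($file, $work)) {
                continue;
            }

            $pid = pcntl_fork();
            if ($pid === -1) {
                rename($work, $file); // fork failed; put the job back
                continue;
            }
            if ($pid === 0) {
                // Child: convert, then exit.
                $dst = preg_replace('/\.avi\.working$/', '.mp4', $work);
                exec('ffmpeg -i ' . escapeshellarg($work) . ' ' . escapeshellarg($dst));
                exit(0);
            }
            $children++; // parent keeps scanning
        }

        sleep(5);
    }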
See my answer to this question as well, which covers the same ground.
The PHP manual page for exec() says:

    If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.

So, yes, your exec call will do the trick: you redirect both stdout and stderr to /dev/null and background the command with &, so exec() returns immediately instead of waiting for the conversion.
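One detail worth spelling out: run through the CLI, convert.php receives its argument in $argv, not $_GET, which is why the id is passed as a plain argument rather than a ?id=123 query string (query strings only exist for web requests). A minimal sketch of the receiving end, with a hypothetical uploads table and a stubbed-in conversion command, neither of which comes from the question:

    <?php
    // convert.php -- invoked from the CLI, e.g. php convert.php 123
    if ($argc < 2) {
        fwrite(STDERR, "Usage: php convert.php <id>\n");
        exit(1);
    }
    $id = (int) $argv[1]; // CLI arguments arrive in $argv, not $_GET

    // Hypothetical lookup; adjust to your own schema.
    $pdo  = new PDO('mysql:host=localhost;dbname=videos', 'user', 'pass');
    $stmt = $pdo->prepare('SELECT path FROM uploads WHERE id = ?');
    $stmt->execute(array($id));
    $src = $stmt->fetchColumn();

    // Your real conversion and thumbnail commands go here; this is a stub.
    $dst = preg_replace('/\.avi$/i', '.mp4', $src);
    exec('ffmpeg -i ' . escapeshellarg($src) . ' ' . escapeshellarg($dst), $out, $ret);
    exit($ret);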
It will do the trick, but it's hard to manage: if you just launch background processes and never look back, failed or stuck conversions can pile up with nothing monitoring or retrying them.

How about scheduling a script that runs every few minutes and processes anything still in the queue? Do you have access to cron?
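For what it's worth, here is one way that could look; the conversion_jobs table matches the hypothetical schema sketched above, and none of these names come from the thread. flock() keeps runs from overlapping if a conversion outlasts the cron interval:

    <?php
    // poll.php -- run by cron, e.g. every five minutes:
    //   */5 * * * * /usr/bin/php /path/to/poll.php
    $lock = fopen('/tmp/convert.lock', 'c');
    if (!flock($lock, LOCK_EX | LOCK_NB)) {
        exit(0); // previous run is still converting
    }

    $pdo  = new PDO('mysql:host=localhost;dbname=videos', 'user', 'pass');
    $jobs = $pdo->query(
        "SELECT id, file_id FROM conversion_jobs WHERE status = 'pending'"
    );

    foreach ($jobs as $job) {
        $pdo->prepare("UPDATE conversion_jobs SET status = 'running' WHERE id = ?")
            ->execute(array($job['id']));

        // Synchronous is fine here: cron is waiting, not a browser.
        exec('/usr/bin/php ./convert.php ' . (int) $job['file_id'], $out, $ret);

        $pdo->prepare("UPDATE conversion_jobs SET status = ? WHERE id = ?")
            ->execute(array($ret === 0 ? 'done' : 'failed', $job['id']));
    }

    flock($lock, LOCK_UN);

Anything that fails stays queryable as a 'failed' row instead of vanishing, which answers the fire-and-forget worry above.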