I'm writing a web application in PHP with Symfony2. The user can upload a CSV file whose data is saved to the database. Parsing each row of the CSV file takes about 0.2 seconds because I make some requests to the Google Maps API.
So when a user uploads a CSV file with 5000 rows, which is a realistic case in my app, it may take around 16 minutes to parse the whole file.
I don't want the user to have to wait 16 minutes before they can continue using my app. So my question is: how can I parse the CSV file in the background, so that the user can keep browsing?
You could write a script that does nothing but process the CSV file, and exec() that script from the script that handles the upload. On *nix systems you can make the command started by exec() run in the background by redirecting its output and appending an & character (without the redirect, exec() waits for the command to finish).
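A minimal sketch of launching such a worker from the upload handler, assuming a standalone process_csv.php script that takes the uploaded file's path as an argument (the script name and $uploadedCsvPath are hypothetical):

```php
<?php
// Hypothetical worker invocation; process_csv.php and $uploadedCsvPath are assumptions.
// Redirecting output and appending '&' lets exec() return immediately
// instead of blocking until the worker has finished.
$cmd = sprintf(
    'php %s %s > /dev/null 2>&1 &',
    escapeshellarg('/path/to/process_csv.php'),
    escapeshellarg($uploadedCsvPath)
);
exec($cmd);
```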
You'll probably also want to include a script that will let the user check the progress of the processing.
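One simple way to do that, assuming the worker periodically writes a small JSON file such as {"done": 1200, "total": 5000} while it parses (the file location, bundle, and controller names are assumptions, not part of the suggestion above):

```php
<?php
// Hypothetical progress endpoint; the worker is assumed to write
// sys_get_temp_dir()/import_<jobId>.json as it works through the rows.
namespace Acme\ImportBundle\Controller;

use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\JsonResponse;

class ImportProgressController extends Controller
{
    public function progressAction($jobId)
    {
        $file = sprintf('%s/import_%s.json', sys_get_temp_dir(), basename($jobId));

        if (!file_exists($file)) {
            return new JsonResponse(array('status' => 'unknown'));
        }

        return new JsonResponse(json_decode(file_get_contents($file), true));
    }
}
```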
Have the upload insert a job into a job queue table, and have a Command, run regularly by cron, handle any pending work in that table (a rough sketch follows below).
As processing proceeds on a job, you can update the job table so the user can check back and see progress happening (e.g. with an AJAX progress bar) and can tell when the job is complete.
This way you also decouple the upload from the processing, and you can control how many jobs are processed at once. Having long-running jobs launched directly from user input, without a throttling/queuing system in place, is a great way to open yourself up to a denial-of-service attack...
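A rough sketch of such a cron-driven Command, assuming a csv_import_job table with status, file_path, and rows_done columns (the table, command name, and geocoding step are assumptions):

```php
<?php
// Hypothetical command; class, command name, and table layout are assumptions.
namespace Acme\ImportBundle\Command;

use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class ProcessCsvJobsCommand extends ContainerAwareCommand
{
    protected function configure()
    {
        $this->setName('acme:csv:process-jobs')
             ->setDescription('Processes pending CSV import jobs from the queue table');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        $conn = $this->getContainer()->get('database_connection');

        // Pick up a limited batch so a single cron run stays bounded.
        $jobs = $conn->fetchAll(
            "SELECT id, file_path FROM csv_import_job WHERE status = 'pending' LIMIT 3"
        );

        foreach ($jobs as $job) {
            $conn->update('csv_import_job', array('status' => 'running'), array('id' => $job['id']));

            $handle = fopen($job['file_path'], 'r');
            $done = 0;
            while (($row = fgetcsv($handle)) !== false) {
                // ... geocode the row via the Google Maps API and persist it ...
                $done++;
                if ($done % 50 === 0) {
                    // Update progress so an AJAX poll can render a progress bar.
                    $conn->update('csv_import_job', array('rows_done' => $done), array('id' => $job['id']));
                }
            }
            fclose($handle);

            $conn->update('csv_import_job', array('status' => 'done', 'rows_done' => $done), array('id' => $job['id']));
        }
    }
}
```

You would then run it regularly from cron, e.g. * * * * * php /path/to/app/console acme:csv:process-jobs.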
You can create a kernel.terminate event listener and do your parsing there. This event fires after the response has been sent to the browser. A sample implementation would consist of a service declaration and a listener class, both sketched below.
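A sketch of what that could look like; the bundle, service id, class names, and the session key used to hand the uploaded file's path to the listener are all assumptions. Service declaration (e.g. in the bundle's services.yml):

```yaml
services:
    acme.csv_import_listener:
        class: Acme\ImportBundle\EventListener\CsvImportListener
        arguments: ["@doctrine.orm.entity_manager"]
        tags:
            - { name: kernel.event_listener, event: kernel.terminate, method: onKernelTerminate }
```

Listener class:

```php
<?php
// Hypothetical listener; the 'pending_csv_import' session key is an assumption.
namespace Acme\ImportBundle\EventListener;

use Symfony\Component\HttpKernel\Event\PostResponseEvent;

class CsvImportListener
{
    private $em;

    public function __construct($em)
    {
        $this->em = $em;
    }

    public function onKernelTerminate(PostResponseEvent $event)
    {
        // The response has already been sent to the browser at this point,
        // so the slow Google Maps requests no longer delay the user.
        $path = $event->getRequest()->getSession()->get('pending_csv_import');
        if (!$path || ($handle = fopen($path, 'r')) === false) {
            return;
        }

        while (($row = fgetcsv($handle)) !== false) {
            // ... geocode the row and persist it with $this->em ...
        }
        fclose($handle);
    }
}
```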