I have an application written in C++ that uses OpenCV 2.0, curl and the OpenSURF library. First, a PHP script (cron.php) calls proc_open to launch the C++ application (called icomparer). When icomparer finishes processing N images, it returns groups saying which images are the same. After that, the script runs:
shell_exec('php cron.php > /dev/null 2>&1 &');
die;
and starts over. Well, after 800 or 900 iterations my icomparer starts breaking: the system won't let me create any more files or pipes, in both icomparer and the PHP script:
proc_open(): unable to create pipe Too many open files (2)
shell_exec(): Unable to execute 'php cron.php > /dev/null 2>&1 &'
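As far as I understand, "Too many open files" means the PHP process has hit its per-process descriptor limit (usually 1024 by default on CentOS). A quick way to check the limit from inside the script, assuming the POSIX extension is loaded:

    <?php
    // Show this process's open-file limits (requires the POSIX extension).
    $limits = posix_getrlimit();
    echo 'soft openfiles: ', $limits['soft openfiles'], "\n";
    echo 'hard openfiles: ', $limits['hard openfiles'], "\n";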
And curl fails too:
couldn't resolve host name (6)
Everything crashes. I think I'm doing something wrong; for example, I don't know whether starting another PHP process from a PHP process actually releases the first one's resources.
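For reference, this is roughly how cron.php drives icomparer — a simplified sketch with placeholder paths and arguments, not the real script; the fclose()/proc_close() calls at the end are where I assume the pipe descriptors get released:

    <?php
    // Simplified sketch: launch icomparer and read its stdout.
    // ('/path/to/icomparer' and the arguments are placeholders.)
    $spec = array(
        0 => array('pipe', 'r'),   // icomparer's stdin
        1 => array('pipe', 'w'),   // icomparer's stdout (the group list)
        2 => array('pipe', 'w'),   // icomparer's stderr
    );

    $proc = proc_open('/path/to/icomparer img1.jpg img2.jpg', $spec, $pipes);
    if (is_resource($proc)) {
        fclose($pipes[0]);                          // nothing to send
        $groups = stream_get_contents($pipes[1]);   // the group list
        fclose($pipes[1]);
        fclose($pipes[2]);
        proc_close($proc);   // without this the descriptors stay open
    }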
In "icomparer" I'm closing all opened files. Maybe not releasing all mutex with mutex_destroy... but in each iterator the c++ application is closed, I think that all stuff is released right?
What should I watch for? I have tried monitoring open files with lsof.
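From inside the script itself, a cheaper check is to count the entries in /proc/self/fd (Linux-specific):

    <?php
    // Count this process's open descriptors via /proc (Linux-specific).
    $fds = scandir('/proc/self/fd');
    echo 'open descriptors: ', count($fds) - 2, "\n";   // minus '.' and '..'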
- PHP 5.2
- CentOS 5.x
- 1 GB RAM
- 120 GB hard disk (4% used)
- 4 x Intel Xeon
- It's a VPS (the host machine has 16 GB RAM)
- The icomparer process opens 10 threads and joins them.