PHP failed to open stream: Too many open files

Published 2019-05-05 18:49

I have a problem with the error: PHP failed to open stream: Too many open files.

I have looked at various answers here on Stack Overflow, but I have been unable to solve this issue. I have mainly tried increasing the maximum number of open files:

I have edited /etc/security/limits.conf where I specified this:

*       soft    nofile      10000
*       hard    nofile      30000

After saving and logging out / rebooting the box, the command:

ulimit -n

Still prints out 1024. I am not sure why this has no effect, and I think this is the reason I get the PHP error. If needed I can paste the whole file or any other configuration file. I am using PHP 5.6, nginx 1.8.0 and php-fpm.
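One thing worth checking: ulimit -n only reports the limit of the current login shell, while the limit that actually applies is the one of the running php-fpm process. A sketch of how to read it (the PID file path is an assumption for a Debian-style PHP 5.6 setup; it falls back to the current shell if php-fpm is not found):

```shell
# Read the effective open-files limit of the php-fpm master process.
# /var/run/php5-fpm.pid is an assumed path; pgrep is used as a fallback,
# and if neither finds php-fpm we inspect the current shell instead.
PID=$(cat /var/run/php5-fpm.pid 2>/dev/null || pgrep -o php-fpm)
grep "Max open files" /proc/${PID:-self}/limits
```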

The solution which works for me now is to manually restart nginx with:

service nginx restart

After this, things work again. The problem mainly occurs when I run unit tests or Behat tests, or when I make a lot of requests to the web server.

Tags: php unix nginx
3 Answers
Root(大扎) · 2019-05-05 19:00

You should increase the per-user file limit for the user running the PHP processes. Check which user your PHP processes run as and raise the limit for that user. You can do it like this:

$ cat /etc/security/limits.conf
*        hard    nofile    500000
*        soft    nofile    500000
root     hard    nofile    500000
root     soft    nofile    500000
www-data hard    nofile    500000
www-data soft    nofile    500000

reference: https://rtcamp.com/tutorials/linux/increase-open-files-limit/
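Besides limits.conf, both php-fpm and nginx have their own per-process knobs for this. A sketch of the relevant directives (the pool file path assumes a Debian-style PHP 5.6 layout; the values are illustrative):

```ini
; /etc/php5/fpm/pool.d/www.conf — raise the fd rlimit for the pool's workers
rlimit_files = 65536
```

```nginx
# /etc/nginx/nginx.conf (main context) — same for nginx worker processes
worker_rlimit_nofile 65536;
```

Both services need a restart for the new limits to take effect.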

来,给爷笑一个 · 2019-05-05 19:07

Sounds like a long-running process is opening files and not closing them properly. Do you have any idea which process might be doing that? Are you doing something that you would expect to open a large number of files? It could be an issue with your unit-testing library. I'm not familiar with Behat; have you searched for this error specifically in relation to the libraries/software you're using? When you talk about "making a lot of requests to the web server", are those all concurrent requests, which might well cause lots of file handles to be opened?

Ultimately, I think you need to solve the problem - if it is, indeed, a problem - of opening way more files than you're expecting to.
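One way to narrow down the culprit (assuming a Linux box with /proc and pgrep available) is to count the descriptors each php-fpm process is holding while the tests run; a steadily growing count points at a leak:

```shell
# Count the open file descriptors of each php-fpm process.
# Run this repeatedly during a test run and watch for growth.
for pid in $(pgrep php-fpm); do
    echo "php-fpm pid $pid: $(ls /proc/$pid/fd | wc -l) open fds"
done
```

lsof -p <pid> on the worst offender then shows which files those descriptors actually point to.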

家丑人穷心不美 · 2019-05-05 19:09

The solution was to do vagrant halt and then vagrant ssh again. After that, ulimit -n printed out 10000. It looks like simply logging out and back in as the user was not enough for some reason.
