I've tried everything to change the max_execution_time of a PHP crawler script so that it can run indefinitely.
I have changed the php.ini setting max_execution_time to 0 and to 100000000, but with no effect.
I've also tried setting it from the PHP script itself with ini_set('max_execution_time', 0);.
All PHP scripts still throw the same error: Fatal error: Maximum execution time of 3000 seconds exceeded.
What could I be missing, and how can I make sure there is no execution time limit?
PHP script:
<?php
ini_set('max_execution_time', -1);   // directive names are lowercase and case-sensitive
error_reporting(E_ALL);              // turn on all errors, warnings and notices for easier debugging
//ini_set('max_execution_time', 123456);
ini_set('max_input_time', -1);       // no limit on time spent parsing request data
ini_set('memory_limit', '512M');
set_time_limit(0);                   // 0 = no execution time limit
date_default_timezone_set('Europe/London');
/* code which scrapes websites */
?>
phpinfo() output:

Directive            Local Value   Master Value
max_execution_time   0             0
max_input_time       -1            -1
You have to change both of these in your php.ini (and check that it's the right php.ini by finding its location in the phpinfo() output!). After that, check whether some PHP file is overriding those values locally.
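A quick way to make both checks from within the script itself is a minimal sketch like this; php_ini_loaded_file(), php_ini_scanned_files() and ini_get() are standard PHP functions, nothing specific to the question:

<?php
// Which php.ini did this SAPI actually load? Apache and the CLI often use different files.
echo 'Loaded php.ini: ' . php_ini_loaded_file() . PHP_EOL;

// Any additional .ini files parsed on top of the main one?
echo 'Extra .ini files: ' . (php_ini_scanned_files() ?: 'none') . PHP_EOL;

// Effective values as this script sees them right now (local values, after any ini_set calls).
echo 'max_execution_time: ' . ini_get('max_execution_time') . PHP_EOL;
echo 'max_input_time: '     . ini_get('max_input_time') . PHP_EOL;

Run it once through Apache and once from the command line; if the two report different php.ini paths or different values, you have been editing the wrong file.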
Open php.ini in a text editor (Notepad will do), search for upload_max_filesize and set it to 1000M, and change post_max_size to 1000M as well.
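The relevant php.ini lines would then look roughly like this (a sketch assuming a default XAMPP install, where php.ini typically sits in C:\xampp\php\):

; php.ini
upload_max_filesize = 1000M
post_max_size = 1000M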
Then restart XAMPP and refresh the local phpMyAdmin.

You shouldn't let your crawler run under Apache; it's better to run it stand-alone via the CLI, for example as part of a Gearman setup. That way it won't hog your web server and it can run as long as you want. There are many Gearman bindings you can use, including PHP of course.
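As a rough sketch of what a stand-alone worker could look like with the pecl gearman extension (the job name crawl_site and the crawling logic are placeholders, not anything from the question):

<?php
// run from the command line: php crawler-worker.php
set_time_limit(0); // the CLI has no execution time limit by default, but be explicit

$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730); // default gearmand host and port

// register a handler; a client submits "crawl_site" jobs with a URL as the workload
$worker->addFunction('crawl_site', function (GearmanJob $job) {
    $url = $job->workload();
    // ... fetch and parse $url here (placeholder for the actual scraping code) ...
    return 'done: ' . $url;
});

while ($worker->work()) {
    // loop forever, picking up jobs as they arrive
}

Because this runs under the CLI SAPI rather than Apache, max_execution_time defaults to 0 there anyway.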
What you are doing is just setting max_execution_time to some value inside your page. You can't change this using ini_set(); you can only change the memory_limit that way. See more details here... on the official PHP site. If you want those values changed, change them in php.ini.
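If you want to verify whether an ini_set() call was actually honoured in your environment, a small diagnostic sketch (not part of the answer above) is:

<?php
$before = ini_get('max_execution_time');
$result = ini_set('max_execution_time', '0'); // returns the old value on success, false on failure
$after  = ini_get('max_execution_time');

echo "before: $before, ini_set returned: " . var_export($result, true) . ", after: $after" . PHP_EOL;
// if $result is false or $after still shows the old value, the limit is being enforced
// somewhere else (a different php.ini, Apache/FPM config, or a host-level restriction)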
I found the following in the XAMPP documentation. Maybe you are trying to edit the wrong php.ini file?