I'm facing a fatal error while trying to manipulate a huge array of arrays in PHP and return the result as the response to an HTTP POST request:
Allowed memory size of 536870912 bytes exhausted
I have already tried setting ini_set('memory_limit', '-1');
to see if I could get the result, but I never received any response; Postman crashed every time I made the POST request.
The starting structure of the array is shown below. The body size is around 25 MB. The main array contains around 22k arrays with this structure; I have included just two of them:
Array
(
    [0] => Array
        (
            [id] => 14
            [isActive] => 1
            [personId] => 0023fff16d353d16a052a267811af53bc8bd42f51f2266a2904ca41db19dfd32_0
            [gender] => m
            [age] => 69
            [linedata_0] => 2018-03-01 17:15:18, 155.59, 294.076; 2018-03-01 17:16:04, 502.968, 249.947; 2018-03-01 17:16:44, 276.837, 270.593; 2018-03-01 17:17:28, 431.68, 371.14; 2018-03-01 17:17:34, 851.622, 355.915
        )

    [1] => Array
        (
            [id] => 180
            [isActive] => 1
            [personId] => 02659982ae8286409cc5bb283089871b62f2bafbbad517941d64e77ecf2b62b1_0
            [gender] => m
            [age] => 69
            [linedata_0] => 2018-03-01 13:20:05, 155.599, 293.841; 2018-03-01 13:20:48, 495.468, 249.582; 2018-03-01 13:21:28, 258.791, 260.748; 2018-03-01 13:23:20, 859.061, 352.237; 2018-03-01 13:23:32, 56.1404, 269.858
        )

)
Below is the PHP code that manipulates the array to produce the expected final result, exploding the timestamps and coordinates for each user:
$final_result = [];
foreach ($query_result as $row) {
    // Each linedata_0 value holds "timestamp, x, y" triples separated by ";"
    $line_datas = explode(";", $row["linedata_0"]);
    $linedata = [];
    $final = [];
    $d = [];
    for ($s = 0; $s < count($line_datas); $s++) {
        $line_data = explode(",", $line_datas[$s]);
        $d["timestamp"] = utf8_encode($line_data[0]);
        $d["x"] = utf8_encode($line_data[1]);
        $d["y"] = utf8_encode($line_data[2]);
        array_push($linedata, $d);
    }
    $final["id"] = $row["id"];
    $final["isActive"] = $row["isActive"];
    $final["personId"] = utf8_encode($row["personId"]);
    $final["name"] = NULL;
    $final["gender"] = utf8_encode($row["gender"]);
    $final["age"] = utf8_encode($row["age"]);
    $final["linedata"] = $linedata;
    array_push($final_result, $final);
}
return $final_result;
As far as I can see, there are no infinite loops or bad practices that would justify a memory issue. The only real problem could be the size of the array that needs to be manipulated.
Any suggestions?
You are collecting a large amount of data into the array, and only then returning it.
If you instead collect a single $final item and yield it inside the foreach loop, rather than pushing it into an ever-growing variable, you will still be able to foreach over the function call. Here is a simplistic example, where $i stands in as a sample return value instead of your $final array of collected data.
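A minimal sketch of the idea; the function name generateItems() is just a placeholder for your own:

function generateItems()
{
    for ($i = 0; $i < 22000; $i++) {
        // yield hands one value back to the caller at a time;
        // the 22k results are never held in memory together.
        yield $i;
    }
}

foreach (generateItems() as $item) {
    // Only one item is in memory at any point; process it here.
}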
Information on yield in PHP: https://www.php.net/manual/en/language.generators.syntax.php
It's bad practice to work with big data sets entirely in memory like this.
Imagine it: you have a variable $a which contains 22k arrays, and you start building a second variable $b which will also contain 22k arrays.
So at the end of your script you have two variables, each holding 22k arrays.
To avoid this, you should fetch your data in batches, for example 500 rows per loop iteration, as sketched below.
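A minimal sketch of batched fetching, assuming a PDO connection $pdo and a table name tracks (both hypothetical):

$batchSize = 500;
$offset = 0;

do {
    // Fetch only $batchSize rows at a time instead of the whole table.
    $stmt = $pdo->prepare("SELECT * FROM tracks LIMIT :limit OFFSET :offset");
    $stmt->bindValue(':limit', $batchSize, PDO::PARAM_INT);
    $stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
    $stmt->execute();
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    foreach ($rows as $row) {
        // Transform and stream/write each row here, then discard it.
    }

    $offset += $batchSize;
} while (count($rows) === $batchSize);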
This answer is an example of how to implement a buffer (a limited array in memory) in your code: when it is filled, its contents are flushed to disk, and at the end you will find the huge array on disk in JSON format. I used this approach in a situation similar to yours and got a great result regarding memory usage, but as I told you in the comments, you need to rethink why you need that HUGE array in the first place, and if there is a way to avoid it, go with it.
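A minimal sketch of such a buffering function; the output path and the flush threshold are arbitrary choices, tune them as needed:

function buildJsonOnDisk($query_result)
{
    $outFile = fopen('/path/to/output.json', 'w'); // arbitrary output path
    $buffer = '';
    $maxBufferSize = 1024 * 1024; // flush roughly every 1 MB
    $first = true;

    fwrite($outFile, '[');
    foreach ($query_result as $row) {
        // Encode one row at a time instead of accumulating them in an array.
        $buffer .= ($first ? '' : ',') . json_encode($row);
        $first = false;

        if (strlen($buffer) >= $maxBufferSize) {
            fwrite($outFile, $buffer); // flush the buffer to disk
            $buffer = '';
        }
    }
    fwrite($outFile, $buffer . ']'); // flush the remainder, close the JSON array
    fclose($outFile);
}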
Using this function will save you the memory used by your $final_result array and replace it with a $final_result string buffer whose memory use we control. However, your $query_result array will still take the memory it needs. Note that you need to adapt the function to your needs, because I used your variables, which are undefined in my code.
Here is another, simpler version of the function for testing:
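A sketch of a test version along the same lines, with made-up sample rows and a deliberately tiny flush threshold so the buffering is observable:

function testBuffer()
{
    $outFile = fopen('/tmp/test.json', 'w');
    $buffer = '';

    fwrite($outFile, '[');
    for ($i = 0; $i < 10; $i++) {
        // Fake rows stand in for the real query result.
        $buffer .= ($i ? ',' : '') . json_encode(['id' => $i]);

        if (strlen($buffer) >= 64) { // tiny threshold just for the test
            fwrite($outFile, $buffer);
            $buffer = '';
        }
    }
    fwrite($outFile, $buffer . ']');
    fclose($outFile);

    echo file_get_contents('/tmp/test.json'); // inspect the result
}

testBuffer();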