I am using Laravel 5.6.
My script to insert a large amount of data looks like this:
...
$insert_data = [];
foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);
    $posting_date = $posting_date->format('Y-m-d');

    $data = [
        'item_no'      => $value['Item_No'],
        'entry_no'     => $value['Entry_No'],
        'document_no'  => $value['Document_No'],
        'posting_date' => $posting_date,
        ....
    ];

    $insert_data[] = $data;
}
\DB::table('items_details')->insert($insert_data);
I have tried inserting 100 records with this script and it works: the data is inserted successfully.
But when I try to insert 50,000 records it becomes very slow. I waited about 10 minutes and it did not finish; instead I get an error like this:
504 Gateway Time-out
How can I solve this problem?
It makes no sense to build a plain array and then convert it to a collection. You can get rid of the intermediate array and work with a collection from the start, as in the sketch below.
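A minimal sketch of that idea, reusing the column mapping from the question (the chunk size of 500 is just an assumption to tune for your database):

collect($json['value'])
    ->map(function ($value) {
        // Build one row per source record, same columns as in the question
        return [
            'item_no'      => $value['Item_No'],
            'entry_no'     => $value['Entry_No'],
            'document_no'  => $value['Document_No'],
            'posting_date' => Carbon::parse($value['Posting_Date'])->format('Y-m-d'),
            // ... the remaining columns from the question
        ];
    })
    ->chunk(500) // 500 rows per INSERT statement
    ->each(function ($chunk) {
        \DB::table('items_details')->insert($chunk->toArray());
    });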
As was already stated, chunking alone won't really help you if this is an execution-time problem. I think the single bulk insert you are trying to run cannot handle that amount of data in one statement, so I see two options:
1 - Reorganise your code to properly use chunks; it will look something like this:
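A possible sketch, keeping the $insert_data array built in the question (the chunk size of 500 is an assumption):

// Split the 50,000 rows into smaller pieces and run one INSERT per piece
foreach (array_chunk($insert_data, 500) as $chunk) {
    \DB::table('items_details')->insert($chunk);
}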
This way each bulk insert contains less data and can be processed reasonably quickly.
2 - In case your host supports runtime configuration overrides, you can raise the execution time limit right before the code starts to execute:
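For instance (assuming the host permits changing max_execution_time at runtime; the 300-second value is only illustrative):

// Allow this request to run longer than the default 30 seconds
ini_set('max_execution_time', 300);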
To read more, go to the official docs.