I have a huge two-dimensional array containing 500k one-dimensional sub-arrays, each with 5 elements. My job is to insert all of this data into an SQLite database.
function insert_data($array) {
    global $db;
    $dbh = new PDO("sqlite:{$db}");
    $sql = "INSERT INTO quote (f1,f2,f3,f4,f5) VALUES (?,?,?,?,?)";
    $query = $dbh->prepare($sql);
    foreach ($array as $item) {
        $query->execute(array_values($item));
    }
    $dbh = null;
}
I want to optimize the insert process: the execute action currently runs 500k times. How can I make it execute just once (or at least far fewer times)?
The idea is to avoid running a separate transaction for each insert, because that is very slow indeed. Instead, start and commit a transaction for every batch of, say, 10k records.
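A minimal sketch of the batching idea, reusing the question's `quote` table and prepared statement; the function signature (taking a `PDO` handle directly) and the default batch size are assumptions you can adapt:

```php
<?php
// Commit in batches instead of letting SQLite wrap every INSERT in its
// own implicit transaction.
function insert_data_batched(PDO $dbh, array $rows, int $batchSize = 10000) {
    $stmt = $dbh->prepare(
        "INSERT INTO quote (f1,f2,f3,f4,f5) VALUES (?,?,?,?,?)"
    );
    $dbh->beginTransaction();
    foreach ($rows as $i => $row) {
        $stmt->execute(array_values($row));
        // Every $batchSize rows, commit and open a fresh transaction.
        if (($i + 1) % $batchSize === 0) {
            $dbh->commit();
            $dbh->beginTransaction();
        }
    }
    $dbh->commit(); // commit the final (possibly partial) batch
}
```

The prepared statement is compiled once and reused, so the per-row cost is just the bind-and-execute step plus one fsync per batch rather than one per row.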
Another solution: you can write the array out to a CSV file and then import it directly.
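A sketch of the CSV route, assuming the question's `quote` table; the file name `quotes.csv` and database file name are placeholders:

```php
<?php
// Dump each 5-element sub-array as one CSV line.
function export_csv(array $rows, string $path) {
    $fh = fopen($path, 'w');
    foreach ($rows as $row) {
        fputcsv($fh, array_values($row));
    }
    fclose($fh);
}

// Then import with the sqlite3 command-line shell:
//   sqlite3 mydb.sqlite ".mode csv" ".import quotes.csv quote"
```

The `.import` command bulk-loads the file inside a single transaction, so this can be faster than issuing inserts from PHP at all.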
If you are using a newer version of SQLite (3.7.11+), it supports multi-row batch inserts:

INSERT INTO quote (f1,f2,f3,f4,f5) VALUES (?,?,?,?,?), (?,?,?,?,?), ...;

You can use this to chunk your array into groups and do batch inserts that way. As Axalix pointed out, you should also wrap the whole operation in a transaction.
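A sketch combining both suggestions; the chunk size of 100 is an assumption chosen so that 100 rows × 5 columns = 500 placeholders stays under SQLite's default limit of 999 bound variables:

```php
<?php
// Multi-row INSERT (SQLite 3.7.11+), chunked and wrapped in one transaction.
function insert_multirow(PDO $dbh, array $rows, int $chunkSize = 100) {
    $dbh->beginTransaction();
    foreach (array_chunk($rows, $chunkSize) as $chunk) {
        // Build "(?,?,?,?,?),(?,?,?,?,?),..." for this chunk.
        $placeholders = implode(',', array_fill(0, count($chunk), '(?,?,?,?,?)'));
        $stmt = $dbh->prepare(
            "INSERT INTO quote (f1,f2,f3,f4,f5) VALUES $placeholders"
        );
        // Flatten the chunk into a single flat list of bound values.
        $values = [];
        foreach ($chunk as $row) {
            foreach (array_values($row) as $v) {
                $values[] = $v;
            }
        }
        $stmt->execute($values);
    }
    $dbh->commit();
}
```

Note that the last chunk is usually smaller than `$chunkSize`, which is why the statement is rebuilt per chunk rather than prepared once.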