How to optimize the number of execute() calls when inserting

Published 2019-09-06 20:53

I have a huge two-dimensional array containing 500k one-dimensional sub-arrays, each with 5 elements. My job is to insert all of this data into an SQLite database.

function insert_data($array) {
    global $db;
    $dbh = new PDO("sqlite:{$db}");
    // Prepare the statement once, then execute it with fresh values per row.
    $sql = "INSERT INTO quote (f1,f2,f3,f4,f5) VALUES (?,?,?,?,?)";
    $query = $dbh->prepare($sql);
    foreach ($array as $item) {
        $query->execute(array_values($item));
    }
    $dbh = null;
}

I want to optimize the insert process: execute() is currently called 500k times. How can I reduce that, ideally to a single call?

2 answers
看我几分像从前
#2 · 2019-09-06 21:37

The idea is to avoid SQLite's implicit per-insert transaction, which makes row-by-row inserts very slow. Instead, start a transaction explicitly and commit it, say, every 10k records.

$dbh->beginTransaction();
$counter = 0;
foreach ($array as $item) {
    $query->execute(array_values($item));
    // Commit every 10k rows. Pre-increment so the first commit happens
    // after 10000 inserts rather than right after the first one.
    if (++$counter % 10000 == 0) {
        $dbh->commit();
        $dbh->beginTransaction();
    }
}
$dbh->commit();

Another option: write the array to a CSV file and then bulk-import that file.
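For example, a minimal sketch of the CSV route (the quotes.csv path, the mydb.sqlite file name, and the availability of the sqlite3 command-line tool are assumptions):

// Write each row of the array as one CSV line. The file name
// quotes.csv is a placeholder for illustration.
$fp = fopen('quotes.csv', 'w');
foreach ($array as $item) {
    fputcsv($fp, array_values($item));
}
fclose($fp);

// Then import it from the shell; the CSV column order must match
// the quote table's column order:
// sqlite3 mydb.sqlite ".mode csv" ".import quotes.csv quote"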

我命由我不由天
#3 · 2019-09-06 21:42

If you are using a newer version of SQLite (3.7.11+), it supports multi-row inserts:

INSERT INTO quote (f1,f2,f3,f4,f5) VALUES
  (?,?,?,?,?),
  (?,?,?,?,?),
  (?,?,?,?,?);

You can use this to split your array into chunks and insert each chunk with a single statement. As Axalix pointed out, you should also wrap the whole operation in a transaction. A sketch of the chunked approach is below.
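A minimal sketch combining both suggestions. The chunk size of 100 is an assumption, chosen to stay under SQLite's default limit of 999 bound parameters per statement (100 rows x 5 columns = 500 placeholders):

function insert_data_batched($array) {
    global $db;
    $dbh = new PDO("sqlite:{$db}");
    $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $dbh->beginTransaction();
    // Chunk size of 100 is an assumed value; tune it for your data,
    // but keep rows * columns under the parameter limit.
    foreach (array_chunk($array, 100) as $chunk) {
        // Build one "(?,?,?,?,?)" group per row in this chunk.
        $placeholders = implode(',', array_fill(0, count($chunk), '(?,?,?,?,?)'));
        $sql = "INSERT INTO quote (f1,f2,f3,f4,f5) VALUES {$placeholders}";
        $query = $dbh->prepare($sql);

        // Flatten the chunk's rows into a single flat parameter list.
        $params = array();
        foreach ($chunk as $item) {
            foreach (array_values($item) as $value) {
                $params[] = $value;
            }
        }
        $query->execute($params);
    }
    $dbh->commit();
    $dbh = null;
}

This cuts the 500k execute() calls down to roughly 5000 multi-row statements inside one transaction, which is usually where most of the speedup comes from.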
