Large number of SQLite inserts using PHP

Posted 2019-02-19 10:10

I have about 14000 rows of comma-separated values that I am trying to insert into a SQLite table using PHP PDO, like so:

<?php
// create a PDO object
$dbh = new PDO('sqlite:mydb.sdb');

$lines = file('/csv/file.txt'); // import lines as array
foreach ($lines as $line) {
    $line_array = explode(',', $line); // create an array of comma-separated values in each line
    $values = '';
    foreach ($line_array as $l) {
        $values .= "'$l', ";
    }
    $values = substr($values, 0, -2); // get rid of the last comma and whitespace
    $query = "insert into sqlite_table values ($values)"; // plug the value into a query statement
    $dbh->query($query); // run the query
}

?>

This query takes a long time, and to run it without interruption, I would have to use PHP-CLI. Is there a better (faster) way to do this?

Tags: php sqlite3 pdo
4 Answers
Root(大扎)
#2 · 2019-02-19 10:54

If you're looking for a bit more speed, use prepare/execute (prepared statements), so the SQL engine doesn't have to parse the query text every time.

$name = $age = '';
$insert_stmt = $dbh->prepare("insert into sqlite_table (name, age) values (:name, :age)");
// bindParam binds by reference, so each execute() picks up the current $name and $age
$insert_stmt->bindParam(':name', $name);
$insert_stmt->bindParam(':age', $age);

// do your loop here, e.g. with fgetcsv
$fh = fopen('/csv/file.txt', 'r');
while (($row = fgetcsv($fh)) !== false) {
    list($name, $age) = $row;
    $insert_stmt->execute();
}
fclose($fh);

It's counter-intuitive to do the binding outside the loop, but that is one reason this method is so fast: you're essentially saying "execute this pre-compiled query using the data currently in these variables", so the data doesn't even need to be copied around internally. You also avoid re-parsing the query, which is the problem with building statements like "insert into table (name) values ('$name')": every one of those sends the entire text string to the database to be parsed again.

One more thing to speed it up -- wrap the whole loop in a transaction, then commit the transaction when the loop is finished.
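A sketch of both suggestions combined (one prepared statement, executed inside a single transaction). This reuses the $dbh connection and CSV path from the question; the name/age columns are placeholders borrowed from the example above, not the asker's real schema, and error handling is omitted:

$dbh->beginTransaction();  // one transaction for the whole import

$insert_stmt = $dbh->prepare("insert into sqlite_table (name, age) values (:name, :age)");
$insert_stmt->bindParam(':name', $name);
$insert_stmt->bindParam(':age', $age);

$fh = fopen('/csv/file.txt', 'r');
while (($row = fgetcsv($fh)) !== false) {
    list($name, $age) = $row;
    $insert_stmt->execute();  // reuses the pre-compiled statement inside the open transaction
}
fclose($fh);

$dbh->commit();  // a single commit at the end, so SQLite syncs to disk only once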

一纸荒年 Trace。
#3 · 2019-02-19 10:57

Start a transaction before the loop and commit it after the loop. The way your code works now, SQLite starts (and commits) a separate transaction for every single insert.
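A minimal sketch of that change, assuming the $dbh handle and $lines array from the question:

$dbh->beginTransaction();  // one transaction for the whole import
foreach ($lines as $line) {
    // ... build and run each insert exactly as before ...
}
$dbh->commit();  // everything is flushed to disk in one go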

The star\"
4楼-- · 2019-02-19 10:59

From the SQLite FAQ:

Transaction speed is limited by disk drive speed because (by default) SQLite actually waits until the data really is safely stored on the disk surface before the transaction is complete. That way, if you suddenly lose power or if your OS crashes, your data is still safe. For details, read about atomic commit in SQLite. [...]

Another option is to run PRAGMA synchronous=OFF. This command will cause SQLite to not wait on data to reach the disk surface, which will make write operations appear to be much faster. But if you lose power in the middle of a transaction, your database file might go corrupt.

I'd say this last paragraph is what you need.
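If you want to try it, the PRAGMA can be issued through the same PDO handle before the inserts start. A sketch, assuming the $dbh connection from the question, and keeping in mind the corruption risk quoted above:

// trade durability for speed: SQLite no longer waits for data to reach the disk surface
$dbh->exec('PRAGMA synchronous = OFF');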

EDIT: Not sure about this, but I believe using sqlite_unbuffered_query() should do the trick.

【Aperson】
#5 · 2019-02-19 11:02

You will see a good performance gain by wrapping your inserts in a single transaction. If you don't do this, SQLite treats each insert as its own transaction.

<?php
// create a PDO object
$dbh = new PDO('sqlite:mydb.sdb');

// Start transaction
$dbh->beginTransaction();
$lines = file('/csv/file.txt'); // import lines as array
foreach ($lines as $line) {
    $line_array = explode(',', $line); // create an array of comma-separated values in each line
    $values = '';
    foreach ($line_array as $l) {
        $values .= "'$l', ";
    }
    $values = substr($values, 0, -2); // get rid of the last comma and whitespace
    $query = "insert into sqlite_table values ($values)"; // plug the value into a query statement
    $dbh->query($query); // run the query
}
// commit transaction
$dbh->commit();

?>