I have about 14,000 rows of comma-separated values that I am trying to insert into an SQLite table using PHP PDO, like so:
<?php
// create a PDO object
$dbh = new PDO('sqlite:mydb.sdb');
$lines = file('/csv/file.txt'); // import lines as an array
foreach ($lines as $line) {
    $line_array = explode(',', trim($line)); // split each line on commas
    $values = '';
    foreach ($line_array as $l) {
        $values .= "'$l', ";
    }
    $values = substr($values, 0, -2); // get rid of the trailing comma and space
    $query = "insert into sqlite_table values ($values)"; // plug the values into a query statement
    $dbh->query($query); // run the query
}
?>
This query takes a long time, and to run it without interruption I would have to use PHP-CLI. Is there a better (faster) way to do this?
If you're looking for a bit more speed, use prepare/execute, so the SQL engine doesn't have to parse the text string each time.
It's counter-intuitive that you do the binding outside the loop, but this is one reason why this method is so fast: you're basically saying "execute this pre-compiled query using the current values of these variables", so the engine doesn't even need to move the data around internally. It also avoids re-parsing the query, which is the problem with something like "insert into table (name) values ('$name')": every such query sends the entire text string to the database to be parsed again.
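A minimal sketch of that pattern (the two-field-per-line layout and the column names col1 and col2 are my assumption; adjust to your actual schema):

<?php
$dbh = new PDO('sqlite:mydb.sdb');

// Prepared once: the engine parses and compiles the statement a single time
$stmt = $dbh->prepare('INSERT INTO sqlite_table (col1, col2) VALUES (:col1, :col2)');

// Bound once, outside the loop, by reference to these variables
$stmt->bindParam(':col1', $col1);
$stmt->bindParam(':col2', $col2);

$lines = file('/csv/file.txt');
foreach ($lines as $line) {
    list($col1, $col2) = explode(',', trim($line)); // assumes two fields per line
    $stmt->execute(); // re-runs the compiled query with the current variable values
}
?>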
One more thing to speed it up -- wrap the whole loop in a transaction, then commit the transaction when the loop is finished.
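As a sketch, that just means bracketing the loop from the example above:

$dbh->beginTransaction();
foreach ($lines as $line) {
    list($col1, $col2) = explode(',', trim($line));
    $stmt->execute();
}
$dbh->commit(); // all ~14,000 inserts are written out as a single transaction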
Start a transaction before the loop and commit it after the loop.
The way your code works now, SQLite starts (and commits) a transaction on every insert.
From the SQLite FAQ:
I'd say this last paragraph is what you need.
EDIT: Not sure about this, but I believe using sqlite_unbuffered_query() should do the trick.
You will see a good performance gain by wrapping your inserts in a single transaction. If you don't, SQLite treats each insert as its own transaction.