I'm inserting a large dataset into a MySQL table via PHP using INSERT commands, and I'm wondering whether it's possible to insert approximately 1000 rows at a time with a query, other than appending each value onto the end of a mile-long string and then executing it. I am using the CodeIgniter framework, so its functions are also available to me.
Well, you don't want to execute 1000 query calls, but building one multi-row INSERT is fine.
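A minimal sketch of that idea, assuming a mysqli connection in $mysqli and a hypothetical entries (col1, col2) table:

    // Build ONE multi-row INSERT instead of ~1000 single-row queries.
    // $mysqli is assumed to be an open mysqli connection; table/column names are placeholders.
    $data = [
        ['foo', 1],
        ['bar', 2],
        // ... up to ~1000 rows per statement
    ];

    $values = [];
    foreach ($data as $row) {
        $col1 = $mysqli->real_escape_string($row[0]);
        $col2 = (int) $row[1];
        $values[] = "('$col1', $col2)";
    }

    // implode() joins the rows once, avoiding repeated string concatenation.
    $sql = 'INSERT INTO entries (col1, col2) VALUES ' . implode(',', $values);
    $mysqli->query($sql);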
Depending on your data source, populating the array might be as easy as opening a file and dumping the contents into an array via file().

Assembling one INSERT statement with multiple rows is much faster in MySQL than one INSERT statement per row.

That said, it sounds like you might be running into string-handling problems in PHP, which is really an algorithm problem, not a language one. Basically, when working with large strings, you want to minimize unnecessary copying. Primarily, this means you want to avoid concatenation. The fastest and most memory-efficient way to build a large string, such as one for inserting hundreds of rows at once, is to take advantage of the implode() function and array assignment.

The advantage of this approach is that you don't copy and re-copy the SQL statement you've assembled so far with each concatenation; instead, PHP does it once in the implode() call. This is a big win.

If you have lots of columns to put together, and one or more are very long, you could also build an inner loop to do the same thing and use implode() to assign the values clause to the outer array.

I have created this simple function which you can use easily. You will need to pass the table name
($tbl), the table fields ($insertFieldsArr) that match the data you are inserting, and the data array ($arr).
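A minimal sketch of such a function, assuming an open mysqli connection in $conn (everything apart from the parameter names above is an assumption):

    // Sketch only: builds one multi-row INSERT from $arr for the fields in $insertFieldsArr.
    // $conn is assumed to be an open mysqli connection; table and field names come from the caller.
    function batchInsert(mysqli $conn, $tbl, array $insertFieldsArr, array $arr)
    {
        if (empty($arr)) {
            return false;
        }

        $rows = [];
        foreach ($arr as $row) {
            $escaped = [];
            foreach ($insertFieldsArr as $field) {
                $escaped[] = "'" . $conn->real_escape_string($row[$field]) . "'";
            }
            $rows[] = '(' . implode(',', $escaped) . ')';
        }

        $sql = 'INSERT INTO ' . $tbl
             . ' (' . implode(',', $insertFieldsArr) . ') VALUES '
             . implode(',', $rows);

        return $conn->query($sql);
    }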
Although it is too late to answer this question, here is my answer on the same.

If you are using CodeIgniter, you can use the built-in methods defined in the query_builder class.
$this->db->insert_batch()
Generates an insert string based on the data you supply, and runs the query. You can either pass an array or an object to the function. Here is an example using an array:
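A typical example, along the lines of the CodeIgniter documentation (table and field names here are placeholders):

    $data = array(
        array(
            'title' => 'My title',
            'name'  => 'My Name',
            'date'  => 'My date'
        ),
        array(
            'title' => 'Another title',
            'name'  => 'Another Name',
            'date'  => 'Another date'
        )
    );

    $this->db->insert_batch('mytable', $data);
    // Produces something like:
    // INSERT INTO mytable (title, name, date)
    // VALUES ('My title', 'My Name', 'My date'), ('Another title', 'Another Name', 'Another date')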
The first parameter will contain the table name, the second is an associative array of values.
You can find more details about query_builder in the CodeIgniter documentation.
Multiple insert / batch insert is now supported by CodeIgniter. I had the same problem. Though it is very late to answer this question, it may help somebody, which is why I'm answering it.
I know this is an old query, but I was just reading and thought I'd add what I found elsewhere:
mysqli in PHP 5 is an object with some good functions that will allow you to speed up the insertion time for the answer above:
Turning off autocommit when inserting many rows greatly speeds up insertion, so turn it off, then execute the inserts as mentioned above, or just build a string ($sqlCombined) that contains many INSERT statements separated by semicolons; multi_query() will handle them fine.
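A sketch of that approach, assuming an open mysqli connection in $mysqli and a hypothetical entries (col1) table:

    // Turn off autocommit so all inserts go through in one transaction.
    $mysqli->autocommit(false);

    // Many INSERT statements separated by semicolons (table/column names are placeholders).
    $sqlCombined = "INSERT INTO entries (col1) VALUES ('a');"
                 . "INSERT INTO entries (col1) VALUES ('b');"
                 . "INSERT INTO entries (col1) VALUES ('c');";

    if ($mysqli->multi_query($sqlCombined)) {
        // Flush any pending results so the connection stays usable afterwards.
        while ($mysqli->more_results() && $mysqli->next_result()) {
        }
    }

    $mysqli->commit();          // commit everything at once
    $mysqli->autocommit(true);  // restore the default behaviour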
Hope this helps someone save time (searching and inserting!)