I made a script that builds a raw query string and inserts hundreds of rows in one statement. It works, but it doesn't offer the protections that prepared statements do. I then modified the script to use prepared statements. It works, however it is significantly slower: it takes much longer to insert the rows than the raw-query version, because it runs the prepared insert statement one row at a time instead of inserting hundreds of rows at once.
Here's a snippet of the prepared statement code:
for( $j = 0; $j < $abilitiesMax - 2; $j++ ){
    $stmtAbility->bind_param('iiiii', $abilityArray[$i]["match_id"], $abilityArray[$i]["player_slot"],
        $abilityArray[$i][$j]["ability"], $abilityArray[$i][$j]["time"], $abilityArray[$i][$j]["level"] );
    if( !($stmtAbility->execute()) ){
        echo "<p>$db->error</p>";
        echo "<p>ERROR: when trying to insert abilities query</p>";
    }
}
It gets the job done, but only after hundreds upon hundreds of inserts. Is there a way to bind lists or arrays to the bind_param() arguments and run $stmtAbility->execute() just once, or some other method that can speed up performance?
Sorry if this has been asked and answered before. I looked around for a while and found some similar questions but nothing that answered what I was asking for explicitly.
It's possible to prepare a bulk insert statement by constructing the query on the fly, but it takes a few tricks. The most important bits are using str_pad() to construct a query string of variable length, and using call_user_func_array() to call bind_param() with a variable number of parameters.
This function takes your $db as a mysqli instance, a table name, an array of field names, and a flat array of references to values. It inserts up to 500 records per query, re-using prepared statements when possible. It returns true if all of the inserts succeeded, or false if any of them failed. Caveats: if the length of $values is not an even multiple of the length of $fields, the final chunk will probably fail at the preparation stage. The $types parameter should match the length of $fields in most cases, particularly when some of the types differ.

With this function defined, your example code can be replaced with something like:
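(Sketch only: the table name abilities and the field list are placeholders, and the loop mirrors the snippet from the question.)

// Adjust the table and field names to match your schema.
$fields  = array('match_id', 'player_slot', 'ability', 'time', 'level');
$inserts = array();

for( $j = 0; $j < $abilitiesMax - 2; $j++ ){
    $inserts[] = &$abilityArray[$i]["match_id"];
    $inserts[] = &$abilityArray[$i]["player_slot"];
    $inserts[] = &$abilityArray[$i][$j]["ability"];
    $inserts[] = &$abilityArray[$i][$j]["time"];
    $inserts[] = &$abilityArray[$i][$j]["level"];
}

if( !insertBulkPrepared($db, 'abilities', $fields, $inserts, 'iiiii') ){
    echo "<p>$db->error</p>";
    echo "<p>ERROR: when trying to insert abilities query</p>";
}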
Those ampersands are important, because mysqli_stmt::bind_param expects references, which aren't provided by call_user_func_array in recent versions of PHP.

You didn't give us the original prepared statement, so you probably need to adjust the table and field names. It also looks like your code sits inside a loop over $i; in that case, only the for loop needs to be inside the outer loop. If you take the other lines outside the loop, you will use a bit more memory constructing the $inserts array, in return for much more efficient bulk inserts.

It's also possible to rewrite insertBulkPrepared() to accept a multi-dimensional array, eliminating one source of potential error, but that requires flattening the array after chunking it.
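A rough sketch of that variant, with all details assumed rather than drawn from an existing implementation: each element of $rows is one ordered row of plain values, the chunking happens on whole rows, and each chunk is flattened into references just before binding.

// Sketch of a row-oriented variant: callers pass plain rows, so they no
// longer build a flat array of references themselves.
function insertBulkPreparedRows($db, $table, $fields, $rows, $types)
{
    $fieldCount = count($fields);
    $group      = '(' . str_pad('', $fieldCount * 3 - 2, '?, ') . ')';
    $stmt       = null;
    $lastCount  = 0;
    $success    = true;

    // Chunk by whole rows (500 at a time), then flatten each chunk.
    foreach (array_chunk($rows, 500) as $chunk) {
        $rowCount = count($chunk);

        if ($rowCount !== $lastCount) {
            $sql = 'INSERT INTO ' . $table
                 . ' (' . implode(', ', $fields) . ') VALUES '
                 . implode(', ', array_fill(0, $rowCount, $group));
            $stmt      = $db->prepare($sql);
            $lastCount = $rowCount;
        }
        if (!$stmt) {
            return false;
        }

        // Flatten after chunking: build the references bind_param() needs.
        $params = array(str_pad('', $rowCount * $fieldCount, $types));
        foreach ($chunk as &$row) {
            foreach (array_keys($row) as $key) {
                $params[] = &$row[$key];
            }
        }
        unset($row);

        call_user_func_array(array($stmt, 'bind_param'), $params);
        $success = $stmt->execute() && $success;
    }

    return $success;
}

With that version the caller passes plain rows, for example array($matchId, $playerSlot, $ability, $time, $level), and no longer has to build an array of references by hand.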