I am able to insert one record into a table, but I want to insert multiple records at once.
My code is below:
var doinsert_autocommit = function (conn, cb) {
  var query = "INSERT INTO test VALUES (:id, :name)";
  // Works for a single row with values = [1, 'rat'], but I want to bind
  // multiple rows at once, e.g.:
  var values = [[1, 'rate'], [5, 'ratee']];
  conn.execute(
    query,
    values, // bind values
    { autoCommit: true }, // override the default non-autocommit behavior
    function (err, result) {
      if (err) {
        return cb(err, conn);
      } else {
        console.log("Rows inserted: " + result.rowsAffected); // 1
        return cb(null, conn);
      }
    });
};
I used the simple-oracledb library for batch inserts; it extends the oracledb module.
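For reference, a batch insert with simple-oracledb looks roughly like this. This is a sketch, not the exact code from the answer: the table and column names come from the question, and an already-open node-oracledb connection is assumed.

```javascript
// Sketch of a batch insert using simple-oracledb's batchInsert().
// The table "test" with (id, name) columns is taken from the question.
var rows = [
  { id: 1, name: 'rate' },
  { id: 5, name: 'ratee' }
];

function batchInsertRows(conn, cb) {
  // Lazy requires so the data definition above runs without the packages installed.
  var oracledb = require('oracledb');
  var SimpleOracleDB = require('simple-oracledb');
  SimpleOracleDB.extend(conn); // adds batchInsert() to this connection

  conn.batchInsert(
    'INSERT INTO test VALUES (:id, :name)',
    rows,                 // one bind object per row
    { autoCommit: true },
    function (err, results) {
      if (err) { return cb(err, conn); }
      return cb(null, conn);
    }
  );
}
```

batchInsert runs the same statement once per bind object, so the calling code stays close to the single-row version in the question.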
Update 2019/04/25:
The driver, since version 2.2, has built-in support for batch SQL execution. Use connection.executeMany() for this when possible. It offers all of the performance benefits with less complexity. See the Batch Statement Execute section of the documentation for more details: https://oracle.github.io/node-oracledb/doc/api.html#batchexecution

Previous answer:
Currently, the driver only supports array binds with PL/SQL, not direct SQL. We hope to improve this in the future. For now, you can do the following...
Given a table like the one in the question (an ID column and a name column), the following should work:
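The original answer's table definition and code block were not preserved in this copy. The following is a sketch of the pre-2.2 PL/SQL array-bind approach it describes, under stated assumptions: a table `test (id NUMBER, name VARCHAR2(30))` matching the question, and an anonymous PL/SQL block using FORALL over separately bound arrays.

```javascript
// Sketch of a PL/SQL array bind (the pre-executeMany approach).
// Assumes a table like: CREATE TABLE test (id NUMBER, name VARCHAR2(30));
// An array of row objects is broken up into one plain array per column,
// because an array of objects cannot be bound directly.
var rows = [];
for (var i = 1; i <= 500; i++) {
  rows.push({ id: i, name: 'name-' + i });
}

function toBindArrays(rowList) {
  var ids = [];
  var names = [];
  rowList.forEach(function (row) {
    ids.push(row.id);
    names.push(row.name);
  });
  return { ids: ids, names: names };
}

function doBatchInsert(conn, cb) {
  var oracledb = require('oracledb'); // lazy require; sketch only
  var split = toBindArrays(rows);

  conn.execute(
    // FORALL performs the 500 inserts inside the database in one call:
    // one roundtrip, one SQL/PLSQL context switch.
    'begin \n' +
    '  forall x in 1 .. :ids.count \n' +
    '    insert into test values (:ids(x), :names(x)); \n' +
    'end;',
    {
      ids:   { type: oracledb.NUMBER, dir: oracledb.BIND_IN, val: split.ids },
      names: { type: oracledb.STRING, dir: oracledb.BIND_IN, val: split.names, maxSize: 30 }
    },
    { autoCommit: true },
    function (err, result) {
      if (err) { return cb(err, conn); }
      return cb(null, conn);
    }
  );
}
```

The exact PL/SQL shape in the original answer may have differed slightly; the essential points it makes are the separate per-column arrays and the single FORALL execution.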
That will insert 500 rows with a single roundtrip to the database. Plus, a single context switch between the SQL and PL/SQL engines in the DB.
As you can see, the arrays have to be bound separately (you can't bind an array of objects). That's why the example breaks them up into separate arrays for binding purposes. This should all get more elegant over time, but it works for now.
Check out the executeMany() method introduced in node-oracledb 2.2. It executes one statement with many data values, generally with a significant performance benefit over calling execute() many times.
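A minimal executeMany() sketch is below. It assumes the question's `test` table and an already-open connection; the bindDefs types and maxSize are illustrative.

```javascript
// Sketch: batch insert with connection.executeMany() (node-oracledb 2.2+).
// Binds one record per row in a single call to the database.
var rows = [
  { id: 1, name: 'rate' },
  { id: 5, name: 'ratee' }
];

function insertMany(conn, cb) {
  var oracledb = require('oracledb'); // lazy require for the sketch
  conn.executeMany(
    'INSERT INTO test VALUES (:id, :name)',
    rows, // one bind record per row
    {
      autoCommit: true,
      bindDefs: {
        id:   { type: oracledb.NUMBER },
        name: { type: oracledb.STRING, maxSize: 30 }
      }
    },
    function (err, result) {
      if (err) { return cb(err, conn); }
      console.log('Rows inserted: ' + result.rowsAffected);
      return cb(null, conn);
    }
  );
}
```

With named binds like this, bindDefs tells the driver the type and maximum size of each column up front, so it does not have to inspect every record.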