I would like to insert multiple rows with a single INSERT
query, for example:
INSERT INTO tmp(col_a,col_b) VALUES('a1','b1'),('a2','b2')...
Is there a way to do this easily, preferably for an array of objects like these:
[{col_a:'a1',col_b:'b1'},{col_a:'a2',col_b:'b2'}]
I might end up with 500 records in one chunk, so running multiple queries would be undesirable.
So far I have been able to do it for a single object only:
INSERT INTO tmp(col_a,col_b) VALUES(${col_a},${col_b})
As a side question: are insertions using the ${} notation protected against SQL injection?
I'm the author of pg-promise.
In older versions of the library this was covered by simplified examples within the Performance Boost article, which is still an important read when writing high-performance database applications.
The newer approach is to rely on the helpers namespace, which is ultimately flexible, and highly optimized for performance.
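For example, a minimal sketch of that approach, using pgp.helpers.ColumnSet and pgp.helpers.insert with the table and columns from the question (the connection string is hypothetical):

```js
const pgp = require('pg-promise')({
    capSQL: true // capitalize generated SQL (optional)
});

// hypothetical connection string:
const db = pgp('postgres://user:password@localhost:5432/mydb');

// Create the ColumnSet once, statically, and reuse it,
// so it can cache its formatting templates for performance:
const cs = new pgp.helpers.ColumnSet(['col_a', 'col_b'], {table: 'tmp'});

// input data, as in the question:
const values = [
    {col_a: 'a1', col_b: 'b1'},
    {col_a: 'a2', col_b: 'b2'}
];

// generate a single multi-row INSERT query:
const query = pgp.helpers.insert(values, cs);
//=> INSERT INTO "tmp"("col_a","col_b") VALUES('a1','b1'),('a2','b2')

// execute it (within an async function):
await db.none(query);
```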
See API: ColumnSet, insert.
Such an insert doesn't even require a transaction, because if one set of values fails to insert, none will insert.
And you can use the same approach to generate any of the following queries:
single-row INSERT
multi-row INSERT
single-row UPDATE
multi-row UPDATE
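For instance, a multi-row UPDATE can be generated in much the same way. The sketch below assumes table tmp also has an id column; for multi-row updates the WHERE clause is not generated and has to be appended manually:

```js
// '?id' marks the column as condition-only, so it is not updated:
const csUpdate = new pgp.helpers.ColumnSet(['?id', 'col_a', 'col_b'], {table: 'tmp'});

const data = [
    {id: 1, col_a: 'a1-new', col_b: 'b1-new'},
    {id: 2, col_a: 'a2-new', col_b: 'b2-new'}
];

// 't' and 'v' are the default aliases for the target table and the value set:
const updateQuery = pgp.helpers.update(data, csUpdate) + ' WHERE v.id = t.id';

// within an async function:
await db.none(updateQuery);
```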
As for the side question: yes, but not alone. If you are inserting schema/table/column names dynamically, it is important to use SQL Names, which in combination will protect your code from SQL injection.
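For illustration, a sketch of SQL Names via the :name filter, with the table name standing in for a dynamically supplied identifier:

```js
// within an async function:
await db.none('INSERT INTO ${table:name}(col_a, col_b) VALUES(${col_a}, ${col_b})', {
    table: 'tmp',  // escaped as "tmp" by the :name filter
    col_a: 'a1',   // formatted as regular values
    col_b: 'b1'
});
```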
Related question: PostgreSQL multi-row updates in Node.js
extras
Q: How to get the id of each new record at the same time?

A: Simply by appending RETURNING id to your query, and executing it with method many; or even better, get the id-s and convert the result into an array of integers, using method map:
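A sketch of both variants, reusing values and cs from the first example (the returned id values are illustrative):

```js
// within an async function:

// appending RETURNING id, and executing with method many:
const query = pgp.helpers.insert(values, cs) + ' RETURNING id';
const rows = await db.many(query);
//=> e.g. [{id: 1}, {id: 2}]

// or with method map, converting each id to an integer right away:
const ids = await db.map(query, undefined, row => +row.id);
//=> e.g. [1, 2]
```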
To understand why we used + there, see: pg-promise returns integers as strings.

UPDATE-1
For inserting a huge number of records, see Data Imports.
UPDATE-2
Using v8.2.1 and later, you can wrap the static query generation into a function, so that it is generated within the query method, which will then reject if the query generation fails:
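A sketch of that, again reusing values and cs from the first example:

```js
// within an async function:

// the query is generated inside a function, so if the generation throws,
// the method rejects instead of the error escaping synchronously:
const ids = await db.map(
    () => pgp.helpers.insert(values, cs) + ' RETURNING id',
    undefined,
    row => +row.id
);
//=> e.g. [1, 2]
```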