I need to insert multiple rows with one query (the number of rows is not constant), so I need to execute a query like this one:
```sql
INSERT INTO t (a, b) VALUES (1, 2), (3, 4), (5, 6);
```
The only way I know is

```python
args = [(1,2), (3,4), (5,6)]
args_str = ','.join(cursor.mogrify("%s", (x, )) for x in args)
cursor.execute("INSERT INTO t (a, b) VALUES " + args_str)
```
but I want some simpler way.
The new `execute_values` method in Psycopg 2.7 handles this directly; in Psycopg 2.6 the pythonic way is to build a records-list template, as explained below.
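A sketch of the `execute_values` approach (the table and column names come from the question; the connection setup is omitted, so the call is wrapped in a function that expects an open cursor):

```python
ROWS = [(1, 2), (3, 4), (5, 6)]
SQL = "INSERT INTO t (a, b) VALUES %s"  # a single %s stands for the whole records list

def insert_rows(cursor, rows=ROWS):
    # Imported lazily so the sketch can be read without psycopg2 installed.
    # Requires psycopg2 >= 2.7.
    from psycopg2.extras import execute_values
    # execute_values renders the rows into one multi-row VALUES clause
    # and executes a single INSERT statement.
    execute_values(cursor, SQL, rows)
```

Note that `execute_values` also batches very large row lists via its `page_size` parameter.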
Explanation: if the data to be inserted is given as a list of tuples, like in `args = [(1, 2), (3, 4), (5, 6)]`, then it is already in the exact required format, because:

- the `values` syntax of the `insert` clause expects a list of records, as in `insert into t (a, b) values (1, 'x'), (2, 'y')`;
- Psycopg adapts a Python `tuple` to a Postgresql `record`.

The only necessary work is to provide a records-list template to be filled by psycopg and place it in the `insert` query. Printing the resulting `insert_query` shows the template with one `%s` per record; the usual Psycopg argument substitution then fills it, and `cursor.mogrify` can be used to test exactly what will be sent to the server.
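The steps above can be sketched end to end (variable names are mine; `run` needs a live cursor and is not called here):

```python
args = [(1, 2), (3, 4), (5, 6)]

# The records-list template: one "%s" per record. psycopg2 adapts
# each Python tuple to a Postgres record when substituting.
records_list_template = ','.join(['%s'] * len(args))
insert_query = 'insert into t (a, b) values {}'.format(records_list_template)
print(insert_query)  # insert into t (a, b) values %s,%s,%s

def run(cursor):
    # The usual psycopg2 argument substitution:
    cursor.execute(insert_query, args)
    # Or just test what will be sent to the server:
    print(cursor.mogrify(insert_query, args))
```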
Output:
A snippet from Psycopg2's tutorial page at Postgresql.org (see bottom):
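The snippet in question is an `executemany` call with named placeholders; roughly (the exact table and column names in the tutorial may differ):

```python
NAMEDICT = (
    {"first_name": "Joshua", "last_name": "Drake"},
    {"first_name": "Steven", "last_name": "Foo"},
    {"first_name": "David", "last_name": "Bar"},
)
SQL = "INSERT INTO bar (first_name, last_name) VALUES (%(first_name)s, %(last_name)s)"

def insert_people(cursor):
    # executemany runs the statement once per dictionary; it is not a
    # single multi-row INSERT, but the named placeholders keep it tidy.
    cursor.executemany(SQL, NAMEDICT)
```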
It doesn't save much code, but it definitely looks better.
`cursor.copy_from` is the fastest solution I've found for bulk inserts by far. Here's a gist I made containing a class named `IteratorFile` which allows an iterator yielding strings to be read like a file. We can convert each input record to a string using a generator expression. So the solution would be:
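A minimal, hypothetical reimplementation of that idea (the gist's real `IteratorFile` is more complete; the table and column names are the question's):

```python
import io

class IteratorFile(io.TextIOBase):
    """A file-like object that reads lines from an iterator on demand."""

    def __init__(self, lines):
        self._lines = iter(lines)
        self._buf = ''

    def read(self, size=-1):
        # Pull rows from the iterator only as copy_from asks for more data,
        # so at most one record (plus a partial buffer) is held in memory.
        while size < 0 or len(self._buf) < size:
            try:
                self._buf += next(self._lines) + '\n'
            except StopIteration:
                break
        if size < 0:
            out, self._buf = self._buf, ''
        else:
            out, self._buf = self._buf[:size], self._buf[size:]
        return out

def bulk_copy(cursor, args):
    # Tab-separated text is copy_from's default format.
    f = IteratorFile("{}\t{}".format(a, b) for a, b in args)
    cursor.copy_from(f, 't', columns=('a', 'b'))
```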
For this trivial size of args it won't make much of a speed difference, but I see big speedups when dealing with thousands of rows or more. It will also be more memory efficient than building a giant query string: an iterator only ever holds one input record in memory at a time, whereas building the query string will at some point exhaust the memory of your Python process or of Postgres.
I've been using ant32's answer above for several years. However, I've found that it throws an error in Python 3 because `mogrify` returns a byte string. Converting explicitly to byte strings is a simple solution for making code Python 3 compatible.
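A sketch of the fix, decoding each `mogrify` result before joining (the `(%s,%s)` template matches the question's two-column table; the functions expect an open cursor):

```python
def build_values(cursor, args):
    # Under Python 3, mogrify returns bytes; decode before joining.
    return ','.join(cursor.mogrify("(%s,%s)", x).decode('utf-8') for x in args)

def insert_rows(cursor, args):
    cursor.execute("INSERT INTO t (a, b) VALUES " + build_values(cursor, args))
```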
If you're using SQLAlchemy, you don't need to mess with hand-crafting the string, because SQLAlchemy supports generating a multi-row `VALUES` clause for a single `INSERT` statement.

If you want to insert multiple rows within one insert statement (assuming you are not using the ORM), the easiest way so far for me would be to use a list of dictionaries. Here is an example:
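The original example isn't reproduced here; one way to get this behavior from a list of dictionaries with plain psycopg2 is `execute_values` with a named-placeholder template (a sketch under that assumption; the table and keys are mine):

```python
ROWS = [
    {"a": 1, "b": 2},
    {"a": 3, "b": 4},
    {"a": 5, "b": 6},
]
SQL = "INSERT INTO t (a, b) VALUES %s"
TEMPLATE = "(%(a)s, %(b)s)"  # how each dictionary is rendered into one record

def insert_dicts(cursor, rows=ROWS):
    # Imported lazily so the sketch can be read without psycopg2 installed.
    from psycopg2.extras import execute_values
    # A single INSERT with a multi-row VALUES clause is executed.
    execute_values(cursor, SQL, rows, template=TEMPLATE)
```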
As you can see, only one query will be executed.
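For the SQLAlchemy route mentioned above, a minimal runnable sketch (using an in-memory SQLite engine as a stand-in for a `postgresql://` URL; the table definition is an assumption):

```python
from sqlalchemy import Column, Integer, MetaData, Table, create_engine, text

engine = create_engine("sqlite:///:memory:")  # stand-in for a postgresql:// URL
metadata = MetaData()
t = Table("t", metadata, Column("a", Integer), Column("b", Integer))
metadata.create_all(engine)

rows = [{"a": 1, "b": 2}, {"a": 3, "b": 4}, {"a": 5, "b": 6}]
with engine.begin() as conn:
    # .values() with a list renders a single multi-row VALUES clause.
    conn.execute(t.insert().values(rows))
    count = conn.execute(text("SELECT count(*) FROM t")).scalar()
print(count)  # 3
```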