What is the best way to insert a Python dictionary with many keys into a Postgres database without having to enumerate all the keys? I would like to do something like...
```python
song = dict()
song['title'] = 'song 1'
song['artist'] = 'artist 1'
...
cursor.execute('INSERT INTO song_table (song.keys()) VALUES (song)')
```
The new `sql` module was created for this purpose and added in psycopg2 version 2.7. Two examples are given in the documentation: http://initd.org/psycopg/docs/sql.html

Though string concatenation would produce the same result, it should not be used for this purpose, according to the psycopg2 documentation.
Another approach for building a query for MySQL or PostgreSQL from a dictionary is to use the `%(dic_key)s` placeholder construction: each placeholder is replaced by the value from the dictionary corresponding to `dic_key`, e.g. `{'dic_key': 'dic value'}`. It works well and prevents SQL injection (tested on Python 2.7). The rendered query (OUT):
INSERT INTO report_template (report_id, report_name, report_description, report_range, datapool_id, category_id, rtype, user_id) VALUES (DEFAULT, E'test suka 1', NULL, NULL, 1, 3, NULL, 6) RETURNING "report_id";
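A minimal sketch of this pattern; the table and column names are taken from the rendered query above, and the `execute` call is commented out since it needs a live connection:

```python
# psycopg2 substitutes each %(key)s placeholder with the matching dict
# value, quoting and escaping it safely.
report = {
    'report_name': 'test suka 1',
    'datapool_id': 1,
    'category_id': 3,
    'user_id': 6,
}
query = """
    INSERT INTO report_template (report_name, datapool_id, category_id, user_id)
    VALUES (%(report_name)s, %(datapool_id)s, %(category_id)s, %(user_id)s)
    RETURNING report_id;
"""
# cursor.execute(query, report)
```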
Psycopg adapts a `tuple` to a `record`, and `AsIs` does what would be done by Python's string substitution.

You can also insert multiple rows by passing a list of dictionaries to `cur.executemany`.
The `cur.executemany` statement will automatically iterate through the list of dictionaries and execute the INSERT query for each row.

PS: This example is taken from here
Something along these lines should do it. The key part is the generated string of `%s` elements, and using that in `format`, with the list passed directly to the `execute` call, so that psycopg2 can interpolate each item in the `vals` list (thus preventing possible SQL injection). Another variation, passing the
`dict` to `execute`, would be to use named placeholders built from the keys instead of the `vals`, `vals_str_list` and `vals_str` lines from above:
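A minimal sketch of both variants, assuming the `song` dict and `song_table` from the question; the `execute` calls are commented out since they need a live connection, and the column names are spliced into the statement unescaped, so the dict keys must be trusted:

```python
song = {'title': 'song 1', 'artist': 'artist 1'}

cols = list(song.keys())
cols_str = ", ".join(cols)

# Variant 1: positional %s placeholders, values passed as a list.
vals = [song[c] for c in cols]
vals_str_list = ["%s"] * len(vals)
vals_str = ", ".join(vals_str_list)
query = "INSERT INTO song_table ({cols}) VALUES ({vals_str})".format(
    cols=cols_str, vals_str=vals_str)
# cursor.execute(query, vals)

# Variant 2: named %(key)s placeholders, the dict itself passed to execute.
vals_str2 = ", ".join("%({})s".format(c) for c in cols)
query2 = "INSERT INTO song_table ({cols}) VALUES ({vals_str})".format(
    cols=cols_str, vals_str=vals_str2)
# cursor.execute(query2, song)
```

Either way psycopg2 does the value quoting, so only the key names ever reach the SQL text directly.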