I have a PostgreSQL schema stored in a .sql file. It looks something like:
CREATE TABLE IF NOT EXISTS users (
    id INTEGER PRIMARY KEY,
    facebook_id TEXT NOT NULL,
    name TEXT NOT NULL,
    access_token TEXT,
    created INTEGER NOT NULL
);
How shall I run this schema after connecting to the database?
My existing Python code works for SQLite databases:
# Create database connection
self.connection = sqlite3.connect("example.db")
# Run database schema
with self.connection as cursor:
    cursor.executescript(open("schema.sql", "r").read())
But psycopg2 doesn't have an executescript method on its cursors. So, how can I achieve this?
You can just use execute:

with self.connection, self.connection.cursor() as cursor:
    cursor.execute(open("schema.sql", "r").read())
though you may want to set psycopg2 to autocommit
mode first so you can use the script's own transaction management.
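For example, a minimal self-contained sketch of that approach might look like this (the database name, user, and file path are placeholders, not taken from the question):

import psycopg2

# Placeholder connection parameters; adjust for your environment.
connection = psycopg2.connect(dbname="example", user="postgres")
connection.autocommit = True  # let the script's own BEGIN/COMMIT take effect

with connection.cursor() as cursor:
    with open("schema.sql", "r") as f:
        cursor.execute(f.read())

connection.close()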
It'd be nice if psycopg2 offered a smarter mode where it read the file in a statement at a time and sent it to the DB, but at present there's no such mode as far as I know. It'd need a fairly solid parser to do it correctly when faced with $$ quoting (and its $delimiter$ variant, where the delimiter may be any identifier), standard_conforming_strings, E'' strings, nested function bodies, etc.
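To illustrate why naive splitting isn't enough, here's a small sketch (purely illustrative; the trigger function is not part of the original schema) showing how a dollar-quoted function body defeats a simple split on semicolons:

# Illustrative only: the dollar-quoted body contains semicolons, so splitting
# the script on ";" cuts this single statement into several broken fragments.
script = """
CREATE FUNCTION touch_user() RETURNS trigger AS $$
BEGIN
    NEW.created := extract(epoch FROM now());
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;
"""

naive_statements = [s for s in script.split(";") if s.strip()]
print(len(naive_statements))  # 4 fragments, even though the script is one statement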
Note that this will not work with:

- anything containing psql backslash commands
- COPY ... FROM STDIN
- very long input

... and therefore won't work with dumps from pg_dump.
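If you do need to run such a file, one common workaround (a rough sketch, not part of the answer above; the database and file names are placeholders) is to hand the file to psql itself, which understands backslash commands and COPY ... FROM STDIN:

import subprocess

# Feed the dump to psql, stopping on the first error.
subprocess.run(
    ["psql", "-v", "ON_ERROR_STOP=1", "-d", "mydb", "-f", "dump.sql"],
    check=True,
)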
I can't reply to comments on the selected answer due to lack of reputation, so I'll post an answer to help with the COPY issue.
Depending on the volume of your DB, pg_dump --inserts outputs INSERT statements instead of COPY commands.
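Such a dump can then be replayed with the plain execute() approach from the accepted answer. Here's a rough sketch of how that can fit together (the database names and dump path are just placeholders):

import subprocess
import psycopg2

DUMP_FILE = "dump_inserts.sql"  # placeholder path

# Produce a dump that uses INSERT statements rather than COPY ... FROM STDIN.
subprocess.run(["pg_dump", "--inserts", "-f", DUMP_FILE, "source_db"], check=True)

# Replay the dump on the target database with a single execute() call.
connection = psycopg2.connect(dbname="target_db", user="postgres")
connection.autocommit = True
with connection.cursor() as cursor:
    with open(DUMP_FILE, "r") as f:
        cursor.execute(f.read())
connection.close()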