I need to programmatically insert tens of millions of records into a Postgres database. Presently I am executing thousands of insert statements in a single "query".
Is there a better way to do this, some bulk insert statement I don't know about?
PostgreSQL has a guide on how to best populate a database initially ("Populating a Database" in the manual: https://www.postgresql.org/docs/current/populate.html), and it suggests using the COPY command for bulk loading rows. The guide has some other good tips on how to speed up the process, like removing indexes and foreign keys before loading the data (and adding them back afterwards).
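As a rough sketch of what that looks like (the table `measurements`, its columns, the index name, and the file path are all made up for illustration):

```sql
-- Drop the index first so it isn't maintained row by row during the load
DROP INDEX IF EXISTS measurements_ts_idx;

-- Bulk-load rows from a CSV file that the server process can read
COPY measurements (sensor_id, ts, value)
FROM '/tmp/measurements.csv'
WITH (FORMAT csv, HEADER true);

-- Rebuild the index in a single pass once the data is in place
CREATE INDEX measurements_ts_idx ON measurements (ts);
```

If you can't put the file on the database server, `\copy` in psql or `COPY ... FROM STDIN` (which most client drivers expose) does the same thing from the client side.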
I implemented a very fast PostgreSQL data loader using native libpq methods. Try my package: https://www.nuget.org/packages/NpgsqlBulkCopy/