How to import CSV file data into a PostgreSQL table

Published 2018-12-31 05:42

How can I write a stored procedure that imports data from a CSV file and populates the table?

13 Answers

弹指情弦暗扣
#2 · 2018-12-31 06:07

Create a table whose columns match the columns in the CSV file.

  1. Open pgAdmin, right-click on the target table you want to load, select Import, and set the following options in the File Options section.

  2. Browse to your file in Filename.

  3. Select csv as the Format.

  4. Set Encoding to ISO_8859_5.

Then go to the Misc. Options tab, check Header, and click Import. (The equivalent COPY command is sketched below.)
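
For reference, roughly the same import can be done in SQL with COPY. This is only a sketch, assuming the CSV file is readable on the database server and using a placeholder table name and path:

-- Placeholder table and path; adjust to your own schema and file location.
COPY my_table FROM '/path/to/data.csv'
WITH (FORMAT csv, HEADER true, ENCODING 'ISO_8859_5');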

萌妹纸的霸气范
#3 · 2018-12-31 06:10
COPY table_name FROM 'path/to/data.csv' DELIMITER ',' CSV HEADER;
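
Note that COPY reads the file on the database server and typically needs superuser rights (or the pg_read_server_files role on newer PostgreSQL versions). If your file needs more explicit parsing options, they can be spelled out; a sketch with placeholder values:

COPY table_name FROM '/absolute/path/to/data.csv'
WITH (FORMAT csv, HEADER true, DELIMITER ',', NULL '');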
弹指情弦暗扣
#4 · 2018-12-31 06:11

As Paul mentioned, import works in pgAdmin:

Right-click on the table -> Import

Select the local file, format, and encoding.

Here is a German pgAdmin GUI screenshot:

[Screenshot: pgAdmin import dialog]

You can do a similar thing with DbVisualizer (I have a license; I'm not sure about the free version):

Right-click on a table -> Import Table Data...

[Screenshot: DbVisualizer import dialog]

何处买醉
#5 · 2018-12-31 06:13

If you don't have permission to use COPY (which works on the database server), you can use \copy instead (which works in the database client). Using the same example as Bozhidar Batsov:

Create your table:

CREATE TABLE zip_codes 
(ZIP char(5), LATITUDE double precision, LONGITUDE double precision, 
CITY varchar, STATE char(2), COUNTY varchar, ZIP_CLASS varchar);

Copy data from your CSV file to the table:

\copy zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV

You can also specify the columns to read:

\copy zip_codes(ZIP,CITY,STATE) FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV
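
Since \copy is a psql meta-command, it can also be run non-interactively from the shell; a sketch assuming a placeholder database name:

psql -d dbname -c "\copy zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV"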
不再属于我。
#6 · 2018-12-31 06:14

One quick way of doing this is with the Python pandas library (version 0.15 or above works best). It will handle creating the columns for you, although the choices it makes for data types might not be what you want. If it doesn't quite do what you want, you can always use the generated 'create table' code as a template.

Here's a simple example:

import pandas as pd
df = pd.read_csv('mypath.csv')
df.columns = [c.lower() for c in df.columns]  # postgres doesn't like capitals or spaces

from sqlalchemy import create_engine
engine = create_engine('postgresql://username:password@localhost:5432/dbname')

df.to_sql("my_table_name", engine)

And here's some code that shows you how to set various options:

# Set it so the raw SQL output is logged
import logging
logging.basicConfig()
logging.getLogger('sqlalchemy.engine').setLevel(logging.INFO)

import sqlalchemy  # needed for the dtype mapping below

df.to_sql("my_table_name2",
          engine,
          if_exists="append",  # options are 'fail', 'replace', 'append'; default is 'fail'
          index=False,         # do not write the DataFrame index as a column
          dtype={'col1': sqlalchemy.types.NUMERIC,
                 'col2': sqlalchemy.types.String})  # datatypes should be SQLAlchemy types
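
Afterwards you can check from psql what pandas created; my_table_name is just the example table name used above:

\d my_table_name
SELECT count(*) FROM my_table_name;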
流年柔荑漫光年
#7 · 2018-12-31 06:16

Take a look at this short article.


Solution paraphrased here:

Create your table:

CREATE TABLE zip_codes 
(ZIP char(5), LATITUDE double precision, LONGITUDE double precision, 
CITY varchar, STATE char(2), COUNTY varchar, ZIP_CLASS varchar);

Copy data from your CSV file to the table:

COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' WITH (FORMAT csv);
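
If the CSV file has a header row, you can add the HEADER option so COPY skips it (a small variation on the command above):

COPY zip_codes FROM '/path/to/csv/ZIP_CODES.txt' WITH (FORMAT csv, HEADER true);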