Let’s see how to use PostgreSQL to import and export CSV files painlessly with the COPY command.
Import CSV into table t_words:
COPY t_words FROM '/path/to/file.csv' DELIMITER ',' CSV;
You can set the quote character with QUOTE and change the delimiter with DELIMITER.
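For example, a minimal sketch for loading a semicolon-delimited file whose fields are wrapped in single quotes (the delimiter, quote character, and file path here are illustrative assumptions):
COPY t_words FROM '/path/to/file.csv' DELIMITER ';' CSV QUOTE '''';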
Import CSV into table t_words, specifying which columns to use:
COPY t_words("name", "count", "def") FROM 'file.csv' DELIMITER ',' CSV;
Export a table to a CSV file:
COPY t_words TO 'file.csv' DELIMITER ',' CSV HEADER;
Export custom results to a CSV file:
COPY (SELECT word, def FROM t_words) TO 'file.csv' CSV;

It is necessary to quote your data if you have timestamp fields in your DB. From psql, the \copy meta-command covers both exporting a table to CSV and importing a CSV into a DB table, as sketched below.
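A minimal sketch, assuming psql's \copy meta-command with the same old-style options used above (FORCE QUOTE * forces every non-NULL field to be double-quoted; the file paths are illustrative). To export a table to CSV with headers, quoting all fields with double quotes:
\copy t_words TO '/path/to/file.csv' CSV HEADER FORCE QUOTE *
To import a CSV into a DB table:
\copy t_words FROM '/path/to/file.csv' CSV HEADER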
COPY is very quick: I recently had to load 78 million rows of financial data from a 9 GB CSV file. Using the Python pandas library, it took 2 days to finish. On the same machine, writing to CSV from pandas and then using COPY took only 4 hours: 1 hour for the ingest and 3 hours for creating indexes. Exporting to CSV was limited only by disk performance, which meant 8 minutes in my case. COPY to binary format produces larger files than CSV and offers similar performance.
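A minimal sketch of that two-step load, plus the binary variant mentioned above (the table, column, and file names are hypothetical):
-- ingest step: bulk-load the CSV written out by pandas
COPY t_financial FROM '/path/to/big_dump.csv' DELIMITER ',' CSV HEADER;
-- index step: build indexes after the load, not before
CREATE INDEX idx_t_financial_date ON t_financial (trade_date);
-- binary variant: larger files, similar speed
COPY t_financial TO '/path/to/file.bin' WITH BINARY;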
Alternate solutions using pg_dump are cumbersome, as you cannot restore to a different table: you have to rename the original table, run pg_restore, and then rename the tables again.
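A minimal sketch of that rename dance (table names are illustrative; pg_restore itself runs from the shell, not from SQL):
ALTER TABLE t_words RENAME TO t_words_orig;
-- run pg_restore here; it recreates t_words from the dump
ALTER TABLE t_words RENAME TO t_words_restored;
ALTER TABLE t_words_orig RENAME TO t_words;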