Consider that you have a large table in a database and you want to export it to a CSV file. What's your first idea? I started with PgAdmin - wrote a simple SELECT and ran "Execute to file". But it failed with "out of memory for query result". It turns out that PgAdmin tries to load the whole result set into client-side memory and fails when it runs out. Then I tried psql, with the same result. Then I found a really useful variable named FETCH_COUNT in http://www.postgresql.org/docs/9.1/static/app-psql.html, and here's how I used it:
statistics=# \a
statistics=# \f ,
statistics=# \set FETCH_COUNT 500000
statistics=# select * from MY_HUGE_TABLE
statistics-# \g /home/arvids/statistics.csv
- Disable aligned mode, because we don't want extra whitespace in the CSV file
- Then set the field separator to a comma (it's a CSV file after all)
- Then set FETCH_COUNT to 500000 so that the client (my computer) only has to hold 500000 rows of the result set in memory at a time
- Now execute the query
- ... and route its output to a file (the same sequence can also be scripted non-interactively, as sketched below)
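If you'd rather not type this at the psql prompt every time, here's a minimal sketch of the same export run as a script by piping the commands into psql via a heredoc. It assumes the same database, table and output path as above, and uses \o (output redirection) instead of \g to send the query result to the file:

# non-interactive sketch of the same export (database, table and path taken from the post above)
psql statistics <<'EOF'
\a
\f ,
\set FETCH_COUNT 500000
\o /home/arvids/statistics.csv
SELECT * FROM MY_HUGE_TABLE;
\o
EOF

Because the commands go through the same psql machinery as interactive input, FETCH_COUNT still applies and the result is fetched in chunks rather than all at once.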
for anyone reading this these days...
if you want results as a CSV file, you probably want to look at just doing
\copy (SELECT * FROM MY_HUGE_TABLE) TO '/path/to/output.csv' WITH CSV HEADER
that way you get proper CSV,
and you probably won't need to deal with FETCH_COUNT either...
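If a single shell command is more convenient, the same idea can also be expressed as a server-side COPY ... TO STDOUT streamed into a file; this is just a sketch reusing the database and table names from the post and the placeholder path from the comment:

# sketch: COPY streams rows to the client via STDOUT, so nothing large is held in memory,
# and the output is properly quoted CSV with a header row
psql -d statistics -c "COPY (SELECT * FROM MY_HUGE_TABLE) TO STDOUT WITH CSV HEADER" > /path/to/output.csv

Like \copy, this writes the file on the client side, so it doesn't need superuser access on the server.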