I need to extract SQL dump files for multiple tables of a PostgreSQL database. This is what I’ve come up with so far:
pg_dump -t 'thr_*' -s dbName -U userName > /home/anik/psqlTest/db_dump.sql
However, as you can see, all the tables that start with the prefix thr_ are being exported to a single unified file (db_dump.sql). I have almost 90 tables in total to extract SQL from, so the data must be stored in separate files.
How can I do it? Thanks in advance.
Answer
If you are happy to hard-code the list of tables, but just want each to be in a different file, you could use a shell script loop to run the pg_dump command multiple times, substituting in the table name each time round the loop:
for table in table1 table2 table3 etc; do pg_dump -t "$table" -U userName dbName > "/home/anik/psqlTest/db_dump_dir/$table.sql"; done
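If you prefer a standalone script to a one-liner, the same loop can be written out in full. This is only a sketch using the placeholder names from the question (dbName, userName, the /home/anik/psqlTest path), and it adds mkdir -p so the output directory exists before the first dump:
#!/bin/bash
# One pg_dump file per table; dbName, userName and OUT_DIR are placeholders.
OUT_DIR=/home/anik/psqlTest/db_dump_dir
mkdir -p "$OUT_DIR"   # create the output directory if it is missing
for table in table1 table2 table3; do
    # add -s here if you only want the schema, as in the original command
    pg_dump -t "$table" -U userName dbName > "$OUT_DIR/$table.sql"
done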
EDIT: This approach can be extended to get the list of tables dynamically by running a query through psql and feeding the results into the loop instead of a hard-coded list:
for table in $(psql -U userName -d dbName -t -c "Select table_name From information_schema.tables Where table_type='BASE TABLE' and table_name like 'thr_%'"); do pg_dump -t "$table" -U userName dbName > "/home/anik/psqlTest/db_dump_dir/$table.sql"; done
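If any of the table names could ever contain characters the shell would split on, a slightly more defensive variation is to pipe the query output into a while read loop instead of relying on word splitting. This is a sketch with the same placeholders, adding -A (unaligned output) alongside -t:
psql -U userName -d dbName -At -c "Select table_name From information_schema.tables Where table_type='BASE TABLE' and table_name like 'thr_%'" |
while read -r table; do
    # skip any blank lines defensively, then dump one file per table
    [ -n "$table" ] && pg_dump -t "$table" -U userName dbName > "/home/anik/psqlTest/db_dump_dir/$table.sql"
done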
Here psql -t -c "SQL" runs SQL and outputs the results with no header or footer; since there is only one column selected, there will be a table name on each line of the output captured by $(command), and your shell will loop through them one at a time.
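If you want to sanity-check the query before kicking off almost 90 dumps, you can run the psql part on its own; it should print one thr_-prefixed table name per line, and the loop above will then produce one .sql file per name (placeholders as before):
psql -U userName -d dbName -t -c "Select table_name From information_schema.tables Where table_type='BASE TABLE' and table_name like 'thr_%'"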