
How to open and work with a very large .SQL file that was generated in a dump?

I have a very large .SQL file of 90 GB.

It was generated with a dump on a server:

mysqldump -u root -p siafi > /home/user_1/siafi.sql

I downloaded this .SQL file to a computer with Ubuntu 16.04 and MySQL Community Server (8.0.16). It has 8 GB of RAM.

So I did these steps in Terminal:

# Access
/usr/bin/mysql -u root -p

# I create a database with the same name to receive the .SQL information
CREATE DATABASE siafi;

# I establish the privileges. User reinaldo
GRANT ALL PRIVILEGES ON siafi.* to reinaldo@localhost;

# Enable the changes
FLUSH PRIVILEGES;

# Then I open another terminal and type command for the created database to receive the data from the .SQL file
mysql --user=reinaldo --password="type_here" --database=siafi < /home/reinaldo/Documentos/Code/test/siafi.sql

I ran these same commands with other, smaller .SQL files of up to 2 GB, and it worked normally.

But this 90 GB file has been processing for over twelve hours without stopping. I do not know if it's working.

Please, is there a more efficient way to do this? Maybe splitting the .SQL file?


Answer

Two things that might be helpful:

  1. Use pv to see how much of the .sql file has already been read. This gives you a progress bar, which at least tells you it's not stuck.
  2. Log into MySQL and use SHOW PROCESSLIST to see what MySQL is currently executing. If it's still running, just let it run to completion.
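As a sketch of both steps (assuming pv is installed, e.g. via `apt install pv`, and using the asker's file path and credentials):

```shell
# Feed the dump through pv instead of shell redirection;
# pv prints bytes read, throughput, and an ETA based on the file size.
pv /home/reinaldo/Documentos/Code/test/siafi.sql \
  | mysql --user=reinaldo --password --database=siafi

# Meanwhile, in another terminal, check what the server is doing;
# a long-running INSERT in the list means the import is still progressing.
mysql -u root -p -e "SHOW PROCESSLIST;"
```

Because pv knows the file's total size, the ETA is usually a reasonable estimate for a plain single-file import like this.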

If it is turned on, it can really help to disable the binlog for the duration of the restore. Another thing that may or may not be helpful: if you have the choice, use the fastest disks available. You may have that kind of option if you're running on a hosting provider like Amazon. You're going to really feel the pain if you're doing this on, for example, a standard EC2 host.

User contributions licensed under: CC BY-SA