I am a student conducting some research which involves a sort of data mining. I have several volunteer “node” servers which gather and produce SQL files for me to import on my server and analyze.
The problem is that these are very large files, and I need a way to import them quickly. The network recently expanded, and now there just isn't enough disk throughput for the MySQL console to import them as fast as they come in. And time is important: there is a deadline for the research to be in, and I want to spend as much of the time before it actively gathering data, not working through a queue of files waiting to be inserted.
I am wondering if there is a better way to import very large files – each one weighs in at about 100 MB. I've tried \. myfile.sql in the MySQL console, but that is incredibly slow. phpMyAdmin won't accept files that big.
Any ideas? Thanks!
Answer
Have you tried mysql -uYOU -p < myfile.sql ?
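If the dump is mostly INSERT traffic, a common way to speed this up is to disable some per-statement checks for the duration of the import. This is a minimal sketch, not a guaranteed fix; it assumes your dump is named myfile.sql, targets the database YOUR_DB, and that you are willing to skip unique and foreign key checks while loading:

( echo "SET autocommit=0; SET unique_checks=0; SET foreign_key_checks=0;"
  cat myfile.sql
  echo "COMMIT;" ) | mysql -uYOU -p YOUR_DB

Batching everything into one transaction and skipping the per-row checks avoids a flush to disk after every statement, which is usually where most of the time goes.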
UPD:
or even mysql -uYOU -p < myfile.sql &
if you have a short-lived remote console session.
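Note that a backgrounded job cannot answer the interactive password prompt, and a plain & still dies when the session closes. A sketch that survives logout, assuming the same file and database names as above (putting the password on the command line is visible to other users; a ~/.my.cnf file is safer):

nohup mysql -uYOU -pYOURPASSWORD YOUR_DB < myfile.sql > import.log 2>&1 &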
UPD2:
But the most efficient way is to use mysqlimport,
as PinnyM advised.
Assume name_same_as_table.txt
is a text file with DOS-style line endings and tab-separated fields. The number and types of the fields must match the destination table, and mysqlimport derives the table name from the file name (minus the extension).
mysqlimport -uYOU -p --lock-tables --lines-terminated-by="\r\n" --fields-terminated-by="\t" YOUR_DB name_same_as_table.txt
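mysqlimport is essentially a command-line interface to LOAD DATA INFILE, so the statement it issues for the command above is roughly the following (run from the mysql console, with the same assumed file name):

LOAD DATA INFILE 'name_same_as_table.txt'
  INTO TABLE name_same_as_table
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\r\n';

Either way, loading raw tab-separated rows skips all the SQL parsing that a 100 MB file of INSERT statements requires, which is why it tends to be the fastest option.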