How to run OPTIMIZE TABLE with the least downtime

I have a MySQL 5.5 database of roughly 40 GB on a 64 GB RAM machine in a production environment. All tables are InnoDB. There is also a slave running as a backup.

One table, the most important one, grew to 150M rows, and inserting and deleting became slow. To speed these operations up I deleted half of the rows. This did not help as expected; inserting and deleting are still slow.

I’ve read that running OPTIMIZE TABLE can help in such a scenario. As I understand it, this operation requires a read lock on the entire table, and optimizing a table this large could take quite a while.
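For reference, the straightforward but blocking approach is a single statement (the table name here is a placeholder):

    OPTIMIZE TABLE important_table;

On InnoDB, MySQL reports "Table does not support optimize, doing recreate + analyze instead" and rebuilds the table, which is why it blocks writes for the duration on 5.5.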

What would be a good strategy to optimize this table while minimizing downtime?

EDIT: The specific table to be optimized has roughly 91M rows and looks like this:

Answer

Percona Toolkit’s pt-online-schema-change does this for you: it copies the rows into a new table in chunks and keeps the copy in sync with triggers, so the original table stays readable and writable while it is rebuilt. In this case it worked very well; see the sketch below.
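A minimal sketch of the invocation, assuming the database is called mydb and the table mytable (both placeholder names): for InnoDB, a no-op ALTER of ENGINE=InnoDB rebuilds the table just as OPTIMIZE TABLE would, and pt-online-schema-change performs that rebuild online.

    # Rehearse first: --dry-run creates and alters the new table
    # but does not create triggers, copy data, or swap tables.
    pt-online-schema-change --alter "ENGINE=InnoDB" \
        --max-lag 1 \
        D=mydb,t=mytable --dry-run

    # After reviewing the dry run, repeat with --execute.
    pt-online-schema-change --alter "ENGINE=InnoDB" \
        --max-lag 1 \
        D=mydb,t=mytable --execute

--max-lag pauses the row copy whenever replica lag exceeds the given number of seconds, which matters here since a slave is running as a backup.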
