MySQL bulk insert on multiple tables

I have a MySQL database with 2 tables, products and product_variants. A product has many product variants. Here is a sample:
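The sample is not preserved in this copy of the question; a minimal sketch of the two tables (column names assumed, not from the original post) might look like:

```sql
-- Hypothetical schema: a product has many variants via product_variants.product_id
CREATE TABLE products (
  id   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  name VARCHAR(255) NOT NULL
) ENGINE=InnoDB;

CREATE TABLE product_variants (
  id         INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  product_id INT UNSIGNED NOT NULL,
  color      VARCHAR(50),
  FOREIGN KEY (product_id) REFERENCES products (id)
) ENGINE=InnoDB;
```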

Now I need to bulk insert a lot of products with their variants as efficiently and quickly as possible. I have a JSON document with many products (100k+) like this:
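The JSON itself is not preserved here; its shape was presumably something like this (field names assumed):

```json
[
  {
    "name": "Product 1",
    "variants": [
      { "color": "red" },
      { "color": "blue" }
    ]
  },
  {
    "name": "Product 2",
    "variants": [
      { "color": "black" }
    ]
  }
]
```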

from which I should generate a query to insert the products.

My idea is to use an insert query like this:
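The original query is not preserved; a multi-row INSERT of this shape (column names assumed) is presumably what was meant — one statement inserting all products at once:

```sql
INSERT INTO products (name) VALUES
  ('Product 1'),
  ('Product 2'),
  ('Product 3');
```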

But then I don’t know which product_id (foreign key) to use in the insert query for product_variants:
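The problem, in a sketch (column names assumed): when this statement is built, the auto-generated product ids are not yet known, so the foreign-key column cannot be filled in:

```sql
INSERT INTO product_variants (product_id, color) VALUES
  (?, 'red'),    -- which product_id?
  (?, 'blue');   -- which product_id?
```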

(these queries are executed inside a transaction)

I’ve thought about specifying the product ids manually, incrementing from the last id, but I get errors when concurrent connections insert products at the same time, or when 2 or more bulk insert processes run concurrently.

What strategy can I use to achieve my goal? Is there a standard way to do this?

ps: if possible I would not want to change the structure of the 2 tables.

Answer

Finally, I used a strategy based on the MySQL function LAST_INSERT_ID(), as @sticky-bit said, but with a bulk insert (one INSERT for many products), which is much faster.

I’m attaching a simple Ruby script that performs the bulk insertions. Everything seems to work well, even with concurrent insertions.
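The original script is not reproduced in this copy. The sketch below shows the core of the strategy with assumed column names (`name`, `color`) and the database calls elided: with the mysql2 gem one would execute the products statement, read `client.last_id` (which LAST_INSERT_ID() also returns — the id of the *first* row of the batch), and build the variants statement from it:

```ruby
# Build one multi-row INSERT for all products.
def products_insert_sql(products)
  values = products.map { |p| "(#{quote(p[:name])})" }.join(", ")
  "INSERT INTO products (name) VALUES #{values};"
end

# After the products insert, LAST_INSERT_ID() / client.last_id gives the id
# of the FIRST inserted row. When the ids of the batch are consecutive
# (see innodb_autoinc_lock_mode below), product i received id first_id + i,
# so every variant row can be pointed at its parent without extra queries.
def variants_insert_sql(products, first_id)
  rows = []
  products.each_with_index do |p, i|
    p[:variants].each { |v| rows << "(#{first_id + i}, #{quote(v[:color])})" }
  end
  "INSERT INTO product_variants (product_id, color) VALUES #{rows.join(', ')};"
end

# Minimal SQL string quoting ('' escapes a quote); a real script should use
# the driver's escaping instead.
def quote(s)
  "'#{s.to_s.gsub("'", "''")}'"
end

# With mysql2 (sketch):
#   client.query(products_insert_sql(products))
#   first_id = client.last_id
#   client.query(variants_insert_sql(products, first_id))
```

Both statements would run inside one transaction, as in the question.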

I ran the script with innodb_autoinc_lock_mode = 2 and everything seems fine, but I don’t know whether it is necessary to set the variable to 1:
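For reference, innodb_autoinc_lock_mode is a startup option, e.g. in my.cnf (this config sketch is not from the original post). The MySQL manual warns that in mode 2 ("interleaved") the auto-increment values generated by a single statement may not be consecutive, while mode 1 ("consecutive", the default through MySQL 5.7) guarantees consecutive values for a multi-row INSERT whose row count is known in advance — so mode 1 appears to be the safe choice for this strategy:

```
[mysqld]
innodb_autoinc_lock_mode = 1
```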

After running the script, I checked that all products have associated variants with the query
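The query itself is not shown in this copy; an anti-join of this shape (table and column names assumed) would find products that have no variants:

```sql
SELECT p.id
FROM products p
LEFT JOIN product_variants v ON v.product_id = p.id
WHERE v.id IS NULL;
```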

and, as expected, I got no rows.

User contributions licensed under: CC BY-SA