
Quickest way to import a large (50 GB) CSV file into an Azure database

I’ve just consolidated 100 CSV files into a single monster file with a total size of about 50 GB.

I now need to load this into my Azure database. Given that I have already created the table in the database, what would be the quickest method to get this single file into it?

The methods I’ve read about include the Import Flat File wizard, Blob Storage with Data Factory, and BCP.

Can anyone recommend the quickest method?


Answer

Azure Data Factory should be a good fit for this scenario, as it is built to move and transform data without you having to worry about scale.

Assuming the large CSV file is stored on a local disk and you do not want to move it to external storage first (to save time and cost), the simplest approach is to create a self-hosted integration runtime pointing to the machine that hosts the CSV file, and a linked service in ADF that reads the file through it. Once that is done, ingest the file with a copy activity whose sink is your Azure SQL database, as sketched below the link.

https://docs.microsoft.com/en-us/azure/data-factory/connector-file-system
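
For reference, here is a minimal sketch of that setup using the azure-mgmt-datafactory Python SDK. It assumes the data factory and the self-hosted integration runtime (called "MySelfHostedIR" here) already exist and are registered on the machine holding the CSV; every resource name, path, table name and credential below is a placeholder, so treat it as an outline of the steps rather than a drop-in script.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    LinkedServiceResource, FileServerLinkedService, AzureSqlDatabaseLinkedService,
    SecureString, IntegrationRuntimeReference, LinkedServiceReference,
    DatasetResource, DatasetReference, DelimitedTextDataset, FileServerLocation,
    AzureSqlTableDataset, CopyActivity, DelimitedTextSource, AzureSqlSink,
    PipelineResource,
)

SUB, RG, DF = "<subscription-id>", "<resource-group>", "<data-factory-name>"
adf = DataFactoryManagementClient(DefaultAzureCredential(), SUB)

# Linked service to the local file system, reached via the self-hosted IR
# that is installed on the machine holding the CSV (placeholder names).
file_ls = LinkedServiceResource(properties=FileServerLinkedService(
    host="C:\\data",
    user_id="<windows-user>",
    password=SecureString(value="<windows-password>"),
    connect_via=IntegrationRuntimeReference(
        type="IntegrationRuntimeReference", reference_name="MySelfHostedIR"),
))
adf.linked_services.create_or_update(RG, DF, "LocalFiles", file_ls)

# Linked service to the Azure SQL database that will act as the sink.
sql_ls = LinkedServiceResource(properties=AzureSqlDatabaseLinkedService(
    connection_string="Server=tcp:<server>.database.windows.net;Database=<db>;"
                      "User ID=<user>;Password=<password>;"
))
adf.linked_services.create_or_update(RG, DF, "AzureSqlDb", sql_ls)

# Source dataset: the consolidated 50 GB CSV on disk.
src_ds = DatasetResource(properties=DelimitedTextDataset(
    linked_service_name=LinkedServiceReference(
        type="LinkedServiceReference", reference_name="LocalFiles"),
    location=FileServerLocation(file_name="monster.csv"),
    column_delimiter=",",
    first_row_as_header=True,
))
adf.datasets.create_or_update(RG, DF, "CsvSource", src_ds)

# Sink dataset: the table you already created in the database.
sink_ds = DatasetResource(properties=AzureSqlTableDataset(
    linked_service_name=LinkedServiceReference(
        type="LinkedServiceReference", reference_name="AzureSqlDb"),
    table_name="dbo.MyTable",
))
adf.datasets.create_or_update(RG, DF, "SqlSink", sink_ds)

# Copy activity wired into a pipeline, then triggered once.
copy = CopyActivity(
    name="CopyCsvToSql",
    inputs=[DatasetReference(type="DatasetReference", reference_name="CsvSource")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="SqlSink")],
    source=DelimitedTextSource(),
    sink=AzureSqlSink(),
)
adf.pipelines.create_or_update(RG, DF, "LoadBigCsv", PipelineResource(activities=[copy]))
run = adf.pipelines.create_run(RG, DF, "LoadBigCsv", parameters={})
print("Pipeline run id:", run.run_id)
```

Note that the self-hosted integration runtime software still has to be installed and registered on the source machine through the ADF portal before this will run, and the sink's write batch size can be tuned once the basic pipeline works.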
