
How to define a self-defined log in SQL Server

I am currently dealing with a huge table that contains XML in each row.

My job is to write a stored procedure that parses the XML and inserts its data into the corresponding tables, row by row.

But there is a problem: when the stored procedure is interrupted, part of the data has already been inserted into the new tables while the rest has not.

So, I need to delete the dirty data to recover.

Normally the solution would be to use a transaction, right?

But it really slows down the whole process to an unacceptable level.

My thought is that, when handling a new XML row, its Id is written to a log.

If the stored procedure breaks down and restarts, it checks the log, finds the last Id, and checks whether that Id was completed (there is a flag column).

If not, it deletes all the data inserted for that row and begins handling the row again.
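For illustration, the kind of log I have in mind might look roughly like this (all names here are just placeholders):

CREATE TABLE dbo.ProcessingLog
(
    XmlRowId    INT      NOT NULL PRIMARY KEY,     -- Id of the XML row being handled
    IsCompleted BIT      NOT NULL DEFAULT (0),     -- set to 1 once the row is fully inserted
    LoggedAt    DATETIME NOT NULL DEFAULT (GETDATE())
);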

So my question is: is there any way in SQL Server to implement such a log, or is there another solution that can help me achieve my goal?


Answer

You can add a new (temporary) column to your table to track whether a row has been migrated. For example:
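Something along these lines, where dbo.SourceTable stands in for your own table name:

-- Flag column, defaulting to 0 (not migrated yet)
ALTER TABLE dbo.SourceTable
ADD IsMigrated BIT NOT NULL
    CONSTRAINT DF_SourceTable_IsMigrated DEFAULT (0);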

Then in your transaction block:

you can select only 50, 100, or 500 rows from your original table, migrate the data, and finally update their IsMigrated flag.
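A minimal sketch of one such batch; dbo.SourceTable, dbo.TargetTable, the XmlData column, and the XPath expression are all placeholders for your own schema:

BEGIN TRANSACTION;

    -- Take a small batch of rows that have not been migrated yet
    SELECT TOP (500) Id, XmlData
    INTO #Batch
    FROM dbo.SourceTable
    WHERE IsMigrated = 0;

    -- Parse the XML and insert it into the corresponding table(s)
    INSERT INTO dbo.TargetTable (Id, SomeValue)
    SELECT b.Id,
           b.XmlData.value('(/Root/SomeValue)[1]', 'nvarchar(100)')
    FROM #Batch AS b;

    -- Mark the batch as migrated
    UPDATE s
    SET s.IsMigrated = 1
    FROM dbo.SourceTable AS s
    INNER JOIN #Batch AS b ON b.Id = s.Id;

    DROP TABLE #Batch;

COMMIT TRANSACTION;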

You can SELECT the rows like the following:
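That is, only the rows that have not been migrated yet (again using the assumed dbo.SourceTable and IsMigrated names):

SELECT TOP (500) Id, XmlData
FROM dbo.SourceTable
WHERE IsMigrated = 0;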

Of course, the whole thing can be put in a loop, and you can use WAITFOR DELAY to ensure other processes are able to do some work.
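A rough skeleton of that loop, assuming the batch logic above is wrapped in a procedure called dbo.usp_MigrateBatch (an invented name):

WHILE EXISTS (SELECT 1 FROM dbo.SourceTable WHERE IsMigrated = 0)
BEGIN
    EXEC dbo.usp_MigrateBatch;   -- one batch = the transaction block above
    WAITFOR DELAY '00:00:05';    -- short pause so other sessions can do some work
END;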

Also, if a row in the original table is changed, you can add a trigger to reset its IsMigrated flag.
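A sketch of such a trigger, again with assumed table and column names (and assuming the source table is keyed by Id):

CREATE TRIGGER trg_SourceTable_ResetIsMigrated
ON dbo.SourceTable
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- When the migration job itself sets the flag, do nothing
    IF UPDATE(IsMigrated)
        RETURN;

    -- Otherwise the row's data changed, so it has to be migrated again
    UPDATE s
    SET s.IsMigrated = 0
    FROM dbo.SourceTable AS s
    INNER JOIN inserted AS i ON i.Id = s.Id;
END;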
