
Measuring the final timestamp of multiple work items vs one work item in SQL

I have recently taken over SQL query responsibilities from a colleague who has left.

My question is as follows:
I have 4 work items (A, B, C, D). I have a base population of several thousand cases in which work item D was always the last event. However, in this population, the 2nd to last work item could be A, B, or C. I need to measure, on average, the time between the end of work item A, B, or C and the end of work item item D. What would this code look like?

Supplementary information: The pertinent tables I would join:

  1. Database.workitems (identifies work items A, B, C, D)

  2. Database.workitemstatus (shows the possible statuses – open, in progress, complete – of work items)

  3. Database.workitemsstatustime (gives the timestamp for work item status inserts)

So I think my FROM clause would join these three tables.

So I think I have what I need to start. The base population establishes that every case number ended with work item D. Now I must measure the average time between work item status = complete for any other work item (A, B, C) and work item status = complete for work item D.


Answer

There's plenty of guessing going on here: workitemsstatustime seems redundant, and you'll need to provide the columns of the tables for us to be sure. Untested, but maybe this will help you figure something out.
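A minimal sketch of one way to do it. Since the post never shows any columns, every column name here (case_id, workitem_id, item_code, status, status_time) is an assumption standing in for the real schema. It's demonstrated with SQLite through Python so the sample data is self-contained; on your real database you would run just the query, and the date arithmetic (julianday here) would change to your engine's equivalent, e.g. DATEDIFF.

```python
import sqlite3

# Hypothetical schema -- all column names are guesses at the real tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE workitems           (workitem_id INTEGER PRIMARY KEY,
                                  case_id     INTEGER,
                                  item_code   TEXT);   -- 'A','B','C','D'
CREATE TABLE workitemstatus      (status_id   INTEGER PRIMARY KEY,
                                  workitem_id INTEGER,
                                  status      TEXT);   -- open / in progress / complete
CREATE TABLE workitemsstatustime (status_id   INTEGER,
                                  status_time TEXT);   -- timestamp of the status insert

-- Case 1: A completes, then D completes 12 hours later.
INSERT INTO workitems VALUES (1, 1, 'A'), (2, 1, 'D');
INSERT INTO workitemstatus VALUES (1, 1, 'complete'), (2, 2, 'complete');
INSERT INTO workitemsstatustime VALUES (1, '2023-01-01 00:00:00'),
                                       (2, '2023-01-01 12:00:00');

-- Case 2: B completes, then D completes 24 hours later.
INSERT INTO workitems VALUES (3, 2, 'B'), (4, 2, 'D');
INSERT INTO workitemstatus VALUES (3, 3, 'complete'), (4, 4, 'complete');
INSERT INTO workitemsstatustime VALUES (3, '2023-01-02 00:00:00'),
                                       (4, '2023-01-03 00:00:00');
""")

QUERY = """
WITH completions AS (
    -- One row per work item that reached 'complete', with its timestamp.
    SELECT wi.case_id, wi.item_code, wst.status_time
    FROM workitems wi
    JOIN workitemstatus      ws  ON ws.workitem_id = wi.workitem_id
    JOIN workitemsstatustime wst ON wst.status_id  = ws.status_id
    WHERE ws.status = 'complete'
)
-- Average gap, in hours, between the last A/B/C completion and D's completion.
SELECT AVG((julianday(d.status_time) - julianday(p.status_time)) * 24.0)
FROM completions d
JOIN (SELECT case_id, MAX(status_time) AS status_time
      FROM completions
      WHERE item_code IN ('A', 'B', 'C')
      GROUP BY case_id) p
  ON p.case_id = d.case_id
WHERE d.item_code = 'D'
"""

avg_hours = conn.execute(QUERY).fetchone()[0]
print(avg_hours)  # average of 12h and 24h for this sample data
```

The inner subquery takes, per case, the latest completion among A, B, and C (the 2nd-to-last event), and the outer join pairs it with that case's D completion before averaging the differences.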

User contributions licensed under: CC BY-SA