Assuming I have this schema:
create table rental (
  id integer,
  rental_date timestamp,
  customer_id smallint,
  return_date timestamp
);
Running this query returns strange results:
select customer_id, avg(return_date - rental_date) as "avg"
from rental
group by customer_id
order by "avg" desc;
It displays:
customer_id | avg_rent_duration
------------+------------------------
        315 | 6 days 14:13:22.5
        187 | 5 days 34:58:38.571428
        321 | 5 days 32:56:32.727273
        539 | 5 days 31:39:57.272727
        436 | 5 days 31:09:46
        532 | 5 days 30:59:34.838709
        427 | 5 days 29:27:05
        555 | 5 days 26:48:35.294118
...
(599 rows)
Why are there values like 5 days 34:58:38, 5 days 32:56:32, and so on? I thought there were only 24 hours in a day, maybe I'm wrong.
EDIT
Demo here: http://sqlfiddle.com/#!17/caa7a/1/0
Sample data:
insert into rental (rental_date, customer_id, return_date) values
  ('2007-01-02 13:10:06', 1, '2007-01-03 01:01:01'),
  ('2007-01-02 01:01:01', 1, '2007-01-09 15:10:06'),
  ('2007-01-10 22:10:06', 1, '2007-01-11 01:01:01'),
  ('2007-01-30 01:01:01', 1, '2007-02-03 22:10:06');
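To show what gets averaged, here is a quick look at the per-row differences for that sample data (a small sketch; the exact output formatting may differ):

select customer_id,
       return_date - rental_date as duration  -- timestamp subtraction yields an interval
from rental;

-- roughly:
--  customer_id | duration
-- -------------+------------------
--            1 | 11:50:55
--            1 | 7 days 14:09:05
--            1 | 02:50:55
--            1 | 4 days 21:09:05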
Answer
You have to adjust the interval with the justify_interval() function:
select customer_id, justify_interval(avg(return_date - rental_date)) as "avg"
from rental
group by customer_id
order by "avg" desc;
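As a quick illustration of what justify_interval() does, applied to one of the literals from the output above:

select justify_interval(interval '5 days 34:58:38');
-- every full 24 hours is rolled into a day:
-- 6 days 10:58:38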
See the official documentation for justify_interval():
"Adjust interval using justify_days and justify_hours, with additional sign adjustments"
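The two underlying functions can also be called on their own; a short sketch of their behavior:

select justify_hours(interval '5 days 27:00:00');  -- 27 hours -> 1 day 3 hours: 6 days 03:00:00
select justify_days(interval '35 days');           -- 30 days -> 1 month: 1 mon 5 days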
Still, it does not explain why the result of the operation looks so odd without justify_interval() (in other words, why we have to apply this function).
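One plausible illustration (my own assumption, not something the quoted documentation states): interval arithmetic keeps the days and time fields separate, so summing the per-row intervals inside avg() can leave an hours component of 24 or more, and nothing re-normalizes it afterwards:

select interval '1 day 20:00:00' + interval '10:00:00';
-- the hours simply add up, giving 1 day 30:00:00 rather than 2 days 06:00:00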
Note: thanks to @a_horse_with_no_name for their comment