I discovered a weird issue in my database. I was able to fix it, but I do not understand WHY the error occurred in the first place. I’m using Microsoft SQL Server 2017.
The following code returns an Arithmetic Overflow error:
SELECT '1000' / 100.0 FROM table_name
SELECT '1000.0' / 100.0 FROM table_name
Returns Error:
Arithmetic overflow error converting varchar to data type numeric.
But what is weird is that the following code DOES NOT cause an error:
SELECT '100' / 100.0 FROM table_name
Returns: 1.000000 for each row.
SELECT '999' / 100.0 FROM table_name
Returns: 9.990000 for each row.
SELECT '100.0' / 100.0 FROM table_name
Returns: 1.000000 for each row.
SELECT '1000' / 100 FROM table_name
Returns: 10 for each row.
I have since fixed the code so that it uses a convert before attempting the arithmetic, but what bothers me is WHY the code worked without converting for numbers less than 1000. This is really bugging me!
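For reference, the fix was along these lines, converting the string to a decimal wide enough to hold the value before dividing (the decimal(10, 1) here is just an example; pick a precision that fits your data):

SELECT CONVERT(decimal(10, 1), '1000') / 100.0 FROM table_name

Returns: 10.000000 for each row.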
Answer
You have two values here: '1000', which is a varchar(4), and 100.0, which is a decimal(4,1).
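You can check the type SQL Server assigns to the literal 100.0 with SQL_VARIANT_PROPERTY (numeric is simply a synonym for decimal):

SELECT SQL_VARIANT_PROPERTY(100.0, 'BaseType'),
       SQL_VARIANT_PROPERTY(100.0, 'Precision'),
       SQL_VARIANT_PROPERTY(100.0, 'Scale')

Returns: numeric, 4, 1.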
As a result, when you evaluate the expression '1000' / 100.0, the varchar is implicitly cast to a decimal, because decimal has a higher data type precedence than varchar. However, the largest value a decimal(4,1) can store is 999.9, so the value 1000 overflows and you get an error. (Your '1000' / 100 example succeeds because 100 is an int; the varchar is implicitly cast to int instead, and 1000 fits in an int with room to spare, giving the integer result 10.)
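You can see the same behaviour by making the implicit conversion explicit: a value that fits in decimal(4,1) converts fine, while 1000 does not:

SELECT CAST('999' AS decimal(4,1))

Returns: 999.0.

SELECT CAST('1000' AS decimal(4,1))

Returns: Arithmetic overflow error converting varchar to data type numeric.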