I’m trying to create a constraint for an id (5 digits in length) where the 5th digit is the last digit of the value (1 × d1 + 3 × d2 + 1 × d3 + 3 × d4). For instance, if the first four digits of the id are 1234, then the fifth digit is the last digit of (1×1 + 3×2 + 1×3 + 3×4) = 22, resulting in the check digit d5 = 2. Hence the id = 12342.
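Expressed as a standalone query, this is the calculation I want the constraint to perform (digits hard-coded from the example above):

SELECT (1*1 + 3*2 + 1*3 + 3*4) % 10 AS check_digit;   -- 22 % 10 = 2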
This is my code:
CREATE TABLE GameLicense_16 (
    title TEXT,
    release_year INTEGER,
    platform TEXT,
    license_id INTEGER PRIMARY KEY
        CHECK (LENGTH(license_id) == 5)
        CHECK ((substr(license_id, 5, 1)) =
               substr(CAST((substr(license_id, 1, 1) BETWEEN 0 AND 9) AS int)
                      + 3 * CAST((substr(license_id, 2, 1) BETWEEN 0 AND 9) AS int)
                      + 1 * CAST((substr(license_id, 3, 1) BETWEEN 0 AND 9) AS int)
                      + 3 * CAST((substr(license_id, 4, 1) BETWEEN 0 AND 9) AS int), 2, 1))
);
But when I test it with the license id 12342, it gives me a “constraint failed” error, which means the calculation is wrong somewhere, but I cannot figure out where.
Answer
Since you defined license_id as INTEGER PRIMARY KEY, there is no need to check that the inserted value is an integer by testing each of its characters with ... BETWEEN 0 AND 9. In fact, an INTEGER PRIMARY KEY column is the only case where SQLite enforces type checking, so you can be sure that non-integer values will be rejected.
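For example (reusing the table from the question, with placeholder values for the other columns), an attempt to insert a non-numeric id should be rejected by SQLite itself with a datatype mismatch error, before any CHECK constraint is involved:

INSERT INTO GameLicense_16 (title, release_year, platform, license_id)
VALUES ('Some game', 2020, 'PC', 'abcde');   -- fails: datatype mismatch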
Also, in your code you are mixing boolean expressions with SUBSTR(), which does not make sense: each CAST((substr(...) BETWEEN 0 AND 9) AS int) evaluates to the truth value of the comparison, not to the digit itself.
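You can verify this with a quick query (digit taken from the example id):

SELECT CAST((SUBSTR('12342', 1, 1) BETWEEN 0 AND 9) AS int);   -- returns 0 or 1, never the digit 1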
Focus only on the check-digit rule, and it can be done like this:
CREATE TABLE GameLicense_16 (
    title TEXT,
    release_year INTEGER,
    platform TEXT,
    license_id INTEGER PRIMARY KEY
        CHECK (LENGTH(license_id) == 5)
        CHECK (
            SUBSTR(license_id, 5, 1) + 0 =
            (
                SUBSTR(license_id, 1, 1)
                + 3 * SUBSTR(license_id, 2, 1)
                + SUBSTR(license_id, 3, 1)
                + 3 * SUBSTR(license_id, 4, 1)
            ) % 10
        )
);
See the demo.
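As a quick sanity check (with placeholder values for the non-key columns), an id with a correct check digit should be accepted and one with a wrong check digit rejected:

INSERT INTO GameLicense_16 VALUES ('Some game', 2020, 'PC', 12342);   -- accepted: (1 + 3*2 + 3 + 3*4) % 10 = 2
INSERT INTO GameLicense_16 VALUES ('Some game', 2020, 'PC', 12345);   -- rejected: CHECK constraint failed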