
How to pass an array of JSON (or a JSON array) to a Postgres function to be INSERTed into a table?

My function to insert the rows of a JSON array into a table:

create table mytable(col1 text, col2 boolean, col3 boolean);
create function fun1(vja json[])
... as $$
begin
  foreach v in array json_array_elements(vja)
  loop
    insert into mytable(col1, col2, col3)
    values(v->'col1', v->'col2'::boolean, v->'col3'::boolean);
  end loop;
end;
$$;

I call this function like this:

select fun1('[
    {"col1": "turow1@af.com", "col2": false, "col3": true}, 
    {"col1": "xy2@af.com", "col2": false, "col3": true} 
    ]')

or this form:

select fun1('[
    {"col1": "turow1@af.com", "col2": "false", "col3": "true"}, 
    {"col1": "xy2@af.com", "col2": "false", "col3": "true"}, 
    ]')

or this form:

select fun1('[
    {"col1": "turow1@af.com", "col2": "false", "col3": "true"}, 
    {"col1": "xy2@af.com", "col2": "false", "col3": "true"}, 
    ]'::json[])

In every case I get:

ERROR:  malformed array literal: "[
  {"col1": "turow1@af.com", "col2": "false", "col3": "true"}, 
  {"col1": "xy2@af.com", "col2": "false", "col3": "true"}, 
  ]"
LINE 2:  '[
         ^
DETAIL:  "[" must introduce explicitly-specified array dimensions.
SQL state: 22P02
Character: 136


Answer

A JSON array (json) is different from a Postgres array of JSON values (json[]).

SELECT '[{"foo": "bar"}, {"foo1": "bar1"}]'::json;  -- JSON array

vs:

SELECT '{"{"foo": "bar"}","{"foo1": "bar1"}"}'::json[]  -- array of JSON

The first is an array nested inside a single JSON value, the second is an array of JSON values.
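
A quick way to tell the two apart (not part of the original answer, just a sanity check): the JSON array is a single value measured with JSON functions, while the Postgres array holds multiple values measured with array functions.

SELECT json_array_length('[{"foo": "bar"}, {"foo1": "bar1"}]'::json);         -- 2 elements inside one json value

SELECT cardinality('{"{\"foo\": \"bar\"}","{\"foo1\": \"bar1\"}"}'::json[]);  -- 2 json values in the array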

Postgres array of JSON (json[])

Your (fixed!) function:

CREATE OR REPLACE FUNCTION fun1(vja json[])
  RETURNS void
  LANGUAGE plpgsql AS
$func$
DECLARE
   v json;
BEGIN
   FOREACH v IN ARRAY vja
   LOOP
      INSERT INTO mytable(col1, col2, col3)
      VALUES(v ->> 'col1', (v ->> 'col2')::bool, (v ->> 'col3')::bool);
   END LOOP;
END
$func$;

Expects a call like this (note all the escaping for the json[] literal):

SELECT fun1('{"{"col1": "turow1@af.com", "col2": false, "col3": true}","{"col1": "xy2@af.com", "col2": false, "col3": true}"}'::json[]);

But a single INSERT with json_populate_record() in a procedure beats looping in a function:

CREATE OR REPLACE PROCEDURE proc1(vja json[])
  LANGUAGE sql AS
$proc$
INSERT INTO mytable  -- target column list redundant in this particular case
SELECT r.*
FROM   unnest(vja) v, json_populate_record(NULL::mytable, v) r
$proc$;
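
To preview what the implicit LATERAL join produces before anything is inserted, the SELECT can be run on its own (a standalone sketch using one element from the question's data):

SELECT r.*
FROM   unnest('{"{\"col1\": \"turow1@af.com\", \"col2\": false, \"col3\": true}"}'::json[]) v
     , json_populate_record(NULL::mytable, v) r;
-- col1 = 'turow1@af.com', col2 = false, col3 = true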

Or simpler with the standard SQL variant in Postgres 14 or later:

CREATE OR REPLACE PROCEDURE proc1(vja json[])
BEGIN ATOMIC
INSERT INTO mytable
SELECT r.*
FROM   unnest(vja) v, json_populate_record(NULL::mytable, v) r;
END;

Call it (note: CALL, not SELECT, since it is a procedure now):

CALL proc1('{"{\"col1\": \"turow1@af.com\", \"col2\": false, \"col3\": true}","{\"col1\": \"xy2@af.com\", \"col2\": false, \"col3\": true}"}'::json[]);
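
A quick check afterwards (assuming a fresh table and only this one call):

SELECT * FROM mytable;

--      col1      | col2 | col3
-- ---------------+------+------
--  turow1@af.com | f    | t
--  xy2@af.com    | f    | t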


JSON array (json)

Typically, you want to pass a JSON array like you tried. So, now with json_populate_recordset():

CREATE OR REPLACE PROCEDURE proc2(vja json)
  LANGUAGE sql AS
$proc$
INSERT INTO mytable
SELECT * FROM json_populate_recordset(NULL::mytable, vja);
$proc$;

Or (Postgres 14):

CREATE OR REPLACE PROCEDURE proc2(vja json)
BEGIN ATOMIC
INSERT INTO mytable
SELECT * FROM json_populate_recordset(NULL::mytable, vja);
END;

Call (now you can use the value you originally tested with!):

CALL proc2('[
    {"col1": "turow1@af.com", "col2": false, "col3": true}, 
    {"col1": "xy2@af.com", "col2": false, "col3": true} 
    ]');
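
json_populate_recordset() is what does the key-to-column mapping that the plpgsql loop did by hand; its output can be inspected outside the procedure (a standalone sketch with the same JSON array):

SELECT *
FROM   json_populate_recordset(NULL::mytable, '[
    {"col1": "turow1@af.com", "col2": false, "col3": true},
    {"col1": "xy2@af.com", "col2": false, "col3": true}
    ]');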

