Node-postgres: JSONB arguments not treated consistently

Created on 26 Nov 2019  ·  11 Comments  ·  Source: brianc/node-postgres

According to the docs, there is no special treatment required for JSONB bind arguments. In practice, there is:

  • When the argument is a single object, it is bound correctly
  • When the argument is an Array, it is necessary to call JSON.stringify on the args.

Here's a test case that reproduces the issue:

const { Client } = require('pg');
const { inspect } = require('util');
async function doIt() {
    const client = new Client();
    console.log('Connecting...');
    await client.connect();
    console.log('Connected');
    const a = [
        {
            the: 'quick',
            brown: 'fox',
            jumps: 'over the',
            lazy: 'dog',
            some: { other: 'object', dilroy: ['hello', 'world'], numpty: new Date() }
        }
    ];
    console.log('Inserting...');
    const result = await client.query(`insert into sb_json(c1) values ($1) returning *`, [a]);
    console.log(`result = ${inspect(result, { depth: null })}`);
    console.log('Disconnecting...');
    await client.end();
}
doIt()
    .then(() => console.log('All done'))
    .catch(e => console.log(inspect(e)));

Current documented contract is that no special treatment is required.

Possible solution:

  • Document the current behavior. This is the less ideal solution, since arguably it is better to be consistent: either never call JSON.stringify on args, or always do so, rather than one rule for arrays and another for single objects.
  • Inspect the contents of the array. If it contains only primitive types, bind it as a native array of that type; otherwise call JSON.stringify internally. This inspection runs in O(n) time, so it would be slow for very large arrays.
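The second option could be sketched roughly like this (a hypothetical `bindArray` helper, not code from node-postgres):

```javascript
// Hypothetical sketch of the O(n) inspection idea (not node-postgres code):
// arrays of primitives are left alone for pg's native array serialization,
// while anything containing objects is stringified so the server can parse
// it as json/jsonb.
function bindArray(arr) {
  const allPrimitive = arr.every(
    (v) => v === null || ['string', 'number', 'boolean'].includes(typeof v)
  );
  return allPrimitive ? arr : JSON.stringify(arr);
}
```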

Most helpful comment

Because we don't use pg arrays, we've implemented the following workaround:

// @ts-ignore
import pgUtils from 'pg/lib/utils.js'

// a workaround to force storage of array as json until
// https://github.com/brianc/node-postgres/issues/2012
// is fixed
const originalPrepareValue = pgUtils.prepareValue
pgUtils.prepareValue = <T>(val: T): T | string => {
    if (Array.isArray(val)) {
        return JSON.stringify(val)
    }
    return originalPrepareValue(val)
}

Maybe it helps someone.

All 11 comments

Likewise, when sending an object to the PostgreSQL server via a query, node-postgres will call JSON.stringify on your outbound value, automatically converting it to json for the server.

With the existence of jsonb[] and without knowing the desired types ahead of time, no solution by inspecting JavaScript types can work.
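To see why inspecting JavaScript types can't settle this, note that the same JS array has two different, incompatible text representations depending on the target column type (a simplified illustration, not pg's actual serializer):

```javascript
// Simplified illustration (not pg's real serializer): one JS value,
// two incompatible wire formats depending on the intended column type.
const arr = ['a', 'b'];
// text[] / jsonb[] columns expect PostgreSQL array-literal syntax:
const asPgArray = '{' + arr.map((v) => JSON.stringify(String(v))).join(',') + '}';
// json / jsonb columns expect JSON text:
const asJson = JSON.stringify(arr);
```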

See also #442

Yeah, I see what you're saying. It's a bit funky that node-pg assumes to stringify Objects but not Arrays. I think making it more consistent, or rather, no incoming magic (I like the magic coming out though!) would be for the best. It seems a little arbitrary, especially now that JSON has been added to the data-type mix, that JS arrays are assumed to be PG Arrays, when they could be type json or even hstore.
My preference would be to make no assumptions and leave it up to the library consumer. I'm not sure if that means node-pg should JSON.stringify Arrays and Objects by default, as that would be trading one magic for another. But it certainly does seem like a sane default.

^-- This comment from the linked ticket (#442) resonates with me. Better to be consistent in behavior.

  • Never call JSON stringify.
  • Always call it. (This appears to be impossible).

https://node-postgres.com/features/types
would be a good place to mention the current behaviour as it seems to be mentioned only in github issues right now
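Until the docs cover it, the workaround is to stringify explicitly (a sketch; `client` and the `sb_json` table come from the reproduction above, and the `::jsonb` cast is optional but makes the intent explicit):

```javascript
// Explicit workaround: stringify the array yourself before binding it,
// so pg never applies its array serialization to it.
const rows = [{ the: 'quick', brown: 'fox' }];
const payload = JSON.stringify(rows);
// await client.query('insert into sb_json(c1) values ($1::jsonb)', [payload]);
```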

@jasperblues @charmander we don't intend to use pg arrays. Is there any way to say "treat JavaScript arrays always as pg json"? We use knex, if it matters.

I see the comment, probably from @brianc , but I don't see the way how to replace the array handling: https://github.com/brianc/node-postgres/blob/master/packages/pg/lib/utils.js#L47

Looked for this answer for so long.
If this isn't fixable (not sure I understood the whole issue), it should at least be noted as a warning in the docs 👍

As a workaround you could 'wrap' your arrays in an object and turn that into the JSONB text you want:

class JsonBArrayParam {
  constructor(actualArray) { this.actualArray = actualArray }
  toPostgres() {
    return JSON.stringify(this.actualArray)
  }
}

client.query('SELECT FOO WHERE Bar = $1', [new JsonBArrayParam(['one', 'two'])])

or something similar to that. does that work?

Because we don't use pg arrays, we've implemented the following workaround:

// @ts-ignore
import pgUtils from 'pg/lib/utils.js'

// a workaround to force storage of array as json until
// https://github.com/brianc/node-postgres/issues/2012
// is fixed
const originalPrepareValue = pgUtils.prepareValue
pgUtils.prepareValue = <T>(val: T): T | string => {
    if (Array.isArray(val)) {
        return JSON.stringify(val)
    }
    return originalPrepareValue(val)
}

Maybe it helps someone.

Just hit this too. Ugly. Can we provide some kind of bind directive to tell PG how to interpret the array, or to specify the required behavior for a given insert statement?

If the library would expose the types parameter for prepared statements, there might be a way to do conditional serialization based on the type.

The API could look something like this:

pool.query({
  text: 'SELECT $1, $2, $3', 
  values: ['1', 2, [1, 2, 3]],
  paramTypes: [null, null, OID.JSONB]
});
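A rough sketch of how that conditional serialization might work given declared parameter types (the `OID` constants and `serializeParams` helper are hypothetical, not part of node-postgres; 114 and 3802 are PostgreSQL's built-in type OIDs for json and jsonb):

```javascript
// Hypothetical helper (not node-postgres API): stringify only those
// values whose declared parameter type is json/jsonb, and leave the
// rest to pg's existing serialization.
const OID = { JSON: 114, JSONB: 3802 }; // PostgreSQL built-in type OIDs
function serializeParams(values, paramTypes = []) {
  return values.map((v, i) =>
    paramTypes[i] === OID.JSON || paramTypes[i] === OID.JSONB
      ? JSON.stringify(v)
      : v
  );
}
```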

I'm not sure if it's any better than wrapping the values in a custom serialized type as suggested in https://github.com/brianc/node-postgres/issues/2012#issuecomment-607415271.
