Node-postgres: how to stop pg from parsing a result object

Created on 1 Feb 2020 · 8 comments · Source: brianc/node-postgres

I'm using an API service as a proxy between PostgreSQL and the client,
so running JSON.parse on the result from PostgreSQL and then JSON.stringify on the way back to the client is unnecessary and only wastes service time.
Can I switch off parsing the PostgreSQL response into JSON?

question


All 8 comments

You can pass a string as a parameter value where JSON is expected. That is,

const obj = JSON.parse(json);
pool.query('some query', [obj])

and

pool.query('some query', [json])

are equivalent when json is a string containing well-formed JSON.
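The equivalence can be sketched without a database: pg serializes a plain-object parameter to JSON text before sending it, so both forms put the same value on the wire. The serializeParam function below is a simplified stand-in for pg's internal parameter serialization, not its actual implementation:

```javascript
// Simplified stand-in for how pg serializes parameter values:
// strings pass through untouched, plain objects become JSON text.
function serializeParam(value) {
  if (typeof value === "string") return value;
  return JSON.stringify(value);
}

const json = '{"a":1,"b":[2,3]}';
const obj = JSON.parse(json);

// Both parameter styles yield the same wire value
// (for canonically formatted JSON with no extra whitespace).
const fromObject = serializeParam(obj);
const fromString = serializeParam(json);

console.log(fromObject === fromString);
```

So parsing the JSON yourself before passing it as a parameter buys nothing; the string form skips a parse/stringify round trip.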

I'm not talking about query parameters but about the query result.

Cast it to text in the query:

SELECT some_json::text FROM somewhere

Thanks!

I'm sorry for my poor English.
Thanks a lot for the response!

Is there another way to prevent parsing? I've got a query with an execution time of 12ms, but with ::text on the jsonb field it takes 143ms.

Is there another way to prevent parsing? I've got a query with an execution time of 12ms, but with ::text on the jsonb field it takes 143ms.

How big is the jsonb text in bytes? If you can produce a self-contained code sample of this being slow, I can take a look, but the cause could be anything from the throughput between your DB and your app server to the size of the JSON. If you're returning the results as a string, there's no parsing at all beyond the mandatory step of turning the raw packets of row data into strings.

It's about 5000 rows with about 1kb of JSON each. I didn't run the test with pg, I just ran explain analyze on the queries with and without ::text, and somehow the text conversion did slow it down a lot. (I wonder if it's just because Postgres doesn't normally count the serialization of the row data as part of the execution time?)

I eventually found a way to prevent parsing using the types property:

const types = require("pg").types;

res = await con.query({
  text: `SELECT values FROM table`,
  types: {
    getTypeParser(oid, ...rest) {
      // Return JSON and JSONB columns as raw text instead of parsing them.
      if (oid === types.builtins.JSON || oid === types.builtins.JSONB) {
        return (v) => v;
      }
      // Fall back to pg's default parser for every other type.
      return types.getTypeParser(oid, ...rest);
    },
  },
});
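The delegation logic in that override can be exercised without a live connection. The sketch below reimplements it standalone against the builtin OIDs from the PostgreSQL catalog (114 for json, 3802 for jsonb, which is what types.builtins.JSON and types.builtins.JSONB resolve to); defaultTypeParser is a hypothetical stand-in for pg's own types.getTypeParser:

```javascript
// Builtin type OIDs from the PostgreSQL catalog.
const JSON_OID = 114;
const JSONB_OID = 3802;

// Hypothetical stand-in for pg's default types.getTypeParser:
// the real default parses JSON/JSONB and passes most text through.
function defaultTypeParser(oid) {
  if (oid === JSON_OID || oid === JSONB_OID) return JSON.parse;
  return (v) => v;
}

// Same delegation as the per-query `types` object above:
// JSON/JSONB come back as raw text, everything else falls through.
function getTypeParser(oid, ...rest) {
  if (oid === JSON_OID || oid === JSONB_OID) {
    return (v) => v;
  }
  return defaultTypeParser(oid, ...rest);
}

const raw = '{"a": 1}';
console.log(getTypeParser(JSONB_OID)(raw));     // the raw string, unparsed
console.log(defaultTypeParser(JSONB_OID)(raw)); // the parsed object
```

Because the override only intercepts the two JSON OIDs and delegates everything else, integers, timestamps, and other column types keep their normal conversions.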
