Describe the bug
When using the RDSDataService with executeStatement, columns of type numeric are returned as stringValue.
Is the issue in the browser/Node.js?
Node.js
If on Node.js, are you running this on AWS Lambda?
Yes
Details of the Node.js version
10.x
SDK version number
2.603.0
To Reproduce (observed behavior)
Code of the Lambda function:
const { RDSDataService } = require("aws-sdk");

const RDS = new RDSDataService();
const { DB_CLUSTER_ARN, DB_CLUSTER_SECRET_ARN, DB_CLUSTER_DB_NAME } = process.env;

async function handler() {
  const databaseResponse = await RDS
    .executeStatement({
      resourceArn: DB_CLUSTER_ARN,
      secretArn: DB_CLUSTER_SECRET_ARN,
      database: DB_CLUSTER_DB_NAME,
      sql: 'SELECT 1 :: numeric;',
    })
    .promise()
    .catch(console.error);

  if (!databaseResponse) {
    console.error('Invalid DB request');
    return { statusCode: 500 };
  }

  if (!databaseResponse.records || databaseResponse.records.length < 1) {
    console.info('Response has no records');
    return { statusCode: 204 };
  }

  return { statusCode: 200, body: databaseResponse.records };
}

exports.handler = handler;
Result:
{
  "statusCode": 200,
  "body": [
    [
      {
        "stringValue": "1"
      }
    ]
  ]
}
Expected behavior
I would expect to get either a longValue or a doubleValue, but certainly not a stringValue. To me, doubleValue would make more sense, since it can represent longValues as well.
{
  "statusCode": 200,
  "body": [
    [
      {
        "longValue": 1,
        // OR
        "doubleValue": 1
      }
    ]
  ]
}
Hey @bmacher, thank you for reaching out to us with your issue.
Can you please share the body of the HttpResponse of your request? You can print it using console.log(this.httpResponse.body).
Thank you for taking care of my issue. Neither this.httpResponse.body nor databaseResponse.httpResponse.body exists.
Only thing I can get is:
console.log(databaseResponse.$response.httpResponse.body.toString('utf8'))
Result:
{
  "numberOfRecordsUpdated": 0,
  "records": [
    [
      {
        "stringValue": "1"
      }
    ]
  ]
}
That is the response from the service; the SDK only passes it through. I'll reach out to the service team to learn more and will update you once I hear back from them.
I just reported this as a bug via AWS Business Support. I'll update this issue when I hear back.
Bleh. Apparently it's expected behavior:
https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/data-api.html#data-api.calling
Even the DECIMAL type is treated as a string, and since NUMERIC falls under "Other types", it's a string too. That's disappointing.
So people using the Data API will have to do some transformations to look for strings that are actually numeric and convert them (if it matters to you).
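One way to do that transformation, as a minimal sketch: walk the Data API's records array and convert numeric-looking stringValue fields into doubleValue numbers. `coerceNumericStrings` is a hypothetical helper, not part of the SDK, and the regex should be adjusted to the data you expect.

```javascript
// Hypothetical helper: convert numeric-looking stringValue fields in a
// Data API records array into doubleValue numbers. Non-numeric strings
// and fields of other value types are passed through unchanged.
function coerceNumericStrings(records) {
  return records.map((row) =>
    row.map((field) => {
      const s = field.stringValue;
      if (s !== undefined && /^-?\d+(\.\d+)?$/.test(s)) {
        return { doubleValue: Number(s) };
      }
      return field;
    })
  );
}

// With the record shape from the result above:
// coerceNumericStrings([[{ stringValue: "1" }]]) → [[{ doubleValue: 1 }]]
```

Note that Number() silently loses precision for values beyond ~15 significant digits, which is exactly why the Data API sends NUMERIC as a string in the first place.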
Quoting AWS support below:
This table does not include the NUMERIC data type, but “Other types” are mapped to the STRING data type in the Data API. The documentation also states that, for some specific types, such as DECIMAL or TIME, a hint might be required to instruct the Data API that the String value should be passed to the database as a different type. You can do this by including values in typeHint in the SqlParameter data type.
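As a sketch of the typeHint mechanism the support quote describes (it applies to input parameters, not to the values the Data API returns), the executeStatement parameters could look like this. The table and column names are hypothetical; the ARNs come from the environment as in the Lambda above.

```javascript
// Sketch: pass a decimal value into the database via typeHint.
// "prices" and "amount" are hypothetical names for illustration.
const params = {
  resourceArn: process.env.DB_CLUSTER_ARN,
  secretArn: process.env.DB_CLUSTER_SECRET_ARN,
  database: process.env.DB_CLUSTER_DB_NAME,
  sql: 'INSERT INTO prices (amount) VALUES (:amount);',
  parameters: [
    {
      name: 'amount',
      // Instructs the Data API to send the string to the database as DECIMAL.
      typeHint: 'DECIMAL',
      value: { stringValue: '19.99' },
    },
  ],
};
// RDS.executeStatement(params).promise() would then run the statement,
// as in the reproduction code above.
```

Note this does not change the shape of query results: NUMERIC/DECIMAL columns still come back as stringValue.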
To further investigate this, I reproduced this issue on my environment, I created an Aurora PostgreSQL cluster and received the following results for my queries:
- --sql "SELECT 1 :: numeric;"
  "records": [
    [ { "stringValue": "1" } ]
  ]
- --sql "SELECT 1 :: decimal;"
  "records": [
    [ { "stringValue": "1" } ]
  ]
- --sql "SELECT 1 :: float;"
  "records": [
    [ { "doubleValue": "1" } ]
  ]
Therefore, the output is exactly as listed in the Data API documentation and thus we can confirm that this is an expected behavior [1].
Additionally, I have placed a feature request with the internal team to support the “numeric” data type alongside the other supported data types. Unfortunately, I do not have an ETA on when this will be supported. Please note that we at AWS are continuously working on improving our features and services, and this may be one of the data types mapped in the future to provide a better customer experience.
I'm not a Postgres wiz, so I'll have to look into maybe using float, but I need to be able to set the max precision so I don't wind up with numbers with a hundred decimal places. 😬
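One sketch of that approach, assuming Aurora PostgreSQL: round to a fixed number of decimal places in SQL before casting to double precision, so the fractional digits are bounded and the float8 result comes back from the Data API as doubleValue. The "price" and "products" names are hypothetical.

```javascript
// Sketch: bound the decimal places in SQL, then cast to float8 so the
// Data API returns a doubleValue instead of a stringValue.
// "price" and "products" are hypothetical names for illustration.
const sql = 'SELECT round(price :: numeric, 2) :: float8 AS price FROM products;';
```

The trade-off is that float8 only carries about 15 significant digits, so this only works when that precision is acceptable.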
Hi @ffxsam thanks for reaching out. I guess I can close this issue then.