Is your feature request related to a problem? Please describe.
I've created a GraphQL API backend and declared tables in my schema using `@model`. I would like to create a function to populate a field on the table on insert. Attempting this caused the following error:
`There are no DynamoDB resources configured in your project currently`
Describe the solution you'd like
I'd like Amplify to recognize the DynamoDB tables created via the GraphQL schema so that I can add Lambda function "triggers".
Describe alternatives you've considered
I'll probably do the work on the front end for now.
Indeed, I came across that problem too. The current workaround is to trigger a GraphQL mutation from Lambda, but that's a bit awkward.
https://read.acloud.guru/backend-graphql-how-to-trigger-an-aws-appsync-mutation-from-aws-lambda-eda13ebc96c3
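To make the workaround concrete, here is a minimal sketch of building the GraphQL payload such a Lambda would POST to the AppSync endpoint. The `createTodo` mutation, input shape, and API-key auth are hypothetical placeholders (the linked article signs the request with IAM credentials instead):

```javascript
// Build the JSON body for a hypothetical createTodo mutation.
function buildMutation(name) {
  return JSON.stringify({
    query: 'mutation CreateTodo($input: CreateTodoInput!) { createTodo(input: $input) { id name } }',
    variables: { input: { name } },
  });
}

// Inside the handler you would then send it, e.g. with https.request to
// https://<your-api-id>.appsync-api.<region>.amazonaws.com/graphql,
// setting an 'x-api-key' header (or SigV4-signing the request as in the article).
```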
@mikeparisstuff @kaustavghosh06 @UnleashedMind
Question:
I would second this request: the ability to configure a Lambda to trigger based on DynamoDB, Cognito, or even SQS.
Steps to reproduce:
> amplify function add
Using service: Lambda, provided by: awscloudformation
? Provide a friendly name for your resource to be used as a label for this category in the project: myfunction
? Provide the AWS Lambda function name: myfunction
? Choose the function template that you want to use: CRUD function for Amazon DynamoDB table (Integration with Amazon API Gateway and Amazon DynamoDB)
? Choose a DynamoDB data source option: Use DynamoDB table configured in the current Amplify project
There are no DynamoDB resources configured in your project currently
I have configured an API backed by DynamoDB.
Also have this issue. It seems like creating a Lambda function doesn't recognize the DynamoDB backend created when using `amplify add api` and you choose `graphql`.
Update:
If the DynamoDB table created by `amplify add api` were displayed in `amplify-meta.json`, you could access it via environment variables in the CloudFormation template (similar to how `env` and `<your-appsync-api>GraphQLAPIIdOutput` are accessed in `"Parameters"`).
Until that is the case, or until you can choose your existing DynamoDB table in the Amplify prompts, here is a manual workaround.
Add your parameters (note: AVOID COMMITTING THESE TO GIT, or use fake values):
```json
"Parameters": {
  "env": {
    "Type": "String"
  },
  "storagetododynamoName": {
    "Type": "String",
    "Default": "<your-db-name>"
  },
  "storagetododynamoArn": {
    "Type": "String",
    "Default": "<your-db-arn>"
  }
},
```
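Those parameters can then be passed into the function's environment in the same template. A trimmed fragment (the real resource also needs `Handler`, `Runtime`, `Code`, `Role`, etc., and the variable names here are only a suggestion):

```json
"LambdaFunction": {
  "Type": "AWS::Lambda::Function",
  "Properties": {
    "Environment": {
      "Variables": {
        "ENV": { "Ref": "env" },
        "STORAGE_TODODYNAMO_NAME": { "Ref": "storagetododynamoName" },
        "STORAGE_TODODYNAMO_ARN": { "Ref": "storagetododynamoArn" }
      }
    }
  }
}
```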
Then manually append the policies that would otherwise have been generated for you automatically:
```json
"lambdaexecutionpolicy": {
  "DependsOn": ["LambdaExecutionRole"],
  "Type": "AWS::IAM::Policy",
  "Properties": {
    "PolicyName": "lambda-execution-policy",
    "Roles": [{ "Ref": "LambdaExecutionRole" }],
    "PolicyDocument": {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": [
            "logs:CreateLogGroup",
            "logs:CreateLogStream",
            "logs:PutLogEvents"
          ],
          "Resource": {
            "Fn::Sub": [
              "arn:aws:logs:${region}:${account}:log-group:/aws/lambda/${lambda}:log-stream:*",
              {
                "region": { "Ref": "AWS::Region" },
                "account": { "Ref": "AWS::AccountId" },
                "lambda": { "Ref": "LambdaFunction" }
              }
            ]
          }
        },
        {
          "Effect": "Allow",
          "Action": [
            "dynamodb:GetItem",
            "dynamodb:Query",
            "dynamodb:Scan",
            "dynamodb:PutItem",
            "dynamodb:UpdateItem",
            "dynamodb:DeleteItem"
          ],
          "Resource": [{ "Ref": "storage<your-db-name>dynamoArn" }]
        }
      ]
    }
  }
},
```
```json
"AmplifyResourcesPolicy": {
  "DependsOn": ["LambdaExecutionRole"],
  "Type": "AWS::IAM::Policy",
  "Properties": {
    "PolicyName": "amplify-lambda-execution-policy",
    "Roles": [{ "Ref": "LambdaExecutionRole" }],
    "PolicyDocument": {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": [
            "dynamodb:Put*",
            "dynamodb:Create*",
            "dynamodb:BatchWriteItem",
            "dynamodb:Get*",
            "dynamodb:BatchGetItem",
            "dynamodb:List*",
            "dynamodb:Describe*",
            "dynamodb:Scan",
            "dynamodb:Query",
            "dynamodb:Update*",
            "dynamodb:RestoreTable*",
            "dynamodb:Delete*"
          ],
          "Resource": [{ "Ref": "storage<your-db-name>dynamoArn" }]
        }
      ]
    }
  }
}
```
If you used fake parameters, visit your Lambda function in the Lambda console and add the environment variables (`storage<your-db-name>dynamoName` and `storage<your-db-name>dynamoArn`) _after_ you have pushed the function (pushing overwrites any existing variables). That way you can use your DynamoDB table as if you had chosen "CRUD function for Amazon DynamoDB table (Integration with Amazon API Gateway and Amazon DynamoDB)" in `amplify add function`.
Any updates to this? It seems that the tutorial in the announcement is using two tables: one from `schema.graphql` with the `@model` directive and one from the storage category.
Any updates? There is currently no way of creating custom resolvers with Lambda that can connect to DynamoDB, as far as I'm aware.
Looking forward to this.
I think you can add the function, then go to the DynamoDB table, open "Triggers", click "Create trigger", and manually attach the trigger to that existing function.
Hi,
I scraped the net for a solution that works from Amplify but found very little (nothing). Is this solved? I'm using the latest Amplify, updated yesterday, and I still cannot add a trigger for my function using `amplify add function`. My DynamoDB table was created with `amplify add api`, so I'm suffering the same problem as above:
"There are no DynamoDB resources configured in your project currently"
Is there a valid workaround for Amplify? Any blog post that explains this in more detail?
Thanks
@Buder What I ended up doing is creating a function that accepts a `___TABLE_NAME` environment variable and manually setting it in the AWS Lambda console. If you create a new API or change the table names, you will need to go to that Lambda's console and set it manually again.
@idanlo
OK, so you created a standard Lambda? Like the `amplify add function` "Hello World" option? And then what makes it accept a `___TABLE_NAME`?
And don't you have to enable a stream/trigger on the table and add some Lambda ARN to that trigger? And how do we set the IAM role permissions then?
Not expecting you to help me with all this, but still valid questions.. :)
@Buder Yes, you create the function from the CLI using `amplify add function`. To give it the environment variable, go to that Lambda's page in the AWS Lambda console; when you scroll down you will see a list of environment variables where you can add your own. For example, you can add a `USER_TABLE_NAME` variable and then access it in the code to perform actions on a DynamoDB table.
If you want the function to have access to that table only, scroll down on the Lambda page and edit that function's IAM role. There you can edit the policy and add the permissions you need (for example the DynamoDB `PutItem` action), scoped to the table ARN so that the function can access only that table and no others.
@idanlo
OK, I see. Yes, I can do this, but it requires me to create some polling sequence to react to the INSERT of a new table item. What I am looking for is a trigger, so that the Lambda executes when an INSERT happens in the table. As far as I understand, your solution does not handle that.
On the other hand, I think it is possible to associate the hello-world Lambda with the table trigger from DynamoDB by enabling a stream, for instance, and configuring the trigger there.
@Buder This is actually something I am looking into right now. You can add a trigger through the DynamoDB console and either create a Lambda function or use an existing one for that trigger. Choosing "create a new function" didn't work for me, so I created one manually, and I needed to add the correct policies to that function's role so that it would work with the DynamoDB trigger (DynamoDB will tell you which permissions to add).
@idanlo
OK, let me know your findings here; valuable for all. I will try this myself this weekend: set up a Lambda and try to trigger it from a table insert. Do you know if this will also work when invoking the Lambda with the Amplify command (running the Lambda locally, I guess)?
I will post if I get it working in some way.
I guess it will work, but it won't pass the parameters that the trigger passes. Instead you can just change/create some dummy data in the DynamoDB table and then change it back, which should invoke the trigger.
It would be nice if we could have an `@trigger` directive that works like the `@function` directive.
A pull request resolving this has been added: https://github.com/aws-amplify/amplify-cli/pull/2463
@aireater:
This doesn't add a `@trigger` transformer though; that could be another feature request. But maybe keeping anything unrelated to the GraphQL API away from transformers would be easier to manage and slightly more flexible.
All I needed to do was add the following `AWS::Lambda::EventSourceMapping` resource; I already had everything else needed in the function's CloudFormation template:
```json
"someTableTrigger": {
  "Type": "AWS::Lambda::EventSourceMapping",
  "DependsOn": [
    "AmplifyResourcesPolicy"
  ],
  "Properties": {
    "BatchSize": 1,
    "Enabled": true,
    "EventSourceArn": {
      "Ref": "someTableStreamArn"
    },
    "FunctionName": {
      "Fn::GetAtt": [
        "LambdaFunction",
        "Arn"
      ]
    },
    "StartingPosition": "LATEST"
  }
}
```
And I added `someTableStreamArn` as a parameter for the template.
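That parameter might look like this in the template's `"Parameters"` section (the default value is a placeholder; a DynamoDB stream ARN has the general shape shown, with the trailing timestamp identifying the stream):

```json
"someTableStreamArn": {
  "Type": "String",
  "Default": "arn:aws:dynamodb:<region>:<account-id>:table/<table-name>/stream/<timestamp>"
}
```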
Any updates on this?
Hey guys, we released this functionality in the latest version of our CLI - 4.16.1. This was merged as a part of this PR - #2463
Is it possible to make an existing function trigger from DynamoDB with `amplify function update`, or can I only do that when I create the function?
@mrgrue did you find any solution for you problem?. I'm having the same situation.
@vrebo Not really. I ended up just deleting the function and recreating it and making sure to set it up as a trigger at creation time.
Yeah, not a big deal. Move your function to a differently named folder, recreate the function through the CLI, then copy the files from that folder back over the top of those in the new function's folder.
> Is it possible to make an existing function trigger from DynamoDB with `amplify function update`, or can I only do that when I create the function?

Can someone please create a separate feature request for this? Much appreciated!
I spent close to half an hour fumbling to find a way to disable an existing trigger.