Amplify-cli: Custom pipeline resolver

Created on 14 Apr 2019 · 7 comments · Source: aws-amplify/amplify-cli

Which Category is your question related to?
API

What AWS Services are you utilizing?
AppSync

Provide additional details e.g. code snippets
In the doc https://aws-amplify.github.io/docs/cli/graphql#add-a-custom-resolver-that-targets-a-dynamodb-table-from-model, users can write a resolver in VTL and version-control it under the resolvers/ folder. What if I want to implement a custom pipeline resolver? What do I have to do differently from the doc?

Note that I know how to add a pipeline resolver in the AppSync console. In this question, I care only about a version-controlled solution.

Labels: graphql-transformer, pending-response, question

Most helpful comment

@Ricardo1980 in case you haven't cracked the code yet for your question about multiple functions sending queries in a single pipeline, here is what you would do:

  1. If you haven't already, create your AppSync data source pointing at Aurora, with a suitable IAM role.
  2. For each function you want in your pipeline (where a function does something like firing off a query to Aurora), define the function as shown in Step 2 of mikeparisstuff's answer above.
  3. List each function you want in the pipeline in the PipelineConfig.Functions section of your pipeline resolver definition (see Step 3 of mikeparisstuff's answer above). For example, with two functions, one to run a select statement and the other to insert records, the Functions part of your pipeline config would look something like:
    "PipelineConfig": {
      "Functions": [
        {
          "Fn::GetAtt": ["MyCustomSelectFunction", "FunctionId"]
        },
        {
          "Fn::GetAtt": ["MyFunctionToInsertRecords", "FunctionId"]
        }
      ]
    }

All 7 comments

  1. Follow the same instructions for creating a data source.
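For illustration, a Lambda data source for the example below might be declared as follows. This is a minimal sketch: EchoLambdaDataSourceRole and the Lambda function ARN are placeholder assumptions you would substitute with your own.

"EchoLambdaDataSource": {
  "Type": "AWS::AppSync::DataSource",
  "Properties": {
    "ApiId": { "Ref": "AppSyncApiId" },
    "Name": "EchoLambdaDataSource",
    "Type": "AWS_LAMBDA",
    "ServiceRoleArn": { "Fn::GetAtt": ["EchoLambdaDataSourceRole", "Arn"] },
    "LambdaConfig": {
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:echo"
    }
  }
}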

  2. Create functions.

"EchoFunction": {
  "Type": "AWS::AppSync::FunctionConfiguration",
  "Properties": {
    "ApiId": {
      "Ref": "AppSyncApiId"
    },
    "Name": "EchoFunction",
    "DataSourceName": {
      "Fn::GetAtt": [
        "EchoLambdaDataSource",
        "Name"
      ]
    },
    "RequestMappingTemplateS3Location": {
      "Fn::Sub": [
        "s3://${S3DeploymentBucket}/${S3DeploymentRootKey}/resolvers/MyFunction.req.vtl",
        {
          "S3DeploymentBucket": {
            "Ref": "S3DeploymentBucket"
          },
          "S3DeploymentRootKey": {
            "Ref": "S3DeploymentRootKey"
          }
        }
      ]
    },
    "ResponseMappingTemplateS3Location": {
      "Fn::Sub": [
        "s3://${S3DeploymentBucket}/${S3DeploymentRootKey}/resolvers/MyFunction.res.vtl",
        {
          "S3DeploymentBucket": {
            "Ref": "S3DeploymentBucket"
          },
          "S3DeploymentRootKey": {
            "Ref": "S3DeploymentRootKey"
          }
        }
      ]
    }
  }
}
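As a hedged sketch, the two mapping templates referenced above (resolvers/MyFunction.req.vtl and resolvers/MyFunction.res.vtl) could be as simple as the following, assuming the Lambda data source from Step 1; note that function mapping templates use the 2018-05-29 template version:

## MyFunction.req.vtl: invoke the Lambda with the field arguments
{
  "version": "2018-05-29",
  "operation": "Invoke",
  "payload": $util.toJson($ctx.args)
}

## MyFunction.res.vtl: forward the Lambda result to the next pipeline step
$util.toJson($ctx.result)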
  3. Create a pipeline resolver:
"QueryEchoResolver": {
  "Type": "AWS::AppSync::Resolver",
  "Properties": {
    "ApiId": {
      "Ref": "AppSyncApiId"
    },
    "DataSourceName": {
      "Fn::GetAtt": [
        "EchoLambdaDataSource",
        "Name"
      ]
    },
    "Kind": "PIPELINE",
    "TypeName": "Query",
    "FieldName": "echo",
    "RequestMappingTemplate": "# any pipeline setup goes here \n{}",
    "ResponseMappingTemplate": "$util.toJson($ctx.prev.result)",
    "PipelineConfig": {
      "Functions": [
        {
          "Fn.GetAtt": ["EchoFunction", "FunctionId"]
        }
      ]
    }
  }
}
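In an Amplify project, these resources typically live in the custom stack at amplify/backend/api/<api-name>/stacks/CustomResources.json, which already declares the AppSyncApiId, S3DeploymentBucket, and S3DeploymentRootKey parameters referenced above; the .vtl templates sit alongside it and are uploaded to the deployment bucket on amplify push:

amplify/backend/api/<api-name>/
  stacks/CustomResources.json    # add EchoFunction and QueryEchoResolver under "Resources"
  resolvers/MyFunction.req.vtl   # uploaded on push
  resolvers/MyFunction.res.vtl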

@mikeparisstuff In my existing pipeline resolver, there is no Lambda function involved, but your solution uses Lambda functions. Why? Can I do it without Lambda functions?

The solution by @mikeparisstuff worked, with the following notes:

  1. "FunctionVersion": "2018-05-29" is a required property in EchoFunction.Properties; 2018-05-29 is the only supported version.
  2. The function reference in the pipeline resolver must be spelled Fn::GetAtt, not Fn.GetAtt.

Side note: there is no need for a Lambda data source; any data source, e.g. one backed by DynamoDB, can be used. RequestMappingTemplateS3Location can also be used in place of RequestMappingTemplate, and an empty request template should contain at least a {}.
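For example, a pipeline function attached to a DynamoDB data source could use a plain GetItem template; a minimal sketch, assuming the table's primary key is id:

## request template: fetch one item by its key
{
  "version": "2018-05-29",
  "operation": "GetItem",
  "key": {
    "id": $util.dynamodb.toDynamoDBJson($ctx.args.id)
  }
}

## response template
$util.toJson($ctx.result)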

Hello @mikeparisstuff
Thanks for your sample code.
What if I want two functions that use Aurora to send SQL queries?
How should I set up this pipeline resolver?
In your QueryEchoResolver example, I don't see two functions being called.
Thanks for the help.

@Ricardo1980 in case you haven't cracked the code yet for your question about multiple functions sending queries in a single pipeline, here is what you would do:

  1. If you haven't already, create your AppSync data source pointing at Aurora, with a suitable IAM role.
  2. For each function you want in your pipeline (where a function does something like firing off a query to Aurora), define the function as shown in Step 2 of mikeparisstuff's answer above.
  3. List each function you want in the pipeline in the PipelineConfig.Functions section of your pipeline resolver definition (see Step 3 of mikeparisstuff's answer above). For example, with two functions, one to run a select statement and the other to insert records, the Functions part of your pipeline config would look something like the snippet below (a sketch of an Aurora request template follows it):
    "PipelineConfig": {
      "Functions": [
        {
          "Fn::GetAtt": ["MyCustomSelectFunction", "FunctionId"]
        },
        {
          "Fn::GetAtt": ["MyFunctionToInsertRecords", "FunctionId"]
        }
      ]
    }
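To make that concrete, here is a sketch of what MyCustomSelectFunction's mapping templates might look like against an Aurora Serverless (relational database) data source; the Posts table and the id argument are assumptions for illustration:

## MyCustomSelectFunction request template: run SQL via the Data API
{
  "version": "2018-05-29",
  "statements": [
    "SELECT * FROM Posts WHERE id = :ID"
  ],
  "variableMap": {
    ":ID": $util.toJson($ctx.args.id)
  }
}

## response template: unwrap the first result set
$utils.rds.toJsonObject($ctx.result)[0]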

Thanks @bogan27
Very useful!
