Terraform-provider-aws: How to create glue_job of type=spark?

Created on 4 May 2019  ·  5 Comments  ·  Source: hashicorp/terraform-provider-aws

It looks like the aws_glue_job resource does not yet support setting type=spark. Is this possible through default_arguments?

As per: https://docs.aws.amazon.com/glue/latest/dg/aws-glue-programming-etl-glue-arguments.html

I tried setting:

default_arguments = {
"--enable-glue-datacatalog" = "true"
}

Looking for some magic here, or an update to this provider.

question

All 5 comments

Assuming you mean having Terraform add a type field to the aws_glue_job resource, so you can specify either Spark or Python shell? I'm running into this problem right now.

With etl-language set to "spark", the job should not default to Python shell, yet it does, which is causing issues in the Glue console:
[screenshot: Glue console]

@itdataguy I found the solution. It's a very minor detail: you need to set the following in the command block:

command {
   name = "glueetl"
   script_location = "${var.script_location}"
}

The name has to be either glueetl or pythonshell. If it's not one of those, it defaults to pythonshell.
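
For completeness, a minimal sketch of a Spark job using that fix. The job name, bucket variable, and role variable here are hypothetical, so substitute your own:

resource "aws_glue_job" "spark_example" {
  name     = "my-spark-job"
  role_arn = "${var.glue_role_arn}"

  command {
    # "glueetl" selects a Spark (ETL) job; "pythonshell" selects a Python shell job
    name            = "glueetl"
    script_location = "s3://${var.bucket_name}/jobs/etl.py"
  }

  default_arguments = {
    "--enable-glue-datacatalog" = "true"
  }
}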

@rsgoshtasbi Thank you for this. I also just stumbled on this after resorting to CloudFormation through Terraform. Thank you for sharing the secret sauce :)

In case the Terraform resource is not up to the task, you can save this workaround.

resource "aws_cloudformation_stack" "network" {
  name = "${local.name}-glue-job"

  template_body = <<STACK
{
  "Resources" : {
    "MyJob": {
      "Type": "AWS::Glue::Job",
      "Properties": {
        "Command": {
          "Name": "glueetl",
          "ScriptLocation": "s3://${local.bucket_name}/jobs/${var.job}"
        },
        "ExecutionProperty": {
         "MaxConcurrentRuns": 2
        },
        "MaxRetries": 0,
        "Name": "${local.name}",
        "Role": "${var.role}"
      }
    }
  }
}
STACK
}
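
Note: with "Name": "glueetl" in the command block, the native aws_glue_job resource creates a Spark job directly, so this CloudFormation detour should mainly be useful for job properties the provider does not expose yet.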

Awesome, thanks @itdataguy. Glad we figured out these solutions. Let's go ahead and close out this issue 🥇

I'm going to lock this issue because it has been closed for _30 days_ ⏳. This helps our maintainers find and focus on the active issues.

If you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. Thanks!
