Terraform v0.12.29
provider.google v3.32.0
provider.null v2.1.2
provider "google" {
project = "hos-dev"
credentials = file("terraform-credentials.json")
region = "us-west2"
zone = "us-west2-a"
}
variable "environment" {}
variable "project" {}
resource "google_bigquery_dataset" "hos_omi_audits" {
dataset_id = "hos_omi_audits"
friendly_name = "my_app omi audits"
description = "My App HOS audit dataset"
location = "US"
labels = {
env = "default"
}
}
resource "google_bigquery_table" "audit_omi_eld_events" {
dataset_id = google_bigquery_dataset.hos_omi_audits.dataset_id
table_id = "audit_omi_eld_events"
time_partitioning {
type = "DAY"
expiration_ms = 1209600000 # 14 days
}
labels = {
env = "default"
}
schema = <<EOF
[
{
"name": "request_id",
"type": "STRING",
"mode": "REQUIRED",
"description": "request id"
},
{
"name": "account",
"type": "STRING",
"mode": "REQUIRED",
"description": "account name"
},
{
"name": "return_ts",
"type": "INT64",
"mode": "REQUIRED",
"description": "return timestamp"
}
]
EOF
}
resource "google_dataflow_job" "hos_omi_audit_dataflow_job" {
name = "hos-omi-audit-dataflow-job"
template_gcs_path = "gs://dataflow-templates-us-central1/latest/PubSub_Subscription_to_BigQuery"
temp_gcs_location = "gs://hos-omi-audit-dataflow-job/temp"
zone = "us-west1-a"
labels = {
team = "hos"
app = "my_app"
env = "${var.environment}"
}
}
resource "google_pubsub_topic" "hos_omi_audit_topic" {
name = "hos-omi-audit-topic"
labels = {
team = "hos"
app = "my_app"
env = "${var.environment}"
}
}
https://gist.github.com/andrewhou-zonar/3a84f78df10dfd013971c3a9eea11e87
I should be able to set a zone of us-west1-a for my google_dataflow_job, according to the documentation here: https://www.terraform.io/docs/providers/google/r/dataflow_job.html#zone
Instead, with zone = "us-west1-a" set, I get:
Error: googleapi: Error 400: The template parameters are invalid., badRequest.
If I set zone = "us-west1", then it gives me this error message (note `us-wes` in it: the last two characters are truncated):
Error: googleapi: Error 400: (e94bb5b4cd38ba88): The workflow could not be created, since it was sent to an invalid regional endpoint (us-wes). Please resubmit to a valid Cloud Dataflow regional endpoint. The list of Cloud Dataflow regional endpoints is at https://cloud.google.com/dataflow/docs/concepts/regional-endpoints., failedPrecondition
If I don't set a zone at all, then it gives the following error as expected, since GCP Dataflow does not have a regional endpoint in my default provider region of us-west2:
Error: googleapi: Error 400: (d825b0c408746c8f): The workflow could not be created, since it was sent to an invalid regional endpoint (us-west2). Please resubmit to a valid Cloud Dataflow regional endpoint. The list of Cloud Dataflow regional endpoints is at https://cloud.google.com/dataflow/docs/concepts/regional-endpoints., failedPrecondition
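As a side note: google_dataflow_job also accepts a region argument (it appears in the plan output further down in this thread), so a job can target a Dataflow regional endpoint directly rather than having one inferred from the provider's region or from the zone. A minimal sketch, reusing the values from the gist (the template parameters discussed below are still required):

resource "google_dataflow_job" "hos_omi_audit_dataflow_job" {
  name              = "hos-omi-audit-dataflow-job"
  template_gcs_path = "gs://dataflow-templates-us-central1/latest/PubSub_Subscription_to_BigQuery"
  temp_gcs_location = "gs://hos-omi-audit-dataflow-job/temp"

  # Pin the job to a region that has a Dataflow regional endpoint,
  # since the provider default of us-west2 does not.
  region = "us-west1"
}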
terraform apply
@andrewhou-zonar since it complains about the template, I am not sure what template you have. Are you able to test the template using other tools like gcloud? Is there a way I can use your template?
@edwardmedia Thanks for the response. I'm using this default template from GCP:
https://github.com/GoogleCloudPlatform/DataflowTemplates
I've successfully used the template by creating the Dataflow job through the GCP console UI, but would like to Terraform it now.
Yeah I see the issue. Thank you, @andrewhou-zonar, for reporting this.
Here is what I got. Three fields show up differently in the API request than in the Terraform config:
region ---> zone
temp_gcs_location ---> tempLocation
template_gcs_path ---> gcsPath
  # google_dataflow_job.hos_omi_audit_dataflow_job will be created
  + resource "google_dataflow_job" "hos_omi_audit_dataflow_job" {
      + id                = (known after apply)
      + job_id            = (known after apply)
      + labels            = {
          + "app"  = "my_app"
          + "env"  = "issue6961"
          + "team" = "hos"
        }
      + name              = "ssue6961hos-terraform-job"
      + on_delete         = "drain"
      + project           = (known after apply)
      + region            = "us-central1"
      + state             = (known after apply)
      + temp_gcs_location = "gs://coolbucket-dev/temp"
      + template_gcs_path = "gs://dataflow-templates-us-central1/latest/Cloud_PubSub_to_Cloud_PubSub"
      + type              = (known after apply)
    }
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: POST /v1b3/projects/myproject/locations/us-central1/templates?alt=json&prettyPrint=false HTTP/1.1
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: {
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: "environment": {
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: "additionalUserLabels": {
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: "app": "my_app",
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: "env": "issue6961",
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: "team": "hos"
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: },
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: "tempLocation": "gs://coolbucket-dev/temp",
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: "zone": "us-central1-a"
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: },
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: "gcsPath": "gs://dataflow-templates-us-central1/latest/Cloud_PubSub_to_Cloud_PubSub",
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: "jobName": "ssue6961hos-terraform-job"
2020-08-12T19:51:51.074Z [DEBUG] plugin.terraform-provider-google_v3.34.0_x5: }
Hi @andrewhou-zonar! I believe the problem here is that you are missing a parameters block in your google_dataflow_job resource. The error
Error: googleapi: Error 400: The template parameters are invalid., badRequest.
implies this, although it says "invalid" rather than that the parameters are missing. This error comes straight from the API. According to the guide for the particular Dataflow template you are using, you'll need an inputSubscription and an outputTableSpec parameter. You should be able to specify these in your Dataflow resource like this:
parameters = {
  inputSubscription = google_pubsub_subscription.<your-pubsub-subscription>.id
  outputTableSpec   = "<your-project>:<your-dataset>.<your-table>"
}
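Note that the config in the gist defines a Pub/Sub topic but no subscription, so a subscription resource would be needed for inputSubscription to reference. A minimal sketch (the resource and subscription names here are hypothetical):

resource "google_pubsub_subscription" "hos_omi_audit_subscription" {
  name  = "hos-omi-audit-subscription" # hypothetical name
  topic = google_pubsub_topic.hos_omi_audit_topic.name
}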
When you set zone = "us-west1", you were passing a region value as a zone. Comically, the API attempted to parse a region from the zone string by truncating the last two characters, thus returning us-wes in the error message.
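To make the truncation concrete, the behavior is equivalent to dropping the final two characters of the string (a throwaway HCL illustration, not something any real config should do):

locals {
  bad_zone      = "us-west1" # a region passed where a zone belongs
  parsed_region = substr(local.bad_zone, 0, length(local.bad_zone) - 2) # "us-wes"
}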
Adding your parameters block along with the correct zone (such as us-west1-a) should yield a successful deployment. I'll close this for now, but let me know otherwise!
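Putting it together, a sketch of what the corrected resource could look like, assuming the hypothetical subscription above plus the project, dataset, and table from the gist:

resource "google_dataflow_job" "hos_omi_audit_dataflow_job" {
  name              = "hos-omi-audit-dataflow-job"
  template_gcs_path = "gs://dataflow-templates-us-central1/latest/PubSub_Subscription_to_BigQuery"
  temp_gcs_location = "gs://hos-omi-audit-dataflow-job/temp"
  zone              = "us-west1-a"

  # Required parameters for the PubSub_Subscription_to_BigQuery template,
  # per its guide: where to read from and which table to write to.
  parameters = {
    inputSubscription = google_pubsub_subscription.hos_omi_audit_subscription.id
    outputTableSpec   = "hos-dev:hos_omi_audits.audit_omi_eld_events"
  }

  labels = {
    team = "hos"
    app  = "my_app"
    env  = var.environment
  }
}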
This worked, thank you @c2thorn !
I'm going to lock this issue because it has been closed for _30 days_ ⏳. This helps our maintainers find and focus on the active issues.
If you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. If you feel I made an error 🤖 🙉 , please reach out to my human friends 👉 [email protected]. Thanks!