Google-cloud-python: Dataproc: submit job does not support GCP regions

Created on 4 Sep 2018 · 14 comments · Source: googleapis/google-cloud-python

Submitting a Dataproc job with a specified GCP region fails.

When executing:

from google.cloud import dataproc_v1 as dataproc

client = dataproc.JobControllerClient()
job = {
    'placement': {
        'cluster_name': someCluster
    },
    'spark_job': {
        'main_class': 'com.example.Main',
        'main_jar_file_uri': 'gs://someBucket/someObject',
        'args': ['args1', 'args2']
    }
}
new_job = client.submit_job(some_project, 'europe-west1', job)

it gives

<_Rendezvous of RPC that terminated with:
status = StatusCode.INVALID_ARGUMENT
details = "Region 'europe-west1' invalid or not supported by this endpoint; permitted regions: '[global]'"
debug_error_string = "{"created":"@1536060753.242822959","description":"Error received from peer","file":"src/core/lib/surface/call.cc","file_line":1095,"grpc_message":"Region 'europe-west1' invalid or not supported by this endpoint; permitted regions: '[global]'","grpc_status":3}"

Tested on Python 2.7 and 3.5.

Labels: question, backend, dataproc


All 14 comments

The error message says:

Region 'europe-west1' invalid or not supported by this endpoint; permitted regions: '[global]'

What region did you use when creating the cluster?

@tseaver europe-west1

@theacodes Can you please loop in the appropriate PoC for dataproc?

I've pinged @JustinBeckwith for a PoC for this issue.

/cc @texasmichelle Can you please look into this issue?

Hello,

I got the same error when I try to create a Dataproc cluster in europe-west1:

Caused by: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: Region '' invalid or not supported by this endpoint; permitted regions: '[global]'

I do it in Scala with CreateClusterRequest.newBuilder.setRegion("europe-west1").

Any solution?

Perhaps @aniket486 is the right PoC for Dataproc API back-end issues?

@aniket486

Caused by: com.google.api.gax.rpc.InvalidArgumentException: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: Region 'europe-west1' invalid or not supported by this endpoint

Why is only the "global" region actually permitted?

I'm not sure, but I think there is the same problem with setZone; we can't use the europe-west-1b zone, for example.

Caused by: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: Multiple validation errors:

  • Invalid value for field 'zone': 'europe-west-1b'. Unknown zone.

Any update on this? I got the same error.

in raise_from google.api_core.exceptions.InvalidArgument: 400 Region 'us-east4' specified in request does not match endpoint region 'global'. To use 'us-east4' region, specify 'us-east4' region in request and configure client to use 'us-east4-dataproc.googleapis.com:443' endpoint.

Hello, is there an update on this at all? I'm currently encountering the same issue, using the list_clusters() method.

I encountered the same issue in region "asia-southeast1".

It looks like using non-global regions requires tweaking the transport used by the client, e.g.:

from google.cloud.dataproc_v1 import ClusterControllerClient
from google.cloud.dataproc_v1.gapic.transports.cluster_controller_grpc_transport import (
    ClusterControllerGrpcTransport)

def client_for_region(region):
    # Regional clusters must be reached through the matching regional
    # endpoint rather than the default global 'dataproc.googleapis.com:443'.
    region_endpoint = '{}-dataproc.googleapis.com:443'.format(region)
    transport = ClusterControllerGrpcTransport(address=region_endpoint)
    return ClusterControllerClient(transport=transport)

If using that workaround clears up the issue for y'all, then we should add an additional constructor / factory which makes it easy.
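
The same approach should also work for the JobControllerClient from the original report. Here is an untested sketch with a hypothetical job_client_for_region helper, reusing some_project and job from the issue body:

from google.cloud.dataproc_v1 import JobControllerClient
from google.cloud.dataproc_v1.gapic.transports.job_controller_grpc_transport import (
    JobControllerGrpcTransport)

def job_client_for_region(region):
    # Point the gRPC transport at the regional Dataproc endpoint instead of
    # the default global 'dataproc.googleapis.com:443'.
    region_endpoint = '{}-dataproc.googleapis.com:443'.format(region)
    transport = JobControllerGrpcTransport(address=region_endpoint)
    return JobControllerClient(transport=transport)

client = job_client_for_region('europe-west1')
new_job = client.submit_job(some_project, 'europe-west1', job)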

One can also follow the hint from the documentation and use client_options:

from google.cloud.dataproc_v1 import ClusterControllerClient

ClusterControllerClient(
    credentials=credentials,
    client_info=client_info,
    client_options={
        # Route requests to the regional endpoint for the given location.
        'api_endpoint': '{}-dataproc.googleapis.com:443'.format(location)
    })

https://googleapis.dev/python/dataproc/latest/gapic/v1/api.html#google.cloud.dataproc_v1.ClusterControllerClient

@nuclearpinguin Thanks for the follow-up: indeed, the client_options argument is the new method for supporting alternate endpoints.
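
Applied to the JobControllerClient from the original report, that would look roughly like this (untested sketch, reusing some_project and job from the issue body):

from google.cloud import dataproc_v1

region = 'europe-west1'
client = dataproc_v1.JobControllerClient(
    client_options={
        # Jobs in a regional cluster must go through the matching regional endpoint.
        'api_endpoint': '{}-dataproc.googleapis.com:443'.format(region)
    })
new_job = client.submit_job(some_project, region, job)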
