python-docs-samples datastore/schedule-export has a bug

Created on 2 Dec 2020 · 7 comments · Source: GoogleCloudPlatform/python-docs-samples

In which file did you encounter the issue?

https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/datastore/schedule-export/main.py

Did you change the file? If so, how?

No.

Describe the issue

Line 37 has a bug. The event variable has no key named 'data'.
json_data = json.loads(base64.b64decode(event['data']).decode('utf-8'))
Replace the line with:
json_data = event
and it works.
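For context, a minimal sketch of why the decode step fails here (the bucket name is illustrative): Cloud Scheduler wraps the payload in a base64-encoded 'data' field, while a direct test invocation can pass the JSON object itself, which has no such field.

```python
import base64
import json

# Payload as Cloud Scheduler delivers it: base64-encoded JSON under 'data'.
scheduler_event = {
    "data": base64.b64encode(json.dumps({"bucket": "my-bucket"}).encode("utf-8"))
}

# Payload as a direct test invocation might deliver it: the JSON object itself.
console_event = {"bucket": "my-bucket"}

# The sample's line 37 works for the Scheduler-style event...
json_data = json.loads(base64.b64decode(scheduler_event["data"]).decode("utf-8"))
print(json_data)  # {'bucket': 'my-bucket'}

# ...but the same expression on console_event raises KeyError: 'data'.
```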

datastore help wanted p2 samples feature request

All 7 comments

You can replace the code here as well:
https://cloud.google.com/datastore/docs/schedule-export#export-all-entities

Hi again!


    if 'data' in event:
        # If it is triggered by Cloud Scheduler:
        json_data = json.loads(base64.b64decode(event['data']).decode('utf-8'))
    else:
        # Otherwise:
        json_data = event

Hi @matteo-mazzanti, the code in this repository and the code on that page are one and the same. The sample is presented in the context of configuring Cloud Scheduler, so I am not certain this is a bug.

1) How are you triggering this if not via cloud scheduler?
2) Is there a portion of the guide that led you to run this outside of that context?

For now I am moving this to a feature request. If there is feedback that we do recommend using this outside of Cloud Scheduler, then this is a bug; otherwise it is a request to make this sample more general and support other forms of invocation.

Since the snippet is referenced by a tutorial, a likely scenario is that a learner will try triggering the Cloud Function from the Cloud Functions console before setting up Cloud Scheduler.

This is a reasonable point.

The current docstring does state clearly that this is expected:

    event (dict): event[data] must contain a json object encoded in
        base-64. Cloud Scheduler encodes payloads in base-64 by default.
        Object must include a 'bucket' value and can include 'kinds'
        and 'namespaceIds' values.

That said, we can make this handle being passed just data, which would likely be common if triggering via Google Cloud Console.

if "data" in event:
    # Triggered via Cloud Scheduler, decode the inner data field of the json payload.
    json_data = json.loads(base64.b64decode(event['data']).decode('utf-8'))
else:
    # Otherwise, for instance if triggered via the Cloud Console on a Cloud Function, the event is the data.
    json_data = event

Hi,
This worked for me:

if "data" in event:
    # Triggered via Cloud Scheduler, decode the inner data field of the json payload.
    json_data = json.loads(base64.b64decode(event['data']).decode('utf-8'))
else:
    # Otherwise, for instance if triggered via the Cloud Console on a Cloud Function, the event is the data.
    json_data = event
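Wrapped as a helper, the branch above can be exercised for both trigger paths. This is a sketch; the function name and sample payload are illustrative, not part of the sample:

```python
import base64
import json

def extract_json_data(event):
    """Return the decoded payload whether the event came from Cloud Scheduler
    (base64-encoded JSON under 'data') or a direct invocation (plain dict)."""
    if "data" in event:
        # Triggered via Cloud Scheduler: decode the inner data field.
        return json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    # Otherwise, e.g. triggered via the Cloud Console: the event is the data.
    return event

payload = {"bucket": "example-bucket", "kinds": ["Task"]}
scheduler_event = {"data": base64.b64encode(json.dumps(payload).encode("utf-8"))}

print(extract_json_data(scheduler_event))  # {'bucket': 'example-bucket', 'kinds': ['Task']}
print(extract_json_data(payload))          # {'bucket': 'example-bucket', 'kinds': ['Task']}
```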