I'm able to deploy it but I keep getting this error:
"DistributionNotFound: The 'gcloud' distribution was not found and is required by the application"
Hello @gpopovic!
@dhermes has a good skeleton project that might help you.
https://github.com/dhermes/test-gcloud-on-gae
It looks like there is some extra work in supporting gcloud-python as a dependency.
https://github.com/dhermes/test-gcloud-on-gae/blob/master/install_gcloud.sh
Thanks @daspecster, but isn't this too much of a hack for something that should be simple?
gcloud-golang works like a charm on both App Engine standard and Compute Engine instances.
@dhermes or @tseaver could probably shed more light on this.
Here is @dhermes post on stackoverflow talking about some of the issues.
http://stackoverflow.com/a/28095663/89702
Do you have a stacktrace from "DistributionNotFound"?
@daspecster
Traceback (most recent call last):
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 240, in Handle
handler = _config_handle.add_wsgi_middleware(self._LoadHandler())
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 299, in _LoadHandler
handler, path, err = LoadObject(self._handler)
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 85, in LoadObject
obj = __import__(path[0])
File "/base/data/home/apps/s~project-io/internal:1.393710744803082856/run.py", line 3, in <module>
from app import app
File "/base/data/home/apps/s~project-io/internal:1.393710744803082856/app/__init__.py", line 6, in <module>
from gcloud import datastore
File "/base/data/home/apps/s~project-io/internal:1.393710744803082856/lib/gcloud/__init__.py", line 19, in <module>
__version__ = get_distribution('gcloud').version
File "/base/data/home/apps/s~project-io/internal:1.393710744803082856/lib/pkg_resources/__init__.py", line 535, in get_distribution
dist = get_provider(dist)
File "/base/data/home/apps/s~project-io/internal:1.393710744803082856/lib/pkg_resources/__init__.py", line 415, in get_provider
return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
File "/base/data/home/apps/s~project-io/internal:1.393710744803082856/lib/pkg_resources/__init__.py", line 943, in require
needed = self.resolve(parse_requirements(requirements))
File "/base/data/home/apps/s~project-io/internal:1.393710744803082856/lib/pkg_resources/__init__.py", line 829, in resolve
raise DistributionNotFound(req, requirers)
DistributionNotFound: The 'gcloud' distribution was not found and is required by the application
Can someone explain the rationale behind the error here? The darth vendor path does work, but it's way more code than I'd expect just to get a hello world with gcloud-python...
Because App Engine standard doesn't truly support third-party packages. We work around this with vendoring. It's semi-officially supported by darth being included as google.appengine.ext.vendor.
See here.
The rest of the hacks are just to get this package and its dependencies to play nicely with GAE standard.
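For context, the vendoring hook itself is only a couple of lines in appengine_config.py (this is a sketch assuming the third-party packages were installed into lib/ with pip install -t lib; it only runs inside the GAE runtime):

```python
# appengine_config.py -- GAE executes this before loading your handlers.
from google.appengine.ext import vendor

# Make everything pip-installed into lib/ importable.
vendor.add('lib')
```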
Just a thought: what if we put these kinds of issues in an FAQ or something? I guess we could just leave it here, but I think it might be useful to have these issues summarized in one spot.
@jonparrott is the authority on installing packages for GAE
Assign this to me, I'll add GAE Standard installation instructions to the documentation.
@dhermes how can I help you remove the two hacks (metadata server check, pkg_resources)?
Which hacks are you referring to (link?)?
@jonparrott The metadata hack was just to avoid the HTTP hit (slowdown). The pkg_resources hack should be resolved in our library. You know packaging better than I, is there a way for pkg_resources.get_distribution to work correctly on GAE?
I feel like pkg_resources should just work, I'll investigate a bit and report back.
Sometimes I wonder if my pursuit of packaging sanity in App Engine will make me lose my own sanity.
I hope not, I wish robots would fix packaging instead of beating humans in Go.
Update: pytz will be available in 1.9.40. The pwd module is also going to be enabled in an upcoming release, possibly 1.9.41.
Closing this issue as I can confirm that gcloud-python works via vendoring on GAE standard.
pytz, pwd, etc. should make things "better", but are not strictly needed.
I take that back, there's a deployment issue. Ugh.
Any updated information that could be provided on this would be helpful. Are there any short term workarounds? Or, do I need to migrate off of appengine standard sooner rather than later?
This was actually working for me briefly, but the issue I am running into currently is:
from gcloud import datastore
File "/base/data/home/apps/s~REDACTED/lib/gcloud/datastore/__init__.py", line 53, in <module>
from gcloud.datastore.batch import Batch
File "/base/data/home/apps/s~REDACTED/lib/gcloud/datastore/batch.py", line 24, in <module>
from gcloud.datastore import helpers
File "/base/data/home/apps/s~REDACTED/lib/gcloud/datastore/helpers.py", line 24, in <module>
from google.type import latlng_pb2
File "/base/data/home/apps/s~REDACTED/lib/google/type/latlng_pb2.py", line 78, in <module>
import grpc
File "/base/data/home/apps/s~REDACTED/lib/grpc/__init__.py", line 37, in <module>
from grpc._cython import cygrpc as _cygrpc
ImportError: dynamic module does not define init function (initcygrpc)
natb1: supporting standard is a priority.
You might be able to get around this by just deleting grpc from lib/
no dice. thanks though
from gcloud import datastore
File "/base/data/home/apps/s~REDACTED/lib/gcloud/datastore/__init__.py", line 53, in <module>
from gcloud.datastore.batch import Batch
File "/base/data/home/apps/s~REDACTED/lib/gcloud/datastore/batch.py", line 24, in <module>
from gcloud.datastore import helpers
File "/base/data/home/apps/s~REDACTED/lib/gcloud/datastore/helpers.py", line 24, in <module>
from google.type import latlng_pb2
File "/base/data/home/apps/s~REDACTED/lib/google/type/latlng_pb2.py", line 78, in <module>
import grpc
ImportError: No module named grpc
I also tried removing google from lib/
from gcloud import pubsub
File "/base/data/home/apps/s~REDACTED/lib/gcloud/pubsub/__init__.py", line 26, in <module>
from gcloud.pubsub.client import Client
File "/base/data/home/apps/s~REDACTED/lib/gcloud/pubsub/client.py", line 19, in <module>
from gcloud.client import JSONClient
File "/base/data/home/apps/s~REDACTED/lib/gcloud/client.py", line 20, in <module>
from gcloud._helpers import _determine_default_project
File "/base/data/home/apps/s~REDACTED/lib/gcloud/_helpers.py", line 28, in <module>
from google.protobuf import timestamp_pb2
ImportError: No module named protobuf
Locking down googleapis_common_protos appears to do the trick (I think). I.e., this should work to deploy gcloud-python to App Engine standard using vendoring:
gcloud==0.18.1
googleapis_common_protos==1.2.0
Thanks, @natb1. @bjwatson do you anticipate that moving grpc to an 'extra' for googleapis-common-protos will fix this?
@jonparrott Yes, I do. The extra version is on testpypi, and I will publish to normal PyPI today after I talk with the gcloud-python team.
@jonparrott @natb1 @dhermes The latest release of googleapis-common-protos now has gRPC as an extra dependency, which should fix the latest issue.
Thanks @bjwatson!
No problem. Sorry for the drama this caused.
There may be another issue with google.type.latlng_pb2... investigating.
tests/conftest.py:39: in app
'DATA_BACKEND': request.param
bookshelf/__init__.py:69: in create_app
from .crud import crud
bookshelf/crud.py:15: in <module>
from bookshelf import get_model, oauth2, storage, tasks
bookshelf/tasks.py:20: in <module>
import psq
.tox/py34/lib/python3.4/site-packages/psq/__init__.py:25: in <module>
from .queue import BroadcastQueue, Queue
.tox/py34/lib/python3.4/site-packages/psq/queue.py:26: in <module>
from .storage import Storage
.tox/py34/lib/python3.4/site-packages/psq/storage.py:19: in <module>
from gcloud import datastore
.tox/py34/lib/python3.4/site-packages/gcloud/datastore/__init__.py:53: in <module>
from gcloud.datastore.batch import Batch
.tox/py34/lib/python3.4/site-packages/gcloud/datastore/batch.py:24: in <module>
from gcloud.datastore import helpers
.tox/py34/lib/python3.4/site-packages/gcloud/datastore/helpers.py:24: in <module>
from google.type import latlng_pb2
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='google/type/latlng.proto',
package='google.type',
syntax='proto3',
serialized_pb=_b('\n\x18google/type/latlng.proto\x12\x0bgoogle.type\"-\n\x06LatLng\x12\x10\n\x08latitude\x18\x01 \x01(\x01\x12\x11\n\tlongitude\x18\x02 \x01(\x01\x42&\n\x0f\x63om.google.typeB\x0bLatLngProtoP\x01\xa2\x02\x03GTPb\x06proto3')
)
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
_LATLNG = _descriptor.Descriptor(
name='LatLng',
full_name='google.type.LatLng',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='latitude', full_name='google.type.LatLng.latitude', index=0,
number=1, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='longitude', full_name='google.type.LatLng.longitude', index=1,
number=2, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=41,
serialized_end=86,
)
DESCRIPTOR.message_types_by_name['LatLng'] = _LATLNG
LatLng = _reflection.GeneratedProtocolMessageType('LatLng', (_message.Message,), dict(
DESCRIPTOR = _LATLNG,
__module__ = 'google.type.latlng_pb2'
# @@protoc_insertion_point(class_scope:google.type.LatLng)
))
_sym_db.RegisterMessage(LatLng)
DESCRIPTOR.has_options = True
DESCRIPTOR._options = _descriptor._ParseOptions(descriptor_pb2.FileOptions(), _b('\n\017com.google.typeB\013LatLngProtoP\001\242\002\003GTP'))
> import grpc
E ImportError: No module named 'grpc'
.tox/py34/lib/python3.4/site-packages/google/type/latlng_pb2.py:78: ImportError
@bjwatson @dhermes any ideas?
@jonparrott How did you trigger this?
Never mind, I know what the problem is. We need to only use the --grpc_out flag when building operations_pb2.py, but not for anything else. I'll release version 1.3.3 of googleapis-common-protos in an hour or two, once I've made this fix.
I'm really sorry for all the trouble with this transition.
No worries. This is why we have a lot of tests for things like getting-started-python. :)
Perhaps I should run some of those tests before doing anything crazy like this again. :)
Version 1.3.3 should fix this issue: https://pypi.python.org/pypi/googleapis-common-protos
@bjwatson it can be somewhat difficult to run our system tests, but let me know if you ever want to try. :)
I'll pay you a visit the next time I'm in Seattle, and maybe you can show me. :)
I'm a little confused about what's going on with https://github.com/googleapis/packman/pull/109; maybe I'm missing some context.
As I understand it, the problem is that we don't want to import grpc when it's not necessary to access the LRO service, because grpc is not available in some environments, like GAE. But if we build the common protos with grpc_out even just for LRO, you still have the exact same problem for the Operations messages, right?
That is, we still want operations_pb2 to be importable from GAE, since GAE clients will still need the Operation type defined in that module, even if they cannot access the LRO service. But if we include the service in googleapis-common-protos, GAE will not even be able to load the Operation type, because it will choke on the grpc import.
So maybe we need a way to make the import grpc conditional on the environment not being GAE.
@jonparrott @geigerj Version 1.3.4 of https://pypi.python.org/pypi/googleapis-common-protos fixes the remaining issue with importing operations_pb2.py if gRPC is not installed.
@jonparrott Are we good to close this out?
Nope. There remain a few more issues, notably the issue with invalid character name.
the issue with invalid character name
Which issue would that be?
Currently when you vendor gcloud-python you'll get an error when you try to deploy because setuptools contains a file with ( in the name. I'm working to get this resolved upstream.
I think if we're going to support app engine, we should probably consider writing a tool that properly vendors in the library, perhaps something that uses the GAE-provided versions of pytz and such.
First I've heard about the setuptools issue. Is that something we can fix?
FWIW we don't require pytz and ship a simplified UTC class for those that don't have it.
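For reference, a minimal UTC class along those lines (a sketch of the idea, not the library's exact code) is only a few lines:

```python
from datetime import datetime, timedelta, tzinfo

_ZERO = timedelta(0)

class _UTC(tzinfo):
    """Minimal UTC implementation, usable when pytz is unavailable."""
    def utcoffset(self, dt):
        return _ZERO
    def tzname(self, dt):
        return 'UTC'
    def dst(self, dt):
        return _ZERO

UTC = _UTC()
print(datetime(2016, 9, 9, tzinfo=UTC).isoformat())  # 2016-09-09T00:00:00+00:00
```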
That's good news. And no, I don't think there's anything we should do to address the filename problem.
Hi, I'm having issues when vendoring google-cloud on the GAE dev server.
I'm using virtualenv and installed google-cloud with pip install google-cloud -t lib/. I have no issues importing bigquery, but I get ImportError: No module named pwd when calling bigquery.Client().
Pinning oauth2client to version 2.2.0 doesn't fix it. I'm using the latest version of the GAE SDK.
@layoaster Our only use of pwd is in core/google/cloud/_helpers.py, which has a conditional import:
# NOTE: Catching this ImportError is a workaround for GAE not supporting the
# "pwd" module which is imported lazily when "expanduser" is called.
try:
    _USER_ROOT = os.path.expanduser('~')
except ImportError:  # pragma: NO COVER
    _USER_ROOT = None
The oauth2client issue is here: https://github.com/google/oauth2client/issues/578
@layoaster Can you provide a stacktrace?
If @layoaster is using dev_appserver, this is unsurprising. They fixed os.path.expanduser on production GAE, but haven't yet updated the GAE SDK to support it.
@jonparrott Without a stacktrace, I'm still surprised. If you notice the code @tseaver posted above, the only ImportError that would happen is avoided by our code.
Yeah but oauth2client also makes a call.
I would believe that. Would be very easy to believe with a stacktrace :grinning:
Sorry for not providing it before @dhermes @jonparrott
Traceback (most recent call last):
File "/home/layo/google-cloud-sdk/platform/google_appengine/google/appengine/runtime/wsgi.py", line 267, in Handle
result = handler(dict(self._environ), self._StartResponse)
File "/home/layo/google-cloud-sdk/platform/google_appengine/lib/webapp2-2.3/webapp2.py", line 1519, in __call__
response = self._internal_error(e)
File "/home/layo/google-cloud-sdk/platform/google_appengine/lib/webapp2-2.3/webapp2.py", line 1511, in __call__
rv = self.handle_exception(request, response, e)
File "/home/layo/google-cloud-sdk/platform/google_appengine/lib/webapp2-2.3/webapp2.py", line 1505, in __call__
rv = self.router.dispatch(request, response)
File "/home/layo/google-cloud-sdk/platform/google_appengine/lib/webapp2-2.3/webapp2.py", line 1253, in default_dispatcher
return route.handler_adapter(request, response)
File "/home/layo/google-cloud-sdk/platform/google_appengine/lib/webapp2-2.3/webapp2.py", line 1077, in __call__
return handler.dispatch()
File "/home/layo/google-cloud-sdk/platform/google_appengine/lib/webapp2-2.3/webapp2.py", line 547, in dispatch
return self.handle_exception(e, self.app.debug)
File "/home/layo/google-cloud-sdk/platform/google_appengine/lib/webapp2-2.3/webapp2.py", line 545, in dispatch
return method(*args, **kwargs)
File "/home/layo/cloud-samples/vendoring/main.py", line 8, in get
client = bigquery.Client()
File "/home/layo/cloud-samples/vendoring/lib/google/cloud/client.py", line 186, in __init__
Client.__init__(self, credentials=credentials, http=http)
File "/home/layo/cloud-samples/vendoring/lib/google/cloud/client.py", line 122, in __init__
credentials = get_credentials()
File "/home/layo/cloud-samples/vendoring/lib/google/cloud/credentials.py", line 87, in get_credentials
return client.GoogleCredentials.get_application_default()
File "/home/layo/cloud-samples/vendoring/lib/oauth2client/client.py", line 1288, in get_application_default
return GoogleCredentials._get_implicit_credentials()
File "/home/layo/cloud-samples/vendoring/lib/oauth2client/client.py", line 1273, in _get_implicit_credentials
credentials = checker()
File "/home/layo/cloud-samples/vendoring/lib/oauth2client/client.py", line 1226, in _implicit_credentials_from_files
credentials_filename = _get_well_known_file()
File "/home/layo/cloud-samples/vendoring/lib/oauth2client/client.py", line 1392, in _get_well_known_file
default_config_dir = os.path.join(os.path.expanduser('~'),
File "/home/layo/cloud-samples/vendoring/lib/python2.7/posixpath.py", line 261, in expanduser
import pwd
File "/home/layo/google-cloud-sdk/platform/google_appengine/google/appengine/tools/devappserver2/python/sandbox.py", line 963, in load_module
raise ImportError('No module named %s' % fullname)
ImportError: No module named pwd
Hi, I'm having a problem using the library. I need to run on the development server (dev_appserver) on my local machine, but I haven't found how to import google-cloud and I always get the same error: ImportError: No module named google.cloud.datastore
I've put it in the "lib" folder like all the other third-party libraries. Is there any way to use it for development? Thanks.
This is the stacktrace:
ERROR 2016-10-26 11:24:19,880 wsgi.py:263]
Traceback (most recent call last):
File "/home/enzo/TRABAJO/gcloud-sdk/platform/google_appengine/google/appengine/runtime/wsgi.py", line 240, in Handle
handler = _config_handle.add_wsgi_middleware(self._LoadHandler())
File "/home/enzo/TRABAJO/gcloud-sdk/platform/google_appengine/google/appengine/runtime/wsgi.py", line 299, in _LoadHandler
handler, path, err = LoadObject(self._handler)
File "/home/enzo/TRABAJO/gcloud-sdk/platform/google_appengine/google/appengine/runtime/wsgi.py", line 85, in LoadObject
obj = __import__(path[0])
File "/home/enzo/TRABAJO/booking/sources/service/main.py", line 9, in <module>
from blueprint.booking import booking
File "/home/enzo/TRABAJO/booking/sources/service/blueprint/booking.py", line 11, in <module>
from utilities.db import getConnector
File "/home/enzo/TRABAJO/booking/sources/service/blueprint/utilities/db.py", line 3, in <module>
from google.cloud import datastore
File "/home/enzo/TRABAJO/gcloud-sdk/platform/google_appengine/google/appengine/tools/devappserver2/python/sandbox.py", line 999, in load_module
raise ImportError('No module named %s' % fullname)
ImportError: No module named google.cloud.datastore
The google namespace is the issue, have you done this using the GAE vendor tool, or manually?
I did it with the GAE vendor tool.
This is my appengine_config.py
from google.appengine.ext import vendor
import os
# insert `lib` as a site directory so our `main` module can load
# third-party libraries, and override built-ins with newer
# versions.
vendor.add('lib')
if os.environ.get('SERVER_SOFTWARE', '').startswith('Development'):
    import imp
    import os.path
    from google.appengine.tools.devappserver2.python import sandbox
    sandbox._WHITE_LIST_C_MODULES += ['_ssl', '_socket']
    # Use the system socket.
    psocket = os.path.join(os.path.dirname(os.__file__), 'socket.py')
    imp.load_source('socket', psocket)
@jonparrott Does vendor also patch namespaces?
@dhermes vendor uses addsitedir which processes .pth files.
I don't see any usage of addsitedir in the appengine_config.py above
@dhermes source for vendor
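The .pth handling is easy to see in isolation. This throwaway snippet (directory names are made up) shows addsitedir adding both the directory itself and any paths listed in its .pth files to sys.path, which is how vendored namespace packages become importable:

```python
import os
import site
import sys
import tempfile

# Create a fake "lib" dir containing a .pth file pointing at a subdirectory.
libdir = tempfile.mkdtemp()
os.mkdir(os.path.join(libdir, 'extra'))
with open(os.path.join(libdir, 'example.pth'), 'w') as f:
    f.write('extra\n')

site.addsitedir(libdir)
print(libdir in sys.path)                         # the dir itself is added
print(os.path.join(libdir, 'extra') in sys.path)  # and each path in the .pth
```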
I can repro the pwd issue when trying to init a pubsub client:
Relevant stacktrace:
File "/callback.py", line 15, in callback
client = pubsub.Client()
File "/venv/lib/python2.7/site-packages/google/cloud/pubsub/client.py", line 74, in __init__
super(Client, self).__init__(project, credentials, http)
File "/venv/lib/python2.7/site-packages/google/cloud/client.py", line 162, in __init__
_ClientProjectMixin.__init__(self, project=project)
File "/venv/lib/python2.7/site-packages/google/cloud/client.py", line 118, in __init__
project = self._determine_default(project)
File "/venv/lib/python2.7/site-packages/google/cloud/client.py", line 131, in _determine_default
return _determine_default_project(project)
File "/venv/lib/python2.7/site-packages/google/cloud/_helpers.py", line 180, in _determine_default_project
_, project = google.auth.default()
File "/venv/lib/python2.7/site-packages/google/auth/_default.py", line 277, in default
credentials, project_id = checker()
File "/venv/lib/python2.7/site-packages/google/auth/_default.py", line 111, in _get_gcloud_sdk_credentials
_cloud_sdk.get_application_default_credentials_path())
File "/venv/lib/python2.7/site-packages/google/auth/_cloud_sdk.py", line 79, in get_application_default_credentials_path
config_path = get_config_path()
File "/venv/lib/python2.7/site-packages/google/auth/_cloud_sdk.py", line 56, in get_config_path
os.path.expanduser('~'), '.config', _CONFIG_DIRECTORY)
File "/venv/lib/python2.7/posixpath.py", line 261, in expanduser
import pwd
File "/Users/erlichmen/google-cloud-sdk/platform/google_appengine/google/appengine/tools/devappserver2/python/sandbox.py", line 963, in load_module
raise ImportError('No module named %s' % fullname)
ImportError: No module named pwd
@jonparrott This is from google-auth. Did we not account for this?
@dhermes I didn't think of it because the os.path.expanduser issues should now be fixed both in production and with the cloud sdk. @erlichmen can you tell me which version of the App Engine SDK you're using?
@jonparrott
gcloud version
Google Cloud SDK 137.0.1
alpha 2016.01.12
app-engine-java 1.9.46
app-engine-python 1.9.40
app-engine-python-extras 1.9.40
beta 2016.01.12
bq 2.0.24
bq-nix 2.0.24
cloud-datastore-emulator 1.2.1
core 2016.12.08
core-nix 2016.11.07
gcd-emulator v1beta3-1.0.0
gcloud
gsutil 4.22
gsutil-nix 4.18
kubectl
kubectl-darwin-x86_64 1.4.6
pubsub-emulator 2016.08.19
Also:
gcloud components update
All components are up to date.
Apparently that fix did not yet make it into dev_appserver. I anticipate it'll be in the next release.
In the meantime @erlichmen, you can use a workaround in appengine_config.py:
import os
os.path.expanduser = lambda path: path
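Note the workaround simply disables home-directory expansion, so any code relying on a real home dir sees the path unchanged (quick demonstration, with a restore that you would not need in appengine_config.py):

```python
import os.path

_original_expanduser = os.path.expanduser
os.path.expanduser = lambda path: path  # the workaround

home = os.path.expanduser('~')            # stays '~'; no pwd lookup happens
config = os.path.expanduser('~/.config')  # stays '~/.config'

os.path.expanduser = _original_expanduser  # restore for this demo
print(home, config)
```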
@jonparrott
I solved it by putting:
env_variables:
  CLOUDSDK_CONFIG: /
in the app.yaml.
@erlichmen nice!
I believe I just came across this bug and opened an SO question here. Is there any documentation on using the Google Cloud Library with the App Engine SDK? This may become a much more common use case as more APIs get transitioned to the REST interface.
I'm not sure exactly how you're setting up your application, but this might be of some use.
https://github.com/GoogleCloudPlatform/python-docs-samples/tree/master/appengine/flexible/datastore
@daspecster Thanks, I've seen that documentation, but that isn't quite right. The documentation you sent is for setting up Google Cloud Library on Google App Engine Flexible without the compat runtimes. The link you sent does not use the App Engine SDKs at all, so there is no naming conflict issue.
Rather, I would like to setup Google Cloud Library on Google App Engine Standard. Or alternatively, to use Google Cloud Library on Google App Engine Flexible with the python-compat runtime. These both require using the Google App Engine SDK.
@speedplane Have you seen https://cloud.google.com/appengine/docs/python/tools/using-libraries-python-27#installing_a_third-party_library?
You may also need some wrangling of google.__path__, but I think vendor handles that. @jonparrott (the author of vendor, who also works on GAE) may contradict me.
Yes, my SO question was answered, and it looks like vendor needs to be used. I have not yet tested it, but I will in the next few days. I wonder if these instructions can be added to the README.md?
Good to hear
It could be interesting to make a guide. But it might be better to put it in the examples that @jonparrott has?
Hi, I'm trying to use the google-cloud-python client to improve the performance of our BigQuery queries in our standard App Engine app.
When I follow the documentation at https://cloud.google.com/appengine/docs/python/tools/using-libraries-python-27#installing_a_third-party_library to install https://googlecloudplatform.github.io/google-cloud-python/, I get the error below when trying to import bigquery.
Using this instead of the Google APIs Python Client (https://developers.google.com/api-client-library/python/) could reduce our load times greatly, so any help would be appreciated.
working directory
$ ls -l -1d app.yaml appengine_config.py lib
-rw-r--r-- 1 tcross tcross 805 Jan 5 12:55 appengine_config.py
-rw-r--r-- 1 tcross tcross 5325 Jan 5 09:58 app.yaml
drwxr-xr-x 100 tcross tcross 12288 Jan 5 12:51 lib
install google cloud python
pip install --upgrade -t lib google-cloud
appengine_config.py
from google.appengine.ext import vendor
vendor.add('lib')
Import library
from google.cloud import bigquery
runtime error
ERROR 2017-01-05 18:54:11,797 wsgi.py:263]
Traceback (most recent call last):
File "/home/tcross/Downloads/google_appengine/google/appengine/runtime/wsgi.py", line 240, in Handle
handler = _config_handle.add_wsgi_middleware(self._LoadHandler())
File "/home/tcross/Downloads/google_appengine/google/appengine/runtime/wsgi.py", line 299, in _LoadHandler
handler, path, err = LoadObject(self._handler)
File "/home/tcross/Downloads/google_appengine/google/appengine/runtime/wsgi.py", line 85, in LoadObject
obj = __import__(path[0])
File "/home/tcross/development/csgapi/portals.py", line 6, in <module>
from base import BaseHandler, EntityHandler, CollectionHandler
File "/home/tcross/development/csgapi/base.py", line 12, in <module>
from controllers.quote_history.quote_history_module import QuoteHistoryModule
File "/home/tcross/development/csgapi/controllers/quote_history/quote_history_module.py", line 2, in <module>
from google.cloud import bigquery
File "/home/tcross/development/csgapi/lib/google/cloud/bigquery/__init__.py", line 26, in <module>
from google.cloud.bigquery._helpers import ArrayQueryParameter
File "/home/tcross/development/csgapi/lib/google/cloud/bigquery/_helpers.py", line 21, in <module>
from google.cloud._helpers import _date_from_iso8601_date
File "/home/tcross/Downloads/google_appengine/google/appengine/tools/devappserver2/python/sandbox.py", line 1001, in load_module
raise ImportError('No module named %s' % fullname)
ImportError: No module named google.cloud._helpers
@jonparrott @dhermes is the namespace getting overwritten somehow?
@chmoder In production this is working. On the App Engine dev-server you can use virtualenv to avoid PATH/libs conflicts.
Thanks for testing and the tip @layoaster
Slight update: I'm working with @omaray to prepare some preliminary recommendations for using this library on App Engine standard. As several of you have discovered, there are many edge cases due to the way App Engine handles third-party libraries and the google namespace.
@jonparrott One issue I ran into that you may want to consider: most of this library works through a REST interface (or at least an HTTP interface). App Engine Standard uses urlfetch for HTTP access which is not very performant. I got the datastore API working but it was incredibly slow. I believe this was because urlfetch is not as efficient as maintaining an open socket connection.
@speedplane we're aware that urlfetch is not the ideal transport. We're looking into possibilities of alternative transports.
@speedplane on App Engine Standard you can interact with the Datastore via the NDB library. Why do you need to use the (REST) Datastore API ?
@layoaster I'm aware of the other APIs for accessing the datastore. I have modules that operate on App engine Standard and App Engine Flexible, and they use different APIs, making code maintenance and testing more painful. Would be nice if Google could provide a single API that works everywhere for the datastore.
@jonparrott any updates on this? This library has much better support for Storage and BigQuery than any other library available at the moment...
Storage and BigQuery should work on App Engine. Let me know if you have installation issues.
Are the google (cloud) endpoints compatible with the cloud-sdk?
File "/Users/nilleb/dev/project/app/server/libs/google_endpoints/endpoints/apiserving.py", line 74, in <module>
from google.api.control import client as control_client
ImportError: No module named control
appengine_config.py:
vendor.add('server/libs/google-cloud-sdk')
vendor.add('server/libs/google_endpoints')
@nilleb it's a different library, but it should work. Your call to vendor.add doesn't seem right: you should just point it at the top-level server/libs folder (or whichever folder you specified when running pip install -t {folder}).
@jonparrott the vendor.add() calls are right, even if they look strange.
In fact, I was trying to import google.cloud.spanner:
File "/Users/nilleb/dev/project/app/server/libs/google_endpoints/google/cloud/spanner/__init__.py", line 18, in <module>
from google.cloud.spanner.client import Client
File "/Users/nilleb/dev/project/app/server/libs/google_endpoints/google/cloud/spanner/client.py", line 28, in <module>
from google.gax import INITIAL_PAGE
File "/Users/nilleb/dev/project/app/server/libs/google_endpoints/google/gax/__init__.py", line 35, in <module>
import multiprocessing as mp
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/multiprocessing/__init__.py", line 65, in <module>
from multiprocessing.util import SUBDEBUG, SUBWARNING
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/multiprocessing/util.py", line 41, in <module>
from subprocess import _args_from_interpreter_flags
ImportError: cannot import name _args_from_interpreter_flags
This is a known bug (Reference: https://github.com/googleapis/gax-python/issues/149)
If you try to get past this error with this dirty hack:
import sys
from types import ModuleType
from Queue import Queue  # Python 2 stdlib

class DummyProcess(object):
    def __init__(self, target):
        self._target = target

    def start(self):
        self._target()

class DummyProcessing(ModuleType):
    @staticmethod
    def Process(target):
        return DummyProcess(target)

    @staticmethod
    def Queue():
        return Queue()

sys.modules['multiprocessing'] = DummyProcessing
You just fail with another error
File "/Users/nilleb/dev/project/app/server/libs/google_endpoints/dill/__init__.py", line 27, in <module>
from .dill import dump, dumps, load, loads, dump_session, load_session, \
File "/Users/nilleb/dev/project/app/server/libs/google_endpoints/dill/dill.py", line 68, in <module>
import __main__ as _main_module
ImportError: Cannot re-init internal module __main__
You can solve this one with an easy:
import pickle
sys.modules['dill'] = pickle
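In context, the aliasing trick looks like the sketch below; it only works because the parts of the dill API exercised here (dumps/loads) also exist in pickle:

```python
import pickle
import sys

# Hack: make any later `import dill` resolve to pickle instead.
sys.modules['dill'] = pickle

import dill  # actually pickle now

roundtrip = dill.loads(dill.dumps({'answer': 42}))
```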
So, if you've installed your google-cloud-spanner module with a command line similar to
pip install --upgrade -t . google-cloud google-cloud-core==0.23.0 google-cloud-spanner
You will get a working google-cloud-spanner in a GAE instance :-)
https://github.com/googleapis/gax-python/issues/149
@nilleb are you saying that spanner works after this?
@jonparrott using the hack above, on my local development server, I am able to create databases and insert data. Not tested in production (just a POC).
@nilleb I see, it definitely won't work in production. I'm surprised that it works locally.
@jonparrott do you have an idea of when it will be ready for production?
I can't speak to timelines, but it's on our radar.
Pretty excited to use spanner in production but lack of client library support in standard appengine env is blocking.
I'm having a similar problem using the cloud vision api in GAE standard:
Traceback (most recent call last):
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 240, in Handle
handler = _config_handle.add_wsgi_middleware(self._LoadHandler())
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 299, in _LoadHandler
handler, path, err = LoadObject(self._handler)
File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 96, in LoadObject
__import__(cumulative_path)
File "/base/data/home/apps/s~SOMETHING/20170405t102043.400345193010976560/core/main.py", line 9, in <module>
from api import * # noqa
File "/base/data/home/apps/s~SOMETHING/20170405t102043.400345193010976560/core/api.py", line 8, in <module>
from google.cloud import vision
File "/base/data/home/apps/s~SOMETHING/20170405t102043.400345193010976560/vendor/google/cloud/vision/__init__.py", line 21, in <module>
from google.cloud.vision.client import Client
File "/base/data/home/apps/s~SOMETHING/20170405t102043.400345193010976560/vendor/google/cloud/vision/client.py", line 22, in <module>
from google.cloud.vision._gax import _GAPICVisionAPI
File "/base/data/home/apps/s~SOMETHING/20170405t102043.400345193010976560/vendor/google/cloud/vision/_gax.py", line 17, in <module>
from google.cloud.gapic.vision.v1 import image_annotator_client
File "/base/data/home/apps/s~SOMETHING/20170405t102043.400345193010976560/vendor/google/cloud/gapic/vision/v1/image_annotator_client.py", line 31, in <module>
from google.gax import api_callable
File "/base/data/home/apps/s~SOMETHING/20170405t102043.400345193010976560/vendor/google/gax/__init__.py", line 39, in <module>
from grpc import RpcError, StatusCode
File "/base/data/home/apps/s~SOMETHING/20170405t102043.400345193010976560/vendor/grpc/__init__.py", line 37, in <module>
from grpc._cython import cygrpc as _cygrpc
ImportError: dynamic module does not define init function (initcygrpc)
@lukesneeringer this is still an open (and important) issue.
Thank you for this thread. I've been trying for a few days to execute a batch job on Google Dataflow using the Python API with Beam. It kept giving me an error similar to those above:
(3e59e0af2f2af432): Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 705, in run
self._load_main_session(self.local_staging_directory)
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 445, in _load_main_session
pickler.load_session(session_file)
File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 247, in load_session
return dill.load_session(file_path)
File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
module = unpickler.load()
File "/usr/lib/python2.7/pickle.py", line 858, in load
dispatch[key](self)
File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
value = func(*args)
File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 767, in _import_module
return getattr(__import__(module, None, None, [obj]), obj)
File "/usr/local/lib/python2.7/dist-packages/google/cloud/storage/__init__.py", line 38, in <module>
from google.cloud.storage.blob import Blob
File "/usr/local/lib/python2.7/dist-packages/google/cloud/storage/blob.py", line 42, in <module>
from google.cloud.iam import Policy
ImportError: No module named iam
Pinning google-cloud-core to a hard version in my setup.py, as someone noted earlier in the thread, did the trick:
...,'google-cloud-core==0.24.1',...
I got a similar error.
python 2.7
In [1]: from neuroglancer.pipeline.volumes import gcloudvolume
---------------------------------------------------------------------------
DistributionNotFound Traceback (most recent call last)
<ipython-input-1-2fe83029a6c1> in <module>()
----> 1 from neuroglancer.pipeline.volumes import gcloudvolume
/usr/people/jingpeng/workspace/neuroglancer/python/neuroglancer/pipeline/__init__.py in <module>()
1 from neuroglancer._mesher import Mesher
----> 2 from storage import Storage
3 from precomputed import Precomputed, EmptyVolumeException
4 from task_queue import TaskQueue, RegisteredTask
5 from tasks import *
/usr/people/jingpeng/workspace/neuroglancer/python/neuroglancer/pipeline/storage.py in <module>()
8 from glob import glob
9 import google.cloud.exceptions
---> 10 from google.cloud.storage import Client
11 import boto
12 from boto.s3.connection import S3Connection
/usr/people/jingpeng/workspace/neuroglancer/python/jingpengw/lib/python2.7/site-packages/google/cloud/storage/__init__.py in <module>()
33
34 from pkg_resources import get_distribution
---> 35 __version__ = get_distribution('google-cloud-storage').version
36
37 from google.cloud.storage.batch import Batch
/usr/people/jingpeng/lib/anaconda2/lib/python2.7/site-packages/setuptools-27.2.0-py2.7.egg/pkg_resources/__init__.py in get_distribution(dist)
555 dist = Requirement.parse(dist)
556 if isinstance(dist, Requirement):
--> 557 dist = get_provider(dist)
558 if not isinstance(dist, Distribution):
559 raise TypeError("Expected string, Requirement, or Distribution", dist)
/usr/people/jingpeng/lib/anaconda2/lib/python2.7/site-packages/setuptools-27.2.0-py2.7.egg/pkg_resources/__init__.py in get_provider(moduleOrReq)
429 """Return an IResourceProvider for the named module or requirement"""
430 if isinstance(moduleOrReq, Requirement):
--> 431 return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
432 try:
433 module = sys.modules[moduleOrReq]
/usr/people/jingpeng/lib/anaconda2/lib/python2.7/site-packages/setuptools-27.2.0-py2.7.egg/pkg_resources/__init__.py in require(self, *requirements)
966 included, even if they were already activated in this working set.
967 """
--> 968 needed = self.resolve(parse_requirements(requirements))
969
970 for dist in needed:
/usr/people/jingpeng/lib/anaconda2/lib/python2.7/site-packages/setuptools-27.2.0-py2.7.egg/pkg_resources/__init__.py in resolve(self, requirements, env, installer, replace_conflicting)
852 if dist is None:
853 requirers = required_by.get(req, None)
--> 854 raise DistributionNotFound(req, requirers)
855 to_activate.append(dist)
856 if dist not in req:
DistributionNotFound: The 'google-cloud-storage' distribution was not found and is required by the application
@jingpengw That typically means the package metadata is missing, which is a sign that not all files were copied.
@dhermes you mean the metadata of google-cloud-python or my local neuroglancer?
In this instance I mean the metadata of google-cloud-storage (which is the package referenced in your stacktrace).
@dhermes thanks. I downloaded the code and reinstalled with python setup.py install. It works fine now.
Django 1.11 was added, but the pytz module is missing,
even though it is mentioned in built-in-libraries-27.
An update on this:
We are working with the App Engine team to bring support for these libraries to App Engine standard, however, it is a lot of engineering effort and we can't yet speak to timelines. In the meantime, these libraries are not officially supported on App Engine standard.
The recommended way to talk to Google APIs from App Engine standard remains the Google API Client Library.
You can read more about our client libraries on this page.
I don't know if this is still needed. I've just translated the curl example to plain Python and am able to use the Vision API on the App Engine standard environment. Here is my gist.
https://gist.github.com/skoegl/a322a3b66f997ef0591c5a48f5da872c
Have fun!
Stefan
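For reference, the heart of such a plain-Python approach is just building the v1 images:annotate JSON body and POSTing it to the REST endpoint; below is a minimal sketch of the body-building half (the feature type and endpoint follow the public REST surface, everything else is illustrative):

```python
import base64
import json

VISION_ENDPOINT = 'https://vision.googleapis.com/v1/images:annotate'

def build_annotate_request(image_bytes, feature_type='LABEL_DETECTION',
                           max_results=5):
    """Build the JSON body for a v1 images:annotate call."""
    return json.dumps({
        'requests': [{
            'image': {'content': base64.b64encode(image_bytes).decode('ascii')},
            'features': [{'type': feature_type, 'maxResults': max_results}],
        }]
    })

# POST `body` to VISION_ENDPOINT + '?key=YOUR_API_KEY' with
# Content-Type: application/json (e.g. via urlfetch on GAE standard).
body = build_annotate_request(b'not really image bytes')
```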
Possible to get any new updates about this? @jonparrott
We are still working on this. My recommendations in https://github.com/GoogleCloudPlatform/google-cloud-python/issues/1893#issuecomment-322512490 are still representative of the current state of the world.
Any new updates on this?
We are making progress but I still can not announce a timeline for when this will be ready. You are welcome to go ahead and try some of the libraries on standard and report any issues you run into here.
In this regard I would like to reiterate that the “thread of shame” (https://github.com/GoogleCloudPlatform/google-cloud-python/issues/1893#issuecomment-367651481) is over a year old and there is still no end in sight on when we will have consistent and working libraries on GAE/py standard.
Also, as said before, I think there should be only one package for gcloud, with a single version, a single changelog, etc.
In this regards I would like to reiterate that the “thread of shame” (#1893 (comment)) is over a year old and still no end in sight on when we will have a consistent and working libs on GAE/py standard.
We have limited resources on both the Client Libraries team and the App Engine team. We are working to make this happen, but there are other things that are often higher on our list of priorities. We understand this is an important case for users, and lots of work is happening in the App Engine runtime to support these libraries (notably, we recently added gRPC to the App Engine runtime). We're close, but there's still work to be done.
Also as said before I think there should be only one package for gcloud
with a single version single change log, etc.
Can you expand on this?
Might something like this be a stop-gap "solution"?
import logging

try:
    import pkg_resources

    def get_distribution_dummy(name):
        class DummyObj(object):
            version = 'unknown'
        return DummyObj()

    pkg_resources.get_distribution = get_distribution_dummy
    logging.debug('disabled pkg_resources.get_distribution() for GAE compatibility')
except ImportError:
    pass
It seems pkg_resources.get_distribution() is currently only used as a convoluted way to get the current version, and most libraries should also work without it.
google-cloud-bigquery==0.24.0 works for me this way.
@jonparrott Thanks for pointing here, but I am still not sure about #5012's relation to this issue, as in my case the package six is imported successfully, but an exception is raised while importing http_client via from six.moves import http_client.
I am trying to use Google Cloud Vision v1p1beta1 in the GAE standard environment. This is the error that I get when I try to import from google.cloud import vision_v1p1beta1 as vision:
First time after deploying the application, I get this error:
(/base/alloc/tmpfs/dynamic_runtimes/python27/277b61042b697c7a_unzipped/python27_lib/versions/1/google/appengine/runtime/wsgi.py:263)
Traceback (most recent call last):
File "/base/alloc/tmpfs/dynamic_runtimes/python27/277b61042b697c7a_unzipped/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 240, in Handle
handler = _config_handle.add_wsgi_middleware(self._LoadHandler())
File "/base/alloc/tmpfs/dynamic_runtimes/python27/277b61042b697c7a_unzipped/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 299, in _LoadHandler
handler, path, err = LoadObject(self._handler)
File "/base/alloc/tmpfs/dynamic_runtimes/python27/277b61042b697c7a_unzipped/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 85, in LoadObject
obj = __import__(path[0])
File "/base/data/home/apps/s~cs410c-vasaikar/20180603t181713.410189522651190980/app.py", line 4, in <module>
from submitReview import SubmitReview
File "/base/data/home/apps/s~cs410c-vasaikar/20180603t181713.410189522651190980/submitReview.py", line 8, in <module>
from google.cloud import vision_v1p2beta1 as vision
File "/base/data/home/apps/s~cs410c-vasaikar/20180603t181713.410189522651190980/lib/google/cloud/vision_v1p2beta1/__init__.py", line 22, in <module>
from google.cloud.vision_v1p2beta1.gapic import image_annotator_client as iac
File "/base/data/home/apps/s~cs410c-vasaikar/20180603t181713.410189522651190980/lib/google/cloud/vision_v1p2beta1/gapic/image_annotator_client.py", line 18, in <module>
import google.api_core.gapic_v1.client_info
File "/base/data/home/apps/s~cs410c-vasaikar/20180603t181713.410189522651190980/lib/google/api_core/gapic_v1/__init__.py", line 16, in <module>
from google.api_core.gapic_v1 import config
File "/base/data/home/apps/s~cs410c-vasaikar/20180603t181713.410189522651190980/lib/google/api_core/gapic_v1/config.py", line 23, in <module>
import grpc
File "/base/data/home/apps/s~cs410c-vasaikar/20180603t181713.410189522651190980/lib/grpc/__init__.py", line 22, in <module>
from grpc._cython import cygrpc as _cygrpc
ImportError: dynamic module does not define init function (initcygrpc)
Subsequently, from the second time onwards, I get this error:
(/base/alloc/tmpfs/dynamic_runtimes/python27/277b61042b697c7a_unzipped/python27_lib/versions/1/google/appengine/runtime/wsgi.py:263)
Traceback (most recent call last):
File "/base/alloc/tmpfs/dynamic_runtimes/python27/277b61042b697c7a_unzipped/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 240, in Handle
handler = _config_handle.add_wsgi_middleware(self._LoadHandler())
File "/base/alloc/tmpfs/dynamic_runtimes/python27/277b61042b697c7a_unzipped/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 299, in _LoadHandler
handler, path, err = LoadObject(self._handler)
File "/base/alloc/tmpfs/dynamic_runtimes/python27/277b61042b697c7a_unzipped/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 85, in LoadObject
obj = __import__(path[0])
File "/base/data/home/apps/s~cs410c-vasaikar/20180603t172858.410188745370110763/app.py", line 4, in <module>
from submitReview import SubmitReview
File "/base/data/home/apps/s~cs410c-vasaikar/20180603t172858.410188745370110763/submitReview.py", line 9, in <module>
from google.cloud import vision_v1p1beta1 as vision
File "/base/data/home/apps/s~cs410c-vasaikar/20180603t172858.410188745370110763/lib/google/cloud/vision_v1p1beta1/__init__.py", line 20, in <module>
from google.cloud.vision_v1p1beta1 import types
ImportError: cannot import name types
To import Google Cloud Storage on App Engine standard I use the following monkey patching:
import requests_toolbelt.adapters.appengine
requests_toolbelt.adapters.appengine.monkeypatch()

import pkg_resources

old_get_distribution = pkg_resources.get_distribution

def mp_get_distribution(*args, **kwargs):
    try:
        res = old_get_distribution(*args, **kwargs)
    except Exception:
        class Mock(object):
            pass
        mock = Mock()
        mock.version = '1.9.0'
        return mock
    return res

pkg_resources.get_distribution = mp_get_distribution
from google.cloud import storage
But guys, we NEED proper support for App Engine standard. With the old libraries it isn't even possible to generate download URLs!
It doesn't seem like using the Google APIs with App Engine is possible. I've followed the above instructions and receive the following when running dev_appserver.py .:
in enable_sandbox
THIRD_PARTY_C_MODULES.get_importable_module_names(config))
File "/Users/ethan/google-cloud-sdk/platform/google_appengine/google/appengine/tools/devappserver2/python/runtime/sandbox.py", line 856, in __init__
dep_lib = __import__(dep_lib_name)
ImportError: No module named six
requirements.txt
Flask==0.12.2
google-api-python-client==1.6.5
google-auth==1.3.0
google-auth-httplib2==0.0.3
gunicorn==19.7.1
google-cloud-bigquery==0.30.0
google-cloud-storage==1.7.0
google-cloud-datastore==1.4.0
google-cloud-language==1.0.2
requests==2.18.4
requests-toolbelt==0.8.0
gcloud==0.18.3
googleapis_common_protos==1.5.3
main.py
import logging
from flask import Flask
app = Flask(__name__)
from google.cloud import storage
from google.cloud import language
@app.route('/')
def form():
    return 'hi'

@app.errorhandler(500)
def server_error(e):
    # Log the error and stacktrace.
    logging.exception('An error occurred during a request.')
    return 'An internal error occurred.', 500
appengine_config.py
from google.appengine.ext import vendor
vendor.add('lib')
app.yaml
runtime: python27
api_version: 1
threadsafe: true
handlers:
- url: /.*
  script: main.app
libraries:
- name: flask
  version: 0.12
- name: ssl
  version: latest
- name: grpcio
  version: latest
- name: six
  version: latest
env_variables:
  SERVICE_ACCOUNT_FILE: 'credentials/service_account.json'
  BUCKET_NAME: 'gcp-nlp'
I downloaded all packages above into a lib folder with: pip install -t lib -r requirements.txt
I'm shocked at how difficult it is to use the Google Language API with GAE. Any reason they wouldn't be compatible? What am I missing here?
@iethan Judging from your app.yaml file, you are using Google App Engine Standard Environment. You may need to use the Google Language API in GAE Standard this way:
Your appengine_config.py looks good to me.
In requirements.txt, you would want to use GoogleAppEngineCloudStorageClient==1.9.22.1 in place of google-cloud-storage==1.7.0. The former is good to use in GAE Standard, the latter only in the GAE Flexible Environment. Check out this link and pay attention to how things are organized in the left panel of the webpage as you read.
Then in main.py, do something like this (more reference):
import googleapiclient.discovery
service = googleapiclient.discovery.build('language', 'v1')
request = service.documents().analyzeEntities(body={
    "document": {
        "type": 'PLAIN_TEXT',
        "gcsContentUri": "gs://[BUCKET_NAME]/[FILE_NAME]"
    },
    "encodingType": 'UTF8'
})
response = request.execute()
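If it helps, unpacking the response amounts to sorting entities by salience; here is a sketch (the field names follow the v1 analyzeEntities REST response, the sample data is made up):

```python
def top_entities(response, limit=3):
    """Return (name, type, salience) tuples for the most salient entities."""
    entities = sorted(response.get('entities', []),
                      key=lambda e: e.get('salience', 0.0), reverse=True)
    return [(e['name'], e['type'], e['salience']) for e in entities[:limit]]

# Made-up sample shaped like an analyzeEntities response:
sample = {'entities': [
    {'name': 'Mountain View', 'type': 'LOCATION', 'salience': 0.2},
    {'name': 'Google', 'type': 'ORGANIZATION', 'salience': 0.8},
]}
```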
@anguillanneuf Thanks for the comment. I'm getting the following error:
ntime/sandbox.py", line 310, in enable_sandbox
THIRD_PARTY_C_MODULES.get_importable_module_names(config))
File "/Users/ethanlyon/google-cloud-sdk/platform/google_appengine/google/appengine/tools/devappserver2/python/runtime/sandbox.py", line 856, in __init__
dep_lib = __import__(dep_lib_name)
ImportError: No module named enum
I did a search on Google Cloud for the library you mentioned and am not seeing it. Am I missing something from the documentation?
@iethan It's definitely on the webpage. Let me grab a screenshot for you.
[screenshot of the documentation page, with GoogleAppEngineCloudStorageClient shown in the left panel]
An update for y'all:
App Engine has launched its Python 3.7 runtime for Standard based on gVisor sandboxing technology. This means that these libraries (along with a significant amount of other libraries) are now supported on App Engine standard with that runtime. You should be able to simply add these to your requirements.txt and use them as normal.
At this time we have no plans to support these libraries on the Python 2.7 runtime on App Engine Standard.
@theacodes wow, this is the first time I've heard about gVisor and App Engine, it's pretty darn cool. For better or worse, Python 2.7 will be around for quite a while, so let me please give a +1 for porting this to the 2.7 environment.
Can we use Google BigQuery on the App Engine standard environment?
The App Engine standard environment supports only Python 2.7.
And can you tell me why the standard environment only supports Python 2.7?
It has some limitations, in PDF conversion etc.
@linhui718611 : App Engine Standard announced a Python 3.7 runtime in August (see https://cloud.google.com/blog/products/gcp/introducing-app-engine-second-generation-runtimes-and-python-3-7). You should be able to use these libraries to talk to BigQuery from App Engine Standard when running on the Python 3.7 runtime.
If you have more questions on the functionality supported by various runtimes, it's probably better to ask on Stack Overflow than here (this issue is specifically about making sure these client libraries run on the 2.7 runtime, which currently isn't on the planned road map AFAIK).
How can we migrate from the Python 2.7 runtime to Python 3.7?
@linhui718611 : It'd probably be best to head over to www.stackoverflow.com and ask this same question there. This GitHub issue tracker isn't really the right spot to help you!
It would be great to have these libraries supported in standard 2.7. The new 3.7 standard still lacks things like NDB and other APIs.
@Thalius with the Python 2 EOL just a little over a year out, there is pretty much zero chance that we will be backporting support to GAE Standard 2.7 at this date.
@tseaver It makes sense, 2.7 is on its way out. Luckily it won't be a problem for my current applications. I just hope we can get something like NDB and Taskqueue for the 3.7 standard environment.
@theacodes First off, thank you to both the Client Libraries and App Engine teams for their respective hard work.
Reviving something quite old here that I just read further up:
Also as said before I think there should be only one package for gcloud
with a single version single change log, etc.
Can you expand on this?
I don't know if this is what @erlichmen meant, but from my perspective, a lot of frustration stems from the fact that https://cloud.google.com/apis/docs/client-libraries-explained even needs to exist. Add to that "special snowflake" client libs like this one: https://github.com/GoogleCloudPlatform/appengine-gcs-client.git
I have been developing App Engine apps for many years now, and I can honestly say that this has been one of the most confusing aspects all along (which @&#! client library to use and which particular set of instructions/docs to follow). It used to be that the various services/APIs were baked in (NDB, task queue, [blob]storage etc.) which made life really easy. I do appreciate the need to separate concerns here and have the different products and teams be able to move independently as far as possible. The restructuring of the platform and work that has gone into the new libs makes sense to me, it's just that in the wake of all this the developer experience has suffered. The different libs don't have the same features and sometimes it's not clear at all which should be used where and why.
For example, at my company we didn't even realise that google-cloud was not (officially) supported in the 2.7 standard environment. It took the deprecation of the all-inclusive Python package, which caused stuff* to break, which caused a re-investigation of what we are, and are supposed to be, using. Up to this point google-cloud seemed to finally be the single, modern client lib that covered all of the APIs and could be used everywhere, that I think many developers are wishing for.
It's not like we're living under a rock either. I'm subscribed to the Google Cloud newsletter and Cloud SDK update notifications. Often I feel like I'm wading through conflicting information from different doc pages or READMEs. Case in point: The top-level README for this repo states that these libs are not supported on the 2.7 runtime environment at all, but drill into one of the sub-components, for example storage, and now it sounds like the 2.7 runtime environment is supported at least until Jan 2020. Huh?
Anyway, a lot of good work is happening, keep it up and I'm sure things will get better. This may be the wrong place for this kind of feedback, but I thought seeing as you asked for expansion... I have also left feedback on this before in other ways. Please will you see that it reaches the relevant people?
*side note: we are still successfully using google-cloud (specifically google-cloud-storage since the all-inclusive package was deprecated) to talk to GCS in the 2.7 runtime environment. To get this to work we had to follow the instructions for configuring requests as detailed here. I don't know if google-cloud-storage is a special case in this regard - it seems like it might be and that it might take a lot more hoop-jumping to get other parts of the lib to work. We're also only doing uploads using the new lib so we're using a very limited set of functions.
@maltem-za thank you for your respectful, thoughtful feedback. We do really appreciate it.
I definitely agree things are more confusing today than before, and it really has a lot to do with the scale. When Google Cloud started (before it was Google Cloud, even), we had one real product (App Engine) that supported three languages. Within that product, we had a handful of services. This is a much easier developer experience problem to tackle than the one we face today: we have to support ~50 cloud products across ~8 languages on many platforms. As far as I know, we're the only people attempting to do something at this scale, and despite being Google we are limited on resources, so we're going to have some missteps.
for example storage, and now it sounds like the 2.7 runtime environment is supported at least until Jan 2020. Huh?
I think we can clarify that a bit, @crwilcox. The intent here is that:
side note: we are still successfully using google-cloud (specifically google-cloud-storage since the all-inclusive package was deprecated) to talk to GCS in the 2.7 runtime environment.
I'm glad it works for you. There are several workarounds that let people use these libraries, but we determined that they aren't suitable for us to tell all of our users to adopt, and the performance and reliability after applying them isn't something we feel comfortable supporting. If it works for you, great! Just know that we can't really support it.
(Coincidentally, Storage is indeed easier to hack to work in App Engine standard. The gRPC-based clients are much much harder)
Thank you so much @xcash, I've finally found the solution after struggling for 2 days tracing and debugging the error. I faced the problem DistributionNotFound: The 'google-api-core' distribution was not found and is required by the application.
Then I applied @xcash's solution, changing this file and line:
File "/base/data/home/apps/s~citric-aleph-611/20190413t015248.417442685994206727/lib/google/api_core/__init__.py", line 23, in
__version__ = get_distribution("google-api-core").version
import pkg_resources
__version__ = pkg_resources.get_distribution("google-api-core").version
into
import requests_toolbelt.adapters.appengine
requests_toolbelt.adapters.appengine.monkeypatch()
import pkg_resources

old_get_distribution = pkg_resources.get_distribution

def mp_get_distribution(*args, **kwargs):
    try:
        res = old_get_distribution(*args, **kwargs)
    except Exception:
        class Mock(object):
            pass
        mock = Mock()
        mock.version = '1.9.0'
        return mock
    return res

pkg_resources.get_distribution = mp_get_distribution
Then I faced this error:
PermissionDenied: 403 IAM permission 'dialogflow.sessions.detectIntent' on 'projects/XXXX-live/agent' denied.
Then I followed @iethan's solution to set the env:
env_variables:
  GOOGLE_APPLICATION_CREDENTIALS: "xxx-en-ffb0659d5d9e.json"
  SERVICE_ACCOUNT_FILE: "xxx-en-ffb0659d5d9e.json"
Thank you guys again!
It's 2020, and (during quarantine) I went to dig out my old App Engine Python 2 apps to move them to the latest Cloud libraries and encountered both these problems (DistributionNotFound: The 'google-cloud-ndb' distribution was not found and is required by the application and ImportError: dynamic module does not define init function (initcygrpc)) as well as a 3rd (ImportError: No module named pkg_resources). Based on a combination of reading the docs, SO, and generic Google search, I resolved all 3 issues with:
Added grpcio and setuptools to the libraries section of app.yaml:
libraries:
- name: grpcio
  version: 1.0.0
- name: setuptools
  version: 36.6.0
Added pkg_resources to appengine_config.py so the whole thing looks like:
import pkg_resources
from google.appengine.ext import vendor

# Set PATH to your libraries folder.
PATH = 'lib'

# Add libraries installed in the PATH folder.
vendor.add(PATH)

# Add libraries to pkg_resources working set to find the distribution.
pkg_resources.working_set.add_entry(PATH)
Most helpful comment
I hope not, I wish robots would fix packaging instead of beating humans in Go.