Google Cloud Storage as storage adapter
Give the option to store files on Google Cloud Storage
As far as I've seen, there is already a pull request with this feature: https://github.com/directus/api/issues/1349
I'd help, but the feature is already implemented in the mentioned pull request.
To achieve better clarity/visibility, we are now tracking feature requests within the Feature Request project board.
This issue being closed does not mean it's not being considered.
@bjgajjar — any thoughts on the PR mentioned?
I have the same problem, and I've verified that it is possible to use Google Cloud Storage as a custom S3 bucket.
Workaround
Go to Google Cloud Storage Settings > Interoperability > Create a key for a service account.
Generate a key ID and a secret.
Use the key with the S3 storage adapter (Docker environment variables):
DIRECTUS_STORAGE_ADAPTER: "s3"
DIRECTUS_STORAGE_ROOT: "/images/"
DIRECTUS_STORAGE_ROOTURL: "https://storage.googleapis.com/<BUCKET_NAME>/images"
DIRECTUS_STORAGE_THUMBROOT: "/images/thumbs/"
DIRECTUS_STORAGE_KEY: "<KEY_GENERATED>"
DIRECTUS_STORAGE_SECRET: "<SECRET_GENERATED>"
DIRECTUS_STORAGE_REGION: "<REGION>"
DIRECTUS_STORAGE_BUCKET: "<BUCKET_NAME>"
DIRECTUS_STORAGE_ENDPOINT: "https://storage.googleapis.com"
Documentation
https://cloud.google.com/storage/docs/interoperability
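If you prefer the command line over the console, an HMAC key can also be created with gsutil; this is just a sketch, and the service account e-mail below is a placeholder:
$ gsutil hmac create my-service-account@my-project.iam.gserviceaccount.com  # prints an Access ID and a Secret to use as DIRECTUS_STORAGE_KEY / DIRECTUS_STORAGE_SECRET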
@WoLfulus @bjgajjar — if this is verified working, should we update our Docs to officially support it using our current storage adapter?
I can confirm that the workaround of @EnricoFerro works like a charm 👍
I couldn't get this to work. Does this need any special setup in Google Cloud, @nkcr? (I was getting permission errors.)
To get it working locally, I only created and set the service key as described. However, to make it work in production I additionally had to set the CORS permissions on my bucket.
Sorry to open this up, but I'm running into a permissions issue...
Any way you can tell which permissions you had set on the service account used?
Mine has Storage Admin and Storage Object Admin, and I'm still getting permission errors:
Directus\Filesystem\Exception\ForbiddenException: No permission to write: 9304044b-8575-4e5b-a025-f61b60301037.jpg in /var/directus/src/core/Directus/Filesystem/Filesystem.php:56
@nkcr - what permissions/roles did you have set?
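In case it helps narrow this down, one way to double-check which roles are actually bound on the bucket itself (bucket name is a placeholder):
$ gsutil iam get gs://<BUCKET_NAME>  # lists the bucket's role bindings, e.g. roles/storage.objectAdmin, and their members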
@ThaDaVos I just got the same error.
I am not sure what is wrong, since Directus throws that generic exception when the adapter reports an error and doesn't log the original exception :(
I only gave "Storage access admin" to my user.
Please help.
@nkcr - what permissions/roles did you have set?
Here is what I set:
[
  {
    "origin": [
      "https://yourwebsite.com"
    ],
    "responseHeader": [
      "Content-Type"
    ],
    "method": [
      "GET",
      "HEAD",
      "DELETE"
    ],
    "maxAgeSeconds": 3600
  }
]
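For what it's worth, a CORS config like the one above can also be applied from the command line; this assumes the JSON is saved as cors.json and the bucket name is a placeholder:
$ gsutil cors set cors.json gs://<BUCKET_NAME>  # apply the CORS rules to the bucket
$ gsutil cors get gs://<BUCKET_NAME>            # verify what is currently set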
I'm also getting the "No permission to write" error when using the GCS S3 interop feature. The permissions error comes when trying to upload a file through Directus, so I don't think it's related to the bucket's CORS configuration; CORS would only come into play after the file has actually been uploaded successfully, right?
I also verified that my HMAC key from Google was working, using the aws CLI configured to hit Google like so:
$ aws configure # enter the keys
$ aws s3 cp some-local-file.jpg s3://my-bucket --endpoint-url=https://storage.googleapis.com
and it worked just fine.
What did you folks set the
Ah, I believe I figured this out. If you're using uniform access control permissions for your bucket, Google won't let API clients set ACLs on created objects, which Directus tries to do. So you must use fine-grained access control permissions, and let Directus set those fine-grained permissions as it creates objects. You can find that setting in the Permissions tab of the bucket details in the Cloud Console.
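If you'd rather check or change this from the command line than the console, something along these lines should work with a reasonably recent gsutil (bucket name is a placeholder):
$ gsutil uniformbucketlevelaccess get gs://<BUCKET_NAME>       # shows whether uniform bucket-level access is enabled
$ gsutil uniformbucketlevelaccess set off gs://<BUCKET_NAME>   # switch back to fine-grained (ACL-based) access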
I figured this out by hacking up a local Directus Docker container to log the actual error in the Filesystem class. That error should probably not be suppressed, but my PHP is terrible enough that I don't think I'm the right person to prepare a PR. If anyone else is bored, I am sure a small logging improvement could be made to help us all save time!
@airhorns funny that I went through exactly the same debug process and came to the same conclusion. ;)
After changing to fine-grained permissions, everything started working.
This error could have been much easier to understand if Directus didn't suppress the original adapter exception; at the very least it should be logged.
I still get this error when I try to upload files bigger than 10 MB (or so). I assume there is a point where the AWS client starts chunking, which throws an error. I tried to var_dump the exception thrown by ...->getAdapter()->writeStream(...) but ran into an endless loop.
BTW: I saw that the file was fully uploaded to /var/tmp, so my PHP settings are correct.
Does anybody else experience upload errors for bigger files?
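One way to narrow this down might be to take Directus out of the equation and push a large test file through the same interop endpoint with the aws CLI, which switches to multipart (chunked) uploads above its multipart_threshold (8 MB by default). File and bucket names below are placeholders:
$ aws s3 cp big-test-file.bin s3://my-bucket --endpoint-url=https://storage.googleapis.com  # if this fails while small files work, chunked/multipart uploads are likely the culprit
$ aws configure set default.s3.multipart_threshold 64MB  # optional: raise the threshold so the same file goes up as a single PUT, for comparison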