Not that I'm aware of, at least! We're not working on it at Transloadit. Would you like to implement a plugin for that, perhaps? :D
Hi! We are working on docs on how to make plugins, and since Uppy is flexible like that, it should be fairly easy to implement, so if you'd like to give it a try, we are here to help.
Ok guys, we've implemented URL signing for Google. It seems the AwsS3 plugin works perfectly with Google's resumable uploads. Maybe rename the plugin, or add this to the documentation?
Thanks for making this awesome package!
Thank you! Added this to the todo list, we'll think about naming 👌 Could you send some of your usage/signing code our way? If there's something we could use in the docs, for example. Thanks!
I don't think we need to rename it, since Google Cloud Storage probably just copied the API from S3 to make it easier to switch between the two. Adding an example to the docs seems great, though. We could do one for DigitalOcean's new object storage product too, which also mimics S3's API: https://www.digitalocean.com/products/spaces/
@ogtfaber, any news on how you did this with Google?
@ogtfaber I have a problem with the signature:
<Error>
<Code>SignatureDoesNotMatch</Code>
<Message>
The request signature we calculated does not match the signature you provided. Check your Google secret key and signing method.
</Message>
<StringToSign>
PUT
image/jpeg
1518399402
/mybucket.appspot.com/7d5e4aad1e3a737fb8d2c59571fdb980.jpg
</StringToSign>
</Error>
https://github.com/GoogleCloudPlatform/google-cloud-ruby/issues/1964
@johnunclesam Did you find any solution for this signature problem?
I've been banging my head against the wall for about a week trying to make Uppy and uppy-server work with Google Cloud Storage, and it won't. The only way I've managed it is to do a native POST upload with my own custom signing server. Even that breaks with the AwsS3 plugin, because GCS returns the wrong content-type.
I'm really trying not to have to write a whole bunch of custom code to enable large uploads on the front end. My platform is all based on GCS, so if anyone has got uppy-server to work via interoperability with Google Cloud Storage, please share the steps.
Note: I've already done the CORS and interoperability steps while messing around with FineUploader.
Here's what I'm seeing in my most recent attempts with the AwsS3 plugin + uppy-server + GCS interoperability:
Request
Request URL: https://storage.googleapis.com/_[BUCKET_NAME]_
Request Method: POST
Request Body
------WebKitFormBoundaryczUeAxXc3kTN0hfA
Content-Disposition: form-data; name="acl"
public-read
------WebKitFormBoundaryczUeAxXc3kTN0hfA
Content-Disposition: form-data; name="key"
blursample.png
------WebKitFormBoundaryczUeAxXc3kTN0hfA
Content-Disposition: form-data; name="success_action_status"
201
------WebKitFormBoundaryczUeAxXc3kTN0hfA
Content-Disposition: form-data; name="content-type"
image/png
------WebKitFormBoundaryczUeAxXc3kTN0hfA
Content-Disposition: form-data; name="bucket"
_[BUCKET_NAME]_
------WebKitFormBoundaryczUeAxXc3kTN0hfA
Content-Disposition: form-data; name="X-Amz-Algorithm"
AWS4-HMAC-SHA256
------WebKitFormBoundaryczUeAxXc3kTN0hfA
Content-Disposition: form-data; name="X-Amz-Credential"
GOOGRRXSJZVFQMEGWXIM36VP/20180610/us-east-1/s3/aws4_request
------WebKitFormBoundaryczUeAxXc3kTN0hfA
Content-Disposition: form-data; name="X-Amz-Date"
20180610T165659Z
------WebKitFormBoundaryczUeAxXc3kTN0hfA
Content-Disposition: form-data; name="Policy"
eyJleHBpcmF0aW9uIjoiMjAxOC0wNi0xMFQxNzowMTo1OVoiLCJjb25kaXRpb25zIjpbeyJhY2wiOiJwdWJsaWMtcmVhZCJ9LHsia2V5IjoiYmx1cnNhbXBsZS5wbmcifSx7InN1Y2Nlc3NfYWN0aW9uX3N0YXR1cyI6IjIwMSJ9LHsiY29udGVudC10eXBlIjoiaW1hZ2UvcG5nIn0seyJidWNrZXQiOiJ5bG1lbWJfYXR0YWNobWVudHMifSx7IlgtQW16LUFsZ29yaXRobSI6IkFXUzQtSE1BQy1TSEEyNTYifSx7IlgtQW16LUNyZWRlbnRpYWwiOiJHT09HUlJYU0paVkZRTUVHV1hJTTM2VlAvMjAxODA2MTAvdXMtZWFzdC0xL3MzL2F3czRfcmVxdWVzdCJ9LHsiWC1BbXotRGF0ZSI6IjIwMTgwNjEwVDE2NTY1OVoifV19
------WebKitFormBoundaryczUeAxXc3kTN0hfA
Content-Disposition: form-data; name="X-Amz-Signature"
dbad2160f44e22fc97c3a64b488c3f231f17d3ef3117d3ced934fc6503be4f61
------WebKitFormBoundaryczUeAxXc3kTN0hfA
Content-Disposition: form-data; name="file"; filename="blursample.png"
Content-Type: image/png
------WebKitFormBoundaryczUeAxXc3kTN0hfA--
Response (from GCS)
<Error>
<Code>
AccessDenied
</Code>
<Message>
Access denied.
</Message>
<Details>
Anonymous caller does not have storage.objects.create access to ylmemb_attachments.
</Details>
</Error>
It doesn't look like GCS has any idea about the AWS-style POST form.
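One way to debug a rejected POST like this is to remember that the Policy field is just base64-encoded JSON listing the signed conditions, so you can decode it and check exactly what was signed. A small sketch with a hypothetical policy (bucket and key names are placeholders):

```javascript
// The POST "Policy" field is base64-encoded JSON. Decoding it shows exactly
// which conditions were signed. Hypothetical policy mirroring the form above:
const policy = {
  expiration: '2018-06-10T17:01:59Z',
  conditions: [
    { acl: 'public-read' },
    { key: 'blursample.png' },
    { bucket: 'my-bucket' }, // placeholder bucket name
  ],
};

// Encode the way a signing server would...
const encoded = Buffer.from(JSON.stringify(policy)).toString('base64');

// ...and decode the way you would when debugging a rejected upload.
const decoded = JSON.parse(Buffer.from(encoded, 'base64').toString('utf8'));

console.log(decoded.conditions);
```

Decoding the Policy from the actual request is a quick way to confirm which bucket, key, and credential fields the server is being asked to honor.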
Here's what my env config looks like for uppy-server:
export NODE_ENV="${NODE_ENV:-development}"
export DEPLOY_ENV="${DEPLOY_ENV:-production}"
export UPPYSERVER_PORT=3020
export UPPYSERVER_DOMAIN="localhost"
export UPPYSERVER_SELF_ENDPOINT="localhost:3020"
export UPPYSERVER_PROTOCOL="http"
export UPPYSERVER_DATADIR="/tmp"
export UPPYSERVER_SECRET="secret"
export UPPYSERVER_AWS_KEY="GOOG*****************" # these are filled out in real life
export UPPYSERVER_AWS_SECRET="yJuRL**************+c"
export UPPYSERVER_AWS_BUCKET="_[BUCKET_NAME]_"
export UPPYSERVER_AWS_ENDPOINT="https://storage.googleapis.com/"
Here's how uppy is instantiated in React code:
componentWillMount() {
  this.uppy = new Uppy({
    autoProceed: true,
    id: "uppy"
  }).use(AwsS3, {
    host: "http://localhost:3020"
  }).run();
}
Not trying to do anything over the top crazy here. Just trying to get the basics working, with GCS. What am i missing?
~j
@jimyaghi did you ever get this working?
No, I didn't manage it unfortunately, and switched to another library, which also gave me trouble, but I think I got that one working. It's been a while, though; I can't remember what the other library was. It looks like support for GCS in upload libraries relies very much on its ability to emulate S3, S3 being the more popular.
Got this to work!
Needed to add responseHeader to the cors.json config, which is used to set CORS on the bucket via gsutil:
[
  {
    "origin": ["https://localhost:5000"],
    "method": ["GET", "PUT"],
    "responseHeader": ["Content-Type"],
    "maxAgeSeconds": 3000
  },
  {
    "origin": ["*"],
    "method": ["GET"],
    "maxAgeSeconds": 3000
  }
]
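For anyone following along, a config like the above is applied to the bucket with gsutil (the bucket name below is a placeholder):

```shell
# Apply the CORS config above to the bucket (name is a placeholder)
gsutil cors set cors.json gs://my-bucket

# Verify what is currently set on the bucket
gsutil cors get gs://my-bucket
```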
I'm currently signing my own URLs with a "custom" companion (below), but I will try this again with the AWS companion and see if it works...
uppy-companion-google.js
import { Storage } from '@google-cloud/storage';
// Check for required env variables
if (!process.env.GOOGLE_APPLICATION_CREDENTIALS) {
  throw new Error(
    'Missing Google Cloud credentials, please set the GOOGLE_APPLICATION_CREDENTIALS environment variable to your credentials.json location'
  );
}
if (!process.env.COMPANION_GOOGLE_BUCKET) {
  throw new Error(
    'Missing bucket, please set the COMPANION_GOOGLE_BUCKET environment variable'
  );
}

// Create new storage client
const storage = new Storage();

// Express middleware to return a signed URL
const getSignedUrl = (bucket, options = {}) => ({ body }, res) => {
  // Get bucket reference, falling back to the env variable
  const myBucket = storage.bucket(
    bucket || process.env.COMPANION_GOOGLE_BUCKET
  );
  // Get file reference
  const file = myBucket.file(body.filename);
  // Merge config with defaults
  const config = {
    action: 'write',
    contentType: body.contentType,
    expires: Date.now() + 1000 * 60 * 60, // 1 hour from now
    ...options,
  };
  // Generate a URL that allows write access. Anyone with this URL can send
  // a PUT request with new data that will overwrite the file.
  file
    .getSignedUrl(config)
    .then(data => {
      res.json({
        method: 'put',
        url: data[0],
        fields: {},
        headers: { 'content-type': body.contentType },
      });
    })
    .catch(err => res.status(500).json({ error: err.message }));
};

export { getSignedUrl };
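For reference, the middleware responds with JSON in the shape the AwsS3 plugin's getUploadParameters callback expects; the URL below is purely illustrative:

```json
{
  "method": "put",
  "url": "https://storage.googleapis.com/my-bucket/blursample.png?GoogleAccessId=...&Expires=...&Signature=...",
  "fields": {},
  "headers": { "content-type": "image/png" }
}
```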
express app
const { getSignedUrl } = require('./uppy-companion-google');
...
app.use('/getSignedUrl', cors(), bodyParser.json(), getSignedUrl());
...
react app
...
this.uppy.use(AwsS3, {
  limit: 1,
  timeout: 1000 * 60 * 60,
  getUploadParameters(file) {
    // Send a request to our signing endpoint.
    return fetch(process.env.REACT_APP_GRAPHQL_ENDPOINT + '/getSignedUrl', {
      method: 'post',
      // Send and receive JSON.
      headers: {
        accept: 'application/json',
        'content-type': 'application/json',
      },
      body: JSON.stringify({
        filename: file.name,
        contentType: file.type,
      }),
    }).then(response => response.json());
  },
});
...

Thank you @danielmahon
You saved my day.
@danielmahon we are on GCP, do resumable uploads to GCS also work? And we also need to upload the data via a proxy (https_proxy/http_proxy), does that work?
/cc @ifedapoolarewaju
@danielmahon Can you please confirm whether you were referring to GCS multipart upload, rather than uploading to GCS in one shot?
If it's GCS multipart upload, I will go ahead and give it a try, as I have a requirement to upload files up to 10 GB in size through the browser directly to GCS.
@rajivchodisetti I haven't personally used it with resumable uploads yet, so I could be wrong, but I don't think it would be a problem, as the Google Cloud client supports it; you would just need to make sure to set up Uppy properly as well. You could probably use the Tus version too. I am currently using the AWS plugin for image/media uploads to Google, and the Tus plugin for video uploads to Vimeo.
Since it's been reported to work, I'll close this issue. Feel free to re-open, however!