Pipeline: PipelineResource for S3 buckets (for IBM COS, AWS S3, Minio ...)

Created on 11 Dec 2018 · 11 comments · Source: tektoncd/pipeline

Expected Behavior

Ability to use S3 for object storage as PipelineResources types

Actual Behavior

https://github.com/knative/build-pipeline/pull/321 exists for GCS but does not support S3

Additional Info

In multi-zone clusters, PVCs can be difficult to work with, and it is sometimes preferred to transfer artifacts between tasks in a pipeline using other types of storage.

Labels: lifecycle/rotten, meaty-juicy-coding-work

All 11 comments

I would like to contribute to this issue since we are on AWS+EKS. Looking at https://github.com/knative/build-pipeline/blob/master/pkg/apis/pipeline/v1alpha1/gcs_resource.go it seems straightforward, but any pointers would be nice, following the "Finding something to work on" section of CONTRIBUTING.md.
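
As a rough illustration of the kind of thing the GCS resource does (it emits a step container that shells out to a CLI to copy objects), here is a hedged sketch of the download step an S3 equivalent might generate. The type name, fields, helper, and image below are illustrative placeholders, not Tekton's actual API:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// s3Resource is a hypothetical stand-in for what an s3_resource.go struct
// could hold; the real GCS equivalent lives in gcs_resource.go.
type s3Resource struct {
	Name     string
	Location string // e.g. s3://my-bucket/path
}

// downloadStep sketches the container a "fetch" step could run, mirroring
// how the GCS resource shells out to gsutil. Image and args are illustrative.
func (r *s3Resource) downloadStep(destDir string) corev1.Container {
	return corev1.Container{
		Name:    "fetch-" + r.Name,
		Image:   "amazon/aws-cli",
		Command: []string{"aws"},
		Args:    []string{"s3", "cp", "--recursive", r.Location, destDir},
	}
}

func main() {
	r := &s3Resource{Name: "my-artifacts", Location: "s3://my-bucket/artifacts"}
	fmt.Printf("%+v\n", r.downloadStep("/workspace/output"))
}
```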

Sounds good @mustafaakin ! I'm excited that you want to contribute :D!

(btw if you haven't already plz feel free to join us in slack at #build-pipeline !)

I think to properly support this we should expand our end-to-end tests to cover S3 as well (not to mention other clouds in general!), but there's probably a bit of work to do there, so for now I think we should add this functionality, initially _not_ cover it with end-to-end tests, and create a separate issue around setting up infrastructure for end-to-end tests against S3.

(Any other thoughts @shashwathi @pivotal-nader-ziada @ImJasonH @tejal29 ?)

> initially _not_ cover it with end-to-end tests

Or maybe a better idea: add an end-to-end test that is skipped by default, which folks can run manually?

btw this go library is a great way to work with all the different cloud blob storage providers (GCS, S3, Azure, etc.) https://github.com/google/go-cloud via https://github.com/google/go-cloud/tree/master/blob using a simple URL scheme
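
For a concrete sense of that URL scheme, here is a minimal sketch using gocloud.dev/blob (the import path go-cloud's blob package is published under); the bucket URL, region, and object key are placeholders:

```go
package main

import (
	"context"
	"log"

	"gocloud.dev/blob"
	// Blank-import the drivers you need; each one registers its URL scheme.
	_ "gocloud.dev/blob/gcsblob" // gs://
	_ "gocloud.dev/blob/s3blob"  // s3://
)

func main() {
	ctx := context.Background()

	// The same code works for gs://, s3://, azblob://, etc. -- only the URL changes.
	bucket, err := blob.OpenBucket(ctx, "s3://my-artifact-bucket?region=us-east-1")
	if err != nil {
		log.Fatal(err)
	}
	defer bucket.Close()

	// Upload an artifact under a key...
	if err := bucket.WriteAll(ctx, "pipeline/artifact.tar.gz", []byte("example payload"), nil); err != nil {
		log.Fatal(err)
	}

	// ...and read it back.
	data, err := bucket.ReadAll(ctx, "pipeline/artifact.tar.gz")
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("read %d bytes", len(data))
}
```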

Sorry for the late reply. Assuming we use go-cloud, what would be the way to go? Right now there is a GCS storage resource, and this would duplicate it.

What do you think about keeping a single StorageResource, but expanding it to support multiple backing stores? I think that's in-line with what we were thinking for #778 as well.
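
To make that idea concrete, here is a hedged sketch of a single storage resource dispatching on a `type` param to pick its backing store; all names here are hypothetical, not the actual Tekton factory:

```go
package main

import (
	"errors"
	"fmt"
)

// storageBackend abstracts whatever upload/download steps a storage
// resource would generate for its backing store.
type storageBackend interface {
	Location() string
}

type gcsBackend struct{ location string }
type s3Backend struct{ location string }

func (g *gcsBackend) Location() string { return g.location }
func (s *s3Backend) Location() string  { return s.location }

// newStorageResource picks a backend based on the resource's "type" param,
// so a single "storage" resource kind can front multiple stores.
func newStorageResource(params map[string]string) (storageBackend, error) {
	loc, ok := params["location"]
	if !ok {
		return nil, errors.New("storage resource requires a location param")
	}
	switch params["type"] {
	case "gcs":
		return &gcsBackend{location: loc}, nil
	case "s3":
		return &s3Backend{location: loc}, nil
	default:
		return nil, fmt.Errorf("unsupported storage type %q", params["type"])
	}
}

func main() {
	b, err := newStorageResource(map[string]string{
		"type":     "s3",
		"location": "s3://my-bucket/artifacts",
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(b.Location())
}
```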

That does sound like a good idea @dlorenc - and maybe in that case it would make sense to try out go-cloud like @mustafaakin is suggesting? (I think I was initially opposed b/c I assumed there would be too many differences, but now I'm thinking I was wrong.)

As chance would have it, I had made a step and a small task/utility using go-cloud for S3 artifact diddling, so I reworked it into a Storage resource type: https://github.com/tektoncd/pipeline/pull/1258

Stale issues rot after 30d of inactivity.
Mark the issue as fresh with /remove-lifecycle rotten.
Rotten issues close after an additional 30d of inactivity.
If this issue is safe to close now please do so with /close.

/lifecycle rotten

Send feedback to tektoncd/plumbing.

Two possible ways to handle this:

  • Make it possible for workspaces to be backed by something other than k8s types (e.g. GCS, S3)
  • Create Tasks for this and consider it done

There is also the PipelineResource redesign in #1673

Given all of that it feels reasonable to me to close this for now.
