Is this a BUG REPORT or FEATURE REQUEST?: FEATURE REQUEST
It would be nice if Argo considered providing high-level tooling to help compose workflows.
Reading through the list of issues, I found the following related posts:
Would it be possible to provide libraries in other languages (e.g. Python) to help compose the yaml? Has this already been thought about? If yes, what would be the expected look and feel?
The approach we are taking with regard to workflow composability and support for other languages is:
1. Generate an OpenAPI spec for the workflow data types. This is already done and is available under api/openapi-spec/swagger.json (just the definitions, not the paths). With the OpenAPI spec, it should now be possible to generate clients/models in other languages.
2. Investigate ksonnet as the recommended tooling for composition and reusable workflow parts. To this end, I've been working with ksonnet to generate an argo libsonnet library from the OpenAPI spec. The idea is that an official argo libsonnet would be made available and versioned in an official capacity. Users would use this library, along with their own business-specific libsonnets, to build reusable workflow components.
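As a quick sanity check on the spec mentioned above, you can inspect which data types it defines. This is just a sketch; the helper name is mine, and the path is whatever location you've saved swagger.json to:

```python
import json

def list_workflow_types(spec_path):
    """Return the data-type names defined in a swagger 2.0 spec file."""
    with open(spec_path) as f:
        spec = json.load(f)
    # As noted above, this swagger.json carries only "definitions",
    # not "paths", so the definitions are the interesting part.
    return sorted(spec.get("definitions", {}))
```

Running this against api/openapi-spec/swagger.json should list the Argo workflow types that swagger-codegen will turn into model classes.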
Thanks Jesse.
Is it possible to use the current swagger.json to generate the YAML? Or is it intended only for making calls to the endpoints?
It should now be possible to generate the models/class definitions in any language using swagger-codegen and passing argo/api/openapi-spec/swagger.json as the input. Typically in swagger, these models would be used for making API calls, but I think they would also be useful in generating workflow JSON/YAML, since you would be able to construct the workflow object in a programmatic fashion. I think your mileage may vary depending on the language.
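A minimal sketch of the programmatic construction described above. It uses plain dicts rather than generated model classes (the class names swagger-codegen produces depend on the spec and language), and serializes with the standard library's json module since Argo accepts JSON manifests as well as YAML:

```python
import json

# Hello-world Workflow manifest built from plain Python structures;
# the generated swagger models would give you typed objects with
# the same shape.
workflow = {
    "apiVersion": "argoproj.io/v1alpha1",
    "kind": "Workflow",
    "metadata": {"generateName": "hello-world-"},
    "spec": {
        "entrypoint": "whalesay",
        "templates": [
            {
                "name": "whalesay",
                "container": {
                    "image": "docker/whalesay",
                    "command": ["cowsay"],
                    "args": ["hello world"],
                },
            }
        ],
    },
}

# json.dumps is enough to produce a submittable manifest; use a YAML
# library (e.g. PyYAML) if you prefer YAML output.
manifest = json.dumps(workflow, indent=2)
print(manifest)
```

The resulting manifest can be submitted with `kubectl create -f` or `argo submit` like any hand-written one.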
Thanks Jesse. I will give it a try.
Just a heads up for anyone attempting this. The argo/api/openapi-spec/swagger.json needs to be supplemented with the kubernetes/openapi-spec/swagger.json in order for swagger-codegen to generate usable models, because of all the k8s data types that argo re-uses in the workflow spec.
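One way to do that supplementing is to merge the missing k8s definitions into the Argo spec before invoking swagger-codegen. A minimal sketch, with the caveat that the helper and file paths are mine, not part of any official tooling:

```python
import json

def merge_specs(argo_path, k8s_path, out_path):
    """Copy any k8s definitions the Argo spec is missing into it."""
    with open(argo_path) as f:
        argo = json.load(f)
    with open(k8s_path) as f:
        k8s = json.load(f)
    merged = argo.setdefault("definitions", {})
    for name, schema in k8s.get("definitions", {}).items():
        # Keep Argo's own definitions; only fill in the missing k8s types.
        merged.setdefault(name, schema)
    with open(out_path, "w") as f:
        json.dump(argo, f, indent=2)
```

The merged file can then be passed as the single `-i` input to swagger-codegen.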
Does anyone have an example of building an Argo pipeline using the python libraries generated via openapi?
Can I ask if anyone is actively building the swagger types? I've been trying to build a usable Python setup to facilitate workflow generation for my team, which has been working with an increasingly brittle dockerized Airflow deployment, but I've been running into issues with missing specifications (notably the WorkflowStatus type). It looks like this user ran into the same thing and just added it to his build: https://github.com/edwardgeorge/argo/commit/5fb51d8c7c309dfaae66e268c51318db2a2df4c4
In any case, I love you guys. This project is exceedingly dope. At some point I'll try to contribute something more than pointing out a little-used type omission. I do think it'd be pretty easy to add some testing for the swagger build. At least in Python, this could have been caught trivially by running swagger-codegen and importing the top-level module.
Thanks!
I tried creating the Python client using swagger-codegen and supplied the k8s spec as well.
The repo is here: https://github.com/swiftdiaries/argo_client
I'm not sure about using it to create Workflows; I can't do import argo_client after installing it.
Maybe I'm doing something wrong?
I did this to create and install the client:
git clone https://github.com/swiftdiaries/argo_client
cd argo_client
java -jar swagger-codegen-cli.jar generate \
-i argo_swagger.json \
-i k8s_swagger.json \
-l python \
-o /var/tmp/argo_client \
-DpackageName=argo_client
cd /var/tmp/argo_client
python setup.py install
python -c 'import argo_client'
Looks like this discussion concluded with the above information.