When I use the global variable workflow.name with spaces inside the double curly braces, it does not get evaluated and is injected as a literal string.
What Kubernetes provider are you using?
What version of Argo Workflows are you running?
v2.7.6
metadata:
  name: lovely-bear
spec:
  entrypoint: whalesay
  templates:
    - name: whalesay
      container:
        name: main
        image: 'docker/whalesay:latest'
        command:
          - env
        env:
          - name: WORKFLOW_NAME
            value: "{{workflow.name}}"
          - name: WORKFLOW_NAME_WITH_SPACE
            value: "{{ workflow.name }}"
Paste the logs from the workflow controller:
$ argo logs lovely-bear
lovely-bear: 2020-11-16T15:21:31.834402501Z PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
lovely-bear: 2020-11-16T15:21:31.834447201Z HOSTNAME=lovely-bear
lovely-bear: 2020-11-16T15:21:31.834451901Z WORKFLOW_NAME=lovely-bear
lovely-bear: 2020-11-16T15:21:31.834455001Z WORKFLOW_NAME_WITH_SPACE={{ workflow.name }}
...
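The unresolved spaced tag in the logs above is consistent with a substitution pass that matches tag bodies verbatim, without trimming the padding inside the braces. A minimal sketch of that behavior in Python (a hypothetical re-implementation for illustration, not Argo's actual code):

```python
import re

# Hypothetical strict substitution: the tag body must match a parameter
# key exactly, so "{{ workflow.name }}" (with padding) is left untouched.
def substitute_strict(text, params):
    return re.sub(
        r"\{\{([^{}]+)\}\}",
        lambda m: params.get(m.group(1), m.group(0)),
        text,
    )

params = {"workflow.name": "lovely-bear"}
print(substitute_strict("{{workflow.name}}", params))    # lovely-bear
print(substitute_strict("{{ workflow.name }}", params))  # {{ workflow.name }}
```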
Message from the maintainers:
Impacted by this bug? Give it a 👍. We prioritise the issues with the most 👍.
https://github.com/argoproj/argo/issues/4484
This seems to work correctly, although it is the second time this has been mentioned. I see you're using a very old version (2.7); can you upgrade to the latest and see if it works?
Hey @simster7, thanks for replying. I tried with v2.11.7 and it did not evaluate the variable with spaces.
PS: I'd like to work on it once it's decided that it's a bug.
I am still unable to reproduce:
$ argo submit env.yaml
Name: lovely-bear-bnx79
Namespace: argo
ServiceAccount: default
Status: Pending
Created: Mon Nov 16 14:53:32 -0600 (now)
Progress:
$ kubectl logs lovely-bear-bnx79 | grep "WORKFLOW_NAME"
WORKFLOW_NAME=lovely-bear-bnx79
WORKFLOW_NAME_WITH_SPACE=lovely-bear-bnx79
Not sure why. Are you using Helm at all? Can you tell me more about your environment?
Argo v2.7.6 is deployed using a Helm chart. I used https://argoproj.github.io/argo/quick-start/#install-argo-workflows to install on a local k8s cluster, which is running v2.11.7.
Can you please try with the following workflow?
metadata:
  name: wonderful-bear
  namespace: argo
  labels:
    example: 'true'
spec:
  entrypoint: argosay
  templates:
    - name: argosay
      container:
        name: main
        image: docker/whalesay:latest
        command:
          - env
        env:
          - name: WF
            value: "{{workflow.name}}"
          - name: WF_S
            value: "{{ workflow.name }}"
  ttlStrategy:
    secondsAfterCompletion: 300
  podGC:
    strategy: OnPodCompletion
Output is

Take a look at https://github.com/argoproj/argo/issues/2430#issuecomment-632862338 for a Helm workaround if you submit Workflows with Helm charts.
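For anyone hitting the Helm conflict: Helm also treats `{{ ... }}` as its own template syntax and will try to resolve Argo's tags at chart render time. One common escape (a sketch of the idea; see the linked comment for the exact workaround) is to have Helm emit the braces as a literal string:

```yaml
# Hypothetical Helm chart fragment: Helm renders the quoted string,
# leaving the literal {{workflow.name}} tag for Argo to resolve later.
env:
  - name: WORKFLOW_NAME
    value: '{{ "{{workflow.name}}" }}'
```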
Can you please try with the following workflow?
$ cat env.yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  name: wonderful-bear
  namespace: argo
  labels:
    example: 'true'
spec:
  entrypoint: argosay
  templates:
    - name: argosay
      container:
        name: main
        image: docker/whalesay:latest
        command:
          - env
        env:
          - name: WF
            value: "{{workflow.name}}"
          - name: WF_S
            value: "{{ workflow.name }}"
  ttlStrategy:
    secondsAfterCompletion: 300
$ ./dist/argo submit env.yaml
Name: wonderful-bear
Namespace: argo
ServiceAccount: default
Status: Pending
Created: Mon Nov 16 18:57:19 -0600 (now)
Progress:
$ pl wonderful-bear | grep WF
WF=wonderful-bear
WF_S=wonderful-bear
@simster7 sorry, I forgot to mention that I am using the Argo UI to submit the workflow. I tried your workflow, but I still get the same thing:
HOSTNAME=wonderful-bear
WF=wonderful-bear
WF_S={{ workflow.name }}
Take a look at #2430 (comment) for a helm workaround if you submit Workflows with helm charts.
Thanks for the link. I use this for another workflow that uses a Helm chart.
I forgot to mention that I am using the Argo UI to submit the workflow
Good point, but still the same result
Ha, the only thing that's different is the version. I am using v2.11.7. Do you happen to know in which version it was solved? I tried it again and saw the same result.
You're right! It doesn't work in 2.11.7. I'll try to find the commit that fixes this so I can backport it to 2.11.next.
I identified the PR responsible for the fix: #4310. It was merged while I was OOO, so I wasn't aware of it. I have backported it to 2.11 and it will be in the next release.
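For reference, the observable effect of the fix is that whitespace inside the braces no longer matters. A short sketch of that behavior in Python (hypothetical, not the actual code from the linked PR):

```python
import re

# Hypothetical trimmed substitution: padding inside the braces is stripped
# before the parameter lookup, so "{{ key }}" resolves like "{{key}}".
def substitute_trimmed(text, params):
    return re.sub(
        r"\{\{([^{}]+)\}\}",
        lambda m: params.get(m.group(1).strip(), m.group(0)),
        text,
    )

print(substitute_trimmed("{{ workflow.name }}", {"workflow.name": "lovely-bear"}))
# lovely-bear
```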
Thanks @simster7