I'm using the standard ansible/ansible-runner image with ansible version 2.9.
I'm following the guide to test a new container group. The jobs spawn in Kubernetes and there do not appear to be any connection issues; rather, the problem is a configuration issue with the private_data_dir that I'm trying to figure out.
When the ping ad-hoc job is launched, I see the following output in the awx-task container:
2020-12-18 15:07:54,919 INFO awx.isolated.manager.playbooks ansible-playbook 2.9.15
config file = /etc/ansible/ansible.cfg
configured module search path = ['/var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/plugins/isolated']
ansible python module location = /usr/lib/python3.6/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 3.6.8 (default, Aug 24 2020, 17:57:11) [GCC 8.3.1 20191121 (Red Hat 8.3.1-5)]
Using /etc/ansible/ansible.cfg as config file
setting up inventory plugins
host_list declined parsing /tmp/awx_8_5rlrajqc/run_isolated.ymlloi9lk56/inventory/hosts.json as it did not pass its verify_file() method
script declined parsing /tmp/awx_8_5rlrajqc/run_isolated.ymlloi9lk56/inventory/hosts.json as it did not pass its verify_file() method
auto declined parsing /tmp/awx_8_5rlrajqc/run_isolated.ymlloi9lk56/inventory/hosts.json as it did not pass its verify_file() method
Parsed /tmp/awx_8_5rlrajqc/run_isolated.ymlloi9lk56/inventory/hosts.json inventory source with yaml plugin
Loading callback plugin awx_display of type stdout, v2.0 from /var/lib/awx/venv/awx/lib/python3.6/site-packages/ansible_runner/callbacks/awx_display.py
PLAYBOOK: run_isolated.yml *****************************************************
Positional arguments: run_isolated.yml
verbosity: 4
connection: smart
timeout: 10
become_method: sudo
tags: ('all',)
inventory: ('/tmp/awx_8_5rlrajqc/run_isolated.ymlloi9lk56/inventory/hosts.json',)
extra_vars: ('@/tmp/awx_8_5rlrajqc/run_isolated.ymlloi9lk56/env/extravars',)
forks: 5
1 plays in run_isolated.yml
PLAY [Prepare data, dispatch job in isolated environment.] *********************
META: ran handlers
The important bit is /tmp/awx_8_5rlrajqc/run_isolated.ymlloi9lk56/env/extravars, where we can see that the format of the private_data_dir is /tmp/<job-id><suffix>/<playbook-name><suffix2>/*.
The data is copied by the "synchronize job environment with remote job container" task to the various directories listed here:
"stdout_lines": [
"cd+++++++++ awx_8_5rlrajqc/",
"<f+++++++++ awx_8_5rlrajqc/.rsync-filter",
"<f+++++++++ awx_8_5rlrajqc/inventory",
"cd+++++++++ awx_8_5rlrajqc/cp/",
"cd+++++++++ awx_8_5rlrajqc/env/",
"<f+++++++++ awx_8_5rlrajqc/env/cmdline",
"<f+++++++++ awx_8_5rlrajqc/env/envvars",
"<f+++++++++ awx_8_5rlrajqc/env/extravars",
"<f+++++++++ awx_8_5rlrajqc/env/passwords",
"<f+++++++++ awx_8_5rlrajqc/env/settings",
"cd+++++++++ awx_8_5rlrajqc/project/",
"cd+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/",
"cd+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/",
"cd+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/command",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/stdout",
"cd+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/fact_cache/",
"cd+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/1-81c4fbcb-cd0d-49c4-8287-e55b5e156a90.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/10-f9bcd7cb-96ae-4598-9ac2-0b3e4de79920.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/11-0396c660-a222-4429-8673-882e3e010c4c.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/12-544ab9cc-4412-4920-b513-6d2b87cf2913.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/13-a5ef728e-e6f7-4f30-b370-44d08f57ee8e.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/14-d6ca5e1b-d1e5-481c-836f-1c7c4ca34dea.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/15-a5786df1-8d64-4171-8e02-6a72954c8d01.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/16-b66493b5-f618-a0c9-a41a-000000000006.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/17-ac30680f-c33a-437c-9ace-6764948a1e9b.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/18-b66493b5-f618-a0c9-a41a-000000000008.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/19-a2238757-9276-44f1-ab66-8accb1b5ca01.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/2-6a5df8d1-9b5e-46b5-ae07-c4e33eafc018.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/20-ebcff29a-e19b-466b-8d68-1126aa41c933.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/21-b66493b5-f618-a0c9-a41a-000000000009.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/22-1ed999f5-2867-4f32-a086-fb61bf3bf434.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/23-bd56c278-92e9-4c18-94ad-b27f0c1760fb.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/24-0d0c76cc-7342-4875-bca9-e4280314955a.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/25-9b729a26-88d7-4eba-8876-f460d58a6692.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/26-1b4aebbb-658a-4062-bf28-0d7de2917fb3.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/27-40a1816e-ba9f-45f8-95c8-f56304480934.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/28-c7de8856-4bb4-4696-ae44-83d4e8dd887c.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/29-5704e339-935f-46d4-81f5-bad2176aa921.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/3-539216a8-0436-43bf-ab18-9843149a1e0f.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/4-b0c741ce-2eeb-409a-bebf-1bf68d940385.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/5-6d48417a-3313-479d-a84f-bfeab0fc2f13.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/6-d085bc0f-f3d1-4aed-812d-82ff75a35a11.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/7-230643c1-d7c8-43de-bafa-87e32cc68389.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/8-c9cdd591-3b7d-4f31-b1ce-70fbcbdb3712.json",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/job_events/9-43a5b1ea-7144-4e62-aecd-96deb3600aca.json",
"cd+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/env/",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/env/envvars",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/env/extravars",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/env/settings",
"cd+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/inventory/",
"<f+++++++++ awx_8_5rlrajqc/run_isolated.ymlloi9lk56/inventory/hochanged: [awx-job-8] => {
Here we see there are actually 2 directories:

/tmp/<job-id><suffix>/*
/tmp/<job-id><suffix>/<playbook-name><suffix2>/artifacts/*

Once the job is launched into the awx-job-8 Pod in container awx-job-8 it runs ansible-runner start /tmp/awx_8_5rlrajqc -m ping -a '' -i 8 (my expected ad-hoc command); see the output from awx-task:
changed: [awx-job-8] => {
"ansible_facts": {
"discovered_interpreter_python": "/usr/bin/python"
},
"changed": true,
"cmd": [
"ansible-runner",
"start",
"/tmp/awx_8_5rlrajqc",
"-m",
"ping",
"-a",
"",
"-i",
"8"
],
"delta": "0:00:01.007437",
"end": "2020-12-18 15:07:53.553295",
"invocation": {
"module_args": {
"_raw_params": "ansible-runner start /tmp/awx_8_5rlrajqc -m ping -a '' -i 8",
"_uses_shell": false,
"argv": null,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"stdin_add_newline": true,
"strip_empty_ends": true,
"warn": true
}
},
"rc": 0,
"start": "2020-12-18 15:07:52.545858",
"stderr": "",
"stderr_lines": [],
"stdout": "",
"stdout_lines": []
}
The problem comes during the check_isolated.yml playbook. Here's the full output:
2020-12-18 15:08:25,148 DEBUG awx.isolated.manager Checking on isolated job 8 with `check_isolated.yml`.
2020-12-18 15:08:33,635 INFO awx.isolated.manager.playbooks ansible-playbook 2.9.15
config file = /etc/ansible/ansible.cfg
configured module search path = ['/var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/plugins/isolated']
ansible python module location = /usr/lib/python3.6/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 3.6.8 (default, Aug 24 2020, 17:57:11) [GCC 8.3.1 20191121 (Red Hat 8.3.1-5)]
Using /etc/ansible/ansible.cfg as config file
setting up inventory plugins
host_list declined parsing /tmp/awx_8_5rlrajqc/check_isolated.ymlrzokm6du/inventory/hosts.json as it did not pass its verify_file() method
script declined parsing /tmp/awx_8_5rlrajqc/check_isolated.ymlrzokm6du/inventory/hosts.json as it did not pass its verify_file() method
auto declined parsing /tmp/awx_8_5rlrajqc/check_isolated.ymlrzokm6du/inventory/hosts.json as it did not pass its verify_file() method
Parsed /tmp/awx_8_5rlrajqc/check_isolated.ymlrzokm6du/inventory/hosts.json inventory source with yaml plugin
Loading callback plugin awx_display of type stdout, v2.0 from /var/lib/awx/venv/awx/lib/python3.6/site-packages/ansible_runner/callbacks/awx_display.py
PLAYBOOK: check_isolated.yml ***************************************************
Positional arguments: check_isolated.yml
verbosity: 4
connection: smart
timeout: 10
become_method: sudo
tags: ('all',)
inventory: ('/tmp/awx_8_5rlrajqc/check_isolated.ymlrzokm6du/inventory/hosts.json',)
extra_vars: ('@/tmp/awx_8_5rlrajqc/check_isolated.ymlrzokm6du/env/extravars',)
forks: 5
1 plays in check_isolated.yml
PLAY [Poll for status of active job.] ******************************************
META: ran handlers
TASK [Determine if daemon process is alive.] ***********************************
task path: /var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/playbooks/check_isolated.yml:13
<awx-job-8> ESTABLISH kubectl CONNECTION
<awx-job-8> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-8', '--', '/bin/sh', '-c', "/bin/sh -c 'echo ~ && sleep 0'"]
<awx-job-8> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-8', '--', '/bin/sh', '-c', '/bin/sh -c \'( umask 77 && mkdir -p "` echo /runner/.ansible/tmp `"&& mkdir "` echo /runner/.ansible/tmp/ansible-tmp-1608304107.185417-424860-109115722862788 `" && echo ansible-tmp-1608304107.185417-424860-109115722862788="` echo /runner/.ansible/tmp/ansible-tmp-1608304107.185417-424860-109115722862788 `" ) && sleep 0\'']
<awx-job-8> Attempting python interpreter discovery
<awx-job-8> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-8', '--', '/bin/sh', '-c', '/bin/sh -c \'echo PLATFORM; uname; echo FOUND; command -v \'"\'"\'/usr/bin/python\'"\'"\'; command -v \'"\'"\'python3.7\'"\'"\'; command -v \'"\'"\'python3.6\'"\'"\'; command -v \'"\'"\'python3.5\'"\'"\'; command -v \'"\'"\'python2.7\'"\'"\'; command -v \'"\'"\'python2.6\'"\'"\'; command -v \'"\'"\'/usr/libexec/platform-python\'"\'"\'; command -v \'"\'"\'/usr/bin/python3\'"\'"\'; command -v \'"\'"\'python\'"\'"\'; echo ENDFOUND && sleep 0\'']
<awx-job-8> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-8', '--', '/bin/sh', '-c', "/bin/sh -c '/usr/bin/python && sleep 0'"]
Using module file /usr/lib/python3.6/site-packages/ansible/modules/commands/command.py
<awx-job-8> PUT /var/lib/awx/.ansible/tmp/ansible-local-4248548x7b4jg9/tmpeq6p9zxu TO /runner/.ansible/tmp/ansible-tmp-1608304107.185417-424860-109115722862788/AnsiballZ_command.py
<awx-job-8> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-8', '--', '/bin/sh', '-c', "/bin/sh -c 'chmod u+x /runner/.ansible/tmp/ansible-tmp-1608304107.185417-424860-109115722862788/ /runner/.ansible/tmp/ansible-tmp-1608304107.185417-424860-109115722862788/AnsiballZ_command.py && sleep 0'"]
<awx-job-8> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-8', '--', '/bin/sh', '-c', "/bin/sh -c '/usr/bin/python /runner/.ansible/tmp/ansible-tmp-1608304107.185417-424860-109115722862788/AnsiballZ_command.py && sleep 0'"]
<awx-job-8> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-8', '--', '/bin/sh', '-c', "/bin/sh -c 'rm -f -r /runner/.ansible/tmp/ansible-tmp-1608304107.185417-424860-109115722862788/ > /dev/null 2>&1 && sleep 0'"]
[DEPRECATION WARNING]: Distribution centos 8.1.1911 on host awx-job-8 should
use /usr/libexec/platform-python, but is using /usr/bin/python for backward
compatibility with prior Ansible releases. A future Ansible release will
default to using the discovered platform python for this host. See https://docs
.ansible.com/ansible/2.9/reference_appendices/interpreter_discovery.html for
more information. This feature will be removed in version 2.12. Deprecation
warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
fatal: [awx-job-8]: FAILED! => {
"ansible_facts": {
"discovered_interpreter_python": "/usr/bin/python"
},
"changed": true,
"cmd": "ansible-runner is-alive /tmp/awx_8_5rlrajqc",
"delta": "0:00:00.512402",
"end": "2020-12-18 15:08:30.890604",
"invocation": {
"module_args": {
"_raw_params": "ansible-runner is-alive /tmp/awx_8_5rlrajqc",
"_uses_shell": true,
"argv": null,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"stdin_add_newline": true,
"strip_empty_ends": true,
"warn": true
}
},
"msg": "non-zero return code",
"rc": 1,
"start": "2020-12-18 15:08:30.378202",
"stderr": "",
"stderr_lines": [],
"stdout": "",
"stdout_lines": []
}
...ignoring
TASK [Copy artifacts from the isolated host.] **********************************
task path: /var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/playbooks/check_isolated.yml:18
skipping: [awx-job-8] => {
"changed": false,
"skip_reason": "Conditional result was False"
}
TASK [Copy daemon log from the isolated host] **********************************
task path: /var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/playbooks/check_isolated.yml:27
skipping: [awx-job-8] => {
"changed": false,
"skip_reason": "Conditional result was False"
}
TASK [Copy artifacts from pod] *************************************************
task path: /var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/playbooks/check_isolated.yml:34
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: awx
<localhost> EXEC /bin/sh -c 'echo ~awx && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /var/lib/awx/.ansible/tmp `"&& mkdir "` echo /var/lib/awx/.ansible/tmp/ansible-tmp-1608304112.3415-425018-255686707266361 `" && echo ansible-tmp-1608304112.3415-425018-255686707266361="` echo /var/lib/awx/.ansible/tmp/ansible-tmp-1608304112.3415-425018-255686707266361 `" ) && sleep 0'
Using module file /var/lib/awx/vendor/awx_ansible_collections/ansible_collections/ansible/posix/plugins/modules/synchronize.py
<localhost> PUT /var/lib/awx/.ansible/tmp/ansible-local-4248548x7b4jg9/tmpid6kpo1c TO /var/lib/awx/.ansible/tmp/ansible-tmp-1608304112.3415-425018-255686707266361/AnsiballZ_synchronize.py
<localhost> EXEC /bin/sh -c 'chmod u+x /var/lib/awx/.ansible/tmp/ansible-tmp-1608304112.3415-425018-255686707266361/ /var/lib/awx/.ansible/tmp/ansible-tmp-1608304112.3415-425018-255686707266361/AnsiballZ_synchronize.py && sleep 0'
<localhost> EXEC /bin/sh -c 'RSH='"'"'oc rsh --config=/tmp/awx_8_5rlrajqc/.kubeconfigj_evcfq8'"'"' /usr/bin/python3.6 /var/lib/awx/.ansible/tmp/ansible-tmp-1608304112.3415-425018-255686707266361/AnsiballZ_synchronize.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /var/lib/awx/.ansible/tmp/ansible-tmp-1608304112.3415-425018-255686707266361/ > /dev/null 2>&1 && sleep 0'
fatal: [awx-job-8]: FAILED! => {
"changed": false,
"cmd": "/usr/bin/rsync --delay-updates -F --compress --delete-after --archive --blocking-io --rsh=$RSH --out-format=<<CHANGED>>%i %n%L awx-job-8:/tmp/awx_8_5rlrajqc/artifacts/ /tmp/awx_8_5rlrajqc/artifacts/",
"invocation": {
"module_args": {
"_local_rsync_password": null,
"_local_rsync_path": "rsync",
"_substitute_controller": false,
"archive": true,
"checksum": false,
"compress": true,
"copy_links": false,
"delete": true,
"dest": "/tmp/awx_8_5rlrajqc/artifacts/",
"dest_port": null,
"dirs": false,
"existing_only": false,
"group": null,
"link_dest": null,
"links": null,
"mode": "pull",
"owner": null,
"partial": false,
"perms": null,
"private_key": null,
"recursive": true,
"rsync_opts": [
"--blocking-io",
"--rsh=$RSH"
],
"rsync_path": null,
"rsync_timeout": 0,
"set_remote_user": false,
"src": "awx-job-8:/tmp/awx_8_5rlrajqc/artifacts/",
"ssh_args": null,
"times": null,
"verify_host": false
}
},
"msg": "rsync: change_dir \"/tmp/awx_8_5rlrajqc/artifacts\" failed: No such file or directory (2)\ncommand terminated with exit code 23\nrsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1659) [Receiver=3.1.3]\nrsync: [Receiver] write error: Broken pipe (32)\n",
"rc": 23
}
PLAY RECAP *********************************************************************
awx-job-8 : ok=1 changed=1 unreachable=0 failed=1 skipped=2 rescued=0 ignored=1
The important bits:
- ansible-runner is-alive fails with rc 1 but produces nothing on stdout nor stderr, even when run manually with --debug (a sketch of why it can be this silent follows the console output):
bash-4.4# ansible-runner --debug is-alive /tmp/awx_8_5rlrajqc
starting debug logging
bash-4.4# ansible-runner --debug -vvvvv is-alive /tmp/awx_8_5rlrajqc
starting debug logging
bash-4.4# ansible-runner --version
1.4.6
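For context: here is my understanding of why is-alive can fail this silently, paraphrased from my reading of ansible_runner/__main__.py in 1.4.x (the names and structure below are assumptions, not verbatim source). It reads a pid file at the top of the private_data_dir and probes that process; if the pid file was never written, it exits 1 without printing anything:

```python
# Paraphrase of the `is-alive` branch of ansible_runner/__main__.py (1.4.x),
# based on my reading of the source -- treat names and details as assumptions.
import os
import signal
import sys

def is_alive(private_data_dir):
    pidfile = os.path.join(private_data_dir, 'pid')
    try:
        with open(pidfile, 'r') as f:
            pid = int(f.readline())
    except IOError:
        sys.exit(1)  # no pid file at all: exit 1 with no output whatsoever
    try:
        os.kill(pid, signal.SIG_DFL)  # effectively signal 0: probe, don't signal
        sys.exit(0)
    except OSError:
        sys.exit(1)  # pid file exists but process is gone: same silent rc 1
```

Consistent with this, the ls -la listing below shows no pid file at the top of /tmp/awx_8_5rlrajqc/, so the silent rc 1 would be a symptom rather than the root cause.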
- /tmp/awx_8_5rlrajqc/artifacts does not exist in the container. This is TRUE!
bash-4.4# ls -la /tmp/awx_8_5rlrajqc/
total 32
drwx------ 6 1000 root 4096 Dec 18 15:07 .
drwxrwxrwt 1 root root 4096 Dec 18 16:05 ..
drwx------ 2 1000 root 4096 Dec 18 15:07 cp
-rw------- 1 root root 0 Dec 18 15:07 daemon.log
drwx------ 2 1000 root 4096 Dec 18 15:07 env
-rwx------ 1 1000 root 222 Dec 18 15:07 inventory
drwxr-xr-x 2 1000 root 4096 Dec 18 15:07 project
-r-------- 1 1000 root 104 Dec 18 15:07 .rsync-filter
drwx------ 5 1000 root 4096 Dec 18 15:07 run_isolated.ymlloi9lk56
bash-4.4# ls -la /tmp/awx_8_5rlrajqc/run_isolated.ymlloi9lk56/
total 20
drwx------ 5 1000 root 4096 Dec 18 15:07 .
drwx------ 6 1000 root 4096 Dec 18 15:07 ..
drwx------ 3 1000 root 4096 Dec 18 15:07 artifacts
drwx------ 2 1000 root 4096 Dec 18 15:07 env
drwx------ 2 1000 root 4096 Dec 18 15:07 inventory
bash-4.4# ls -la /tmp/awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/
total 12
drwx------ 3 1000 root 4096 Dec 18 15:07 .
drwx------ 5 1000 root 4096 Dec 18 15:07 ..
drwx------ 4 1000 root 4096 Dec 18 15:07 ed84a7af-41dd-424e-a0e2-c52f1fdf1e86
bash-4.4# ls -la /tmp/awx_8_5rlrajqc/run_isolated.ymlloi9lk56/artifacts/ed84a7af-41dd-424e-a0e2-c52f1fdf1e86/
total 28
drwx------ 4 1000 root 4096 Dec 18 15:07 .
drwx------ 3 1000 root 4096 Dec 18 15:07 ..
-rw------- 1 1000 root 5642 Dec 18 15:07 command
drwxr-xr-x 2 1000 root 4096 Dec 18 15:07 fact_cache
drwx------ 2 1000 root 4096 Dec 18 15:07 job_events
-rw------- 1 1000 root 3364 Dec 18 15:07 stdout
- /tmp/<job-id><suffix>/artifacts/* does not exist <-- I think this is a bug, but I may be wrong
- /tmp/<job-id><suffix>/<playbook-name><suffix2>/artifacts/* does exist <-- maybe this is the correct location?

I've tracked down the following sources of code that could potentially be useful:
- run_management_playbook happens and the private_data_dir is used to create a _new_ temp data dir with prefix <playbook-name>. This seems to match the 2nd dir, /tmp/<job-id><suffix>/<playbook-name><suffix2>/artifacts (a sketch follows the listing below).
- isolated_manager_instance.run() is called with the original private_data_dir supplied.
- private_data_dir is initially created: https://github.com/ansible/awx/blob/devel/awx/main/tasks.py#L890 (although I think this is the dir inside of awx-task, not inside the awx-job-8 container).
- The check_isolated.yml playbook seems blissfully unaware of the true private_data_dir and just executes on the given {{src}} variable (which I have been unable to track down; it must come from https://github.com/ansible/awx/blob/devel/awx/main/isolated/manager.py#L223 or something in this execution path).
- run_isolated.yml is started the same way as check_isolated.yml (https://github.com/ansible/awx/blob/devel/awx/main/isolated/manager.py#L180), so I would assume that private_data_dir would be the same... but I guess not. Perhaps this is a problem with the initial rsync or with the ansible-runner command?

One thing I've noticed is that there is no original artifacts/ directory for rsync to copy over from the awx-task container's /tmp/<job-id><suffix>/ directory:
bash-4.4$ ls -la
total 548
drwx------ 132 awx root 12288 Dec 18 16:29 .
drwxrwxrwt 1 root root 4096 Dec 18 16:29 ..
drwx------ 5 awx root 4096 Dec 18 15:41 check_isolated.yml00xj0xu0
drwx------ 5 awx root 4096 Dec 18 15:55 check_isolated.yml04w4_clu
... # omitted 30+ lines of check_isolated.yml<suffix> directories
drwx------ 2 awx root 4096 Dec 18 15:07 cp
drwx------ 2 awx root 4096 Dec 18 15:07 env
-rwx------ 1 awx root 222 Dec 18 15:07 inventory
-rwx------ 1 awx root 2761 Dec 18 15:07 .kubeconfigj_evcfq8
drwxr-xr-x 2 awx root 4096 Dec 18 15:07 project
-r-------- 1 awx root 104 Dec 18 15:07 .rsync-filter
drwx------ 5 awx root 4096 Dec 18 15:07 run_isolated.ymlloi9lk56
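This nesting matches the first bullet above. Here is roughly what AWX's run_management_playbook does, paraphrased from my reading of awx/main/isolated/manager.py on devel (names and details are assumptions, not verbatim source):

```python
# A paraphrase, from my reading of run_management_playbook() in
# awx/main/isolated/manager.py (devel branch) -- names/details are assumptions.
import tempfile
import ansible_runner

def run_management_playbook(playbook, private_data_dir, runner_params):
    # A *new* temp dir, prefixed with the playbook name, is created *inside*
    # the job's private_data_dir, e.g.
    #   /tmp/awx_8_5rlrajqc  +  'run_isolated.yml'
    #   -> /tmp/awx_8_5rlrajqc/run_isolated.ymlloi9lk56
    iso_dir = tempfile.mkdtemp(prefix=playbook, dir=private_data_dir)
    params = dict(runner_params)
    params['playbook'] = playbook
    # ansible-runner treats iso_dir as its own private_data_dir, so env/,
    # inventory/ and artifacts/ for this *management* run all land under it.
    params['private_data_dir'] = iso_dir
    return ansible_runner.run(**params)
```

So every management run (run_isolated.yml, each check_isolated.yml poll) gets its own <playbook-name><suffix2> directory nested inside the job's private_data_dir; the top-level /tmp/<job-id><suffix>/artifacts directory, by contrast, appears to only get created by the ansible-runner invocation inside the pod.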
Running the initial ad-hoc ansible-runner command printed in the output of run_isolated.yml does not print anything and does not seem to spawn any processes. This is run in the awx-job-8 container, in the /tmp/<job-id><suffix> directory:
bash-4.4# ansible-runner --debug -vvvvv start /tmp/awx_8_5rlrajqc -m ping -a '' -i 8
starting debug logging
bash-4.4# echo $?
0
bash-4.4# ps aux
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
root 1 0.0 0.0 4340 792 pts/0 Ss 15:07 0:00 tini -- sleep infinity
root 12 0.0 0.0 25956 1384 pts/0 S+ 15:07 0:00 /usr/bin/coreutils --coreutils-prog-shebang=sleep /usr/bin/sleep infinity
root 331 0.0 0.0 15052 3880 pts/1 Ss 15:10 0:00 bash
root 9144 0.0 0.0 46884 3724 pts/1 R+ 16:36 0:00 ps aux
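I believe the silence is expected from start itself, whatever happens afterwards: it daemonizes, and the invoking process returns 0 immediately. Roughly, again paraphrased from my reading of ansible_runner/__main__.py in 1.4.x (treat the names and details as assumptions):

```python
# Paraphrase of the `start` branch of ansible_runner/__main__.py (1.4.x),
# based on my reading of the source -- treat names and details as assumptions.
import os
import daemon  # python-daemon
from daemon.pidfile import TimeoutPIDLockFile
import ansible_runner

def start(private_data_dir, **run_options):
    pidfile = TimeoutPIDLockFile(os.path.join(private_data_dir, 'pid'))
    # DaemonContext detaches: the invoking process exits 0 right away, so the
    # shell sees rc 0 and no stdout/stderr no matter what happens next.
    with daemon.DaemonContext(pidfile=pidfile):
        # Everything here runs in the detached child. If that child dies for
        # any reason, nothing is ever reported back to the caller.
        ansible_runner.run(private_data_dir=private_data_dir, **run_options)
```

So the rc 0 and empty output from start tell us nothing about whether the detached child survived; ps aux showing no runner process is the real signal here.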
I'm confirming that we're using the correct version of ansible-runner in the awx-job-8 container running the ansible/ansible-runner:latest image:
See: https://github.com/ansible/awx/blob/devel/requirements/requirements.txt#L4
Version: 1.4.6
Output from the awx-job-8 container:
bash-4.4# ansible-runner --version
1.4.6
Today I tried manually running ansible-runner run instead of ansible-runner start with the same args inside the awx-job-8 container while the job was still "running", and I got the following output:
bash-4.4# ansible-runner run /tmp/awx_10_ewx2h344 -m ping -a '' -i 10 --debug -vvv
starting debug logging
file path is /tmp/awx_10_ewx2h344/env/passwords
cache miss, attempting to load file from disk: /tmp/awx_10_ewx2h344/env/passwords
file path is /tmp/awx_10_ewx2h344/env/envvars
cache miss, attempting to load file from disk: /tmp/awx_10_ewx2h344/env/envvars
file path is /tmp/awx_10_ewx2h344/env/settings
cache miss, attempting to load file from disk: /tmp/awx_10_ewx2h344/env/settings
file path is /tmp/awx_10_ewx2h344/env/ssh_key
cache miss, attempting to load file from disk: /tmp/awx_10_ewx2h344/env/ssh_key
specified path does not exist /tmp/awx_10_ewx2h344/env/ssh_key
Not loading ssh key
file path is /tmp/awx_10_ewx2h344/args
cache miss, attempting to load file from disk: /tmp/awx_10_ewx2h344/args
specified path does not exist /tmp/awx_10_ewx2h344/args
file path is /tmp/awx_10_ewx2h344/env/cmdline
cache miss, attempting to load file from disk: /tmp/awx_10_ewx2h344/env/cmdline
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/c1db96e5-1566-47ae-a543-a8361577e66b-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/0f9791a8-f608-4058-8991-4bd69a143257-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/5502c9e2-d2b3-4cf8-8aeb-3dd15285c3a6-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/9b01deb6-7f8d-4d0c-9b31-4c0daa696e29-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/0e76d8bf-3ddd-4709-9b4d-5703a65c7974-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/e1079f1b-1120-45c6-bdd7-8a156e1acf29-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/561c7351-202a-4d9a-a627-fc23498a371d-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/fb732a0c-1f8f-4efb-81bf-66fb57d3628c-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/44958c36-75f0-48d9-b750-fe514f69abff-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/d8bf934a-0fff-4ed3-9cc7-f6c041813de6-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/d436d6b2-3591-4148-8e11-ff5e4e557bf7-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/9c32d463-b064-4ac5-a325-f0a3d9bfe344-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/4eea1f5f-f541-4215-8174-481151b4310a-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/573eba8e-6850-484a-b6a2-558678586410-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/569d2db1-6827-4352-936c-19cdcd8106e3-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/bc3160d3-c22e-4e78-82e5-e69b8ddbc6c1-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/ceadb52d-5f98-40ac-b405-a28749d5945d-partial.json
Failed to open ansible stdout callback plugin partial data file /tmp/awx_10_ewx2h344/artifacts/10/job_events/4015a98a-e6e7-40a5-8d3a-f7d0517565e6-partial.json
bash-4.4# command terminated with exit code 137
So it looks like ansible-runner _does_ work with run but not with start. After that, the /tmp/<job-id><suffix>/artifacts directory was populated, the "Copy artifacts from pod" task succeeded in the awx-task container, and then the clean_isolated.yml playbook ran and the awx-job-8 Pod was cleaned up (that's the exit code 137 in the output above).
Here's what awx-task showed for the successful check_isolated.yml:
2020-12-21 14:42:41,063 DEBUG awx.isolated.manager Checking on isolated job 10 with `check_isolated.yml`.
2020-12-21 14:42:52,253 INFO awx.isolated.manager.playbooks ansible-playbook 2.9.15
config file = /etc/ansible/ansible.cfg
configured module search path = ['/var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/plugins/isolated']
ansible python module location = /usr/lib/python3.6/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 3.6.8 (default, Aug 24 2020, 17:57:11) [GCC 8.3.1 20191121 (Red Hat 8.3.1-5)]
Using /etc/ansible/ansible.cfg as config file
setting up inventory plugins
host_list declined parsing /tmp/awx_10_ewx2h344/check_isolated.ymlg3efwlv0/inventory/hosts.json as it did not pass its verify_file() method
script declined parsing /tmp/awx_10_ewx2h344/check_isolated.ymlg3efwlv0/inventory/hosts.json as it did not pass its verify_file() method
auto declined parsing /tmp/awx_10_ewx2h344/check_isolated.ymlg3efwlv0/inventory/hosts.json as it did not pass its verify_file() method
Parsed /tmp/awx_10_ewx2h344/check_isolated.ymlg3efwlv0/inventory/hosts.json inventory source with yaml plugin
Loading callback plugin awx_display of type stdout, v2.0 from /var/lib/awx/venv/awx/lib/python3.6/site-packages/ansible_runner/callbacks/awx_display.py
PLAYBOOK: check_isolated.yml ***************************************************
Positional arguments: check_isolated.yml
verbosity: 4
connection: smart
timeout: 10
become_method: sudo
tags: ('all',)
inventory: ('/tmp/awx_10_ewx2h344/check_isolated.ymlg3efwlv0/inventory/hosts.json',)
extra_vars: ('@/tmp/awx_10_ewx2h344/check_isolated.ymlg3efwlv0/env/extravars',)
forks: 5
1 plays in check_isolated.yml
PLAY [Poll for status of active job.] ******************************************
META: ran handlers
TASK [Determine if daemon process is alive.] ***********************************
task path: /var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/playbooks/check_isolated.yml:13
<awx-job-10> ESTABLISH kubectl CONNECTION
<awx-job-10> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-10', '--', '/bin/sh', '-c', "/bin/sh -c 'echo ~ && sleep 0'"]
<awx-job-10> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-10', '--', '/bin/sh', '-c', '/bin/sh -c \'( umask 77 && mkdir -p "` echo /runner/.ansible/tmp `"&& mkdir "` echo /runner/.ansible/tmp/ansible-tmp-1608561763.26912-463289-166531840327044 `" && echo ansible-tmp-1608561763.26912-463289-166531840327044="` echo /runner/.ansible/tmp/ansible-tmp-1608561763.26912-463289-166531840327044 `" ) && sleep 0\'']
<awx-job-10> Attempting python interpreter discovery
<awx-job-10> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-10', '--', '/bin/sh', '-c', '/bin/sh -c \'echo PLATFORM; uname; echo FOUND; command -v \'"\'"\'/usr/bin/python\'"\'"\'; command -v \'"\'"\'python3.7\'"\'"\'; command -v \'"\'"\'python3.6\'"\'"\'; command -v \'"\'"\'python3.5\'"\'"\'; command -v \'"\'"\'python2.7\'"\'"\'; command -v \'"\'"\'python2.6\'"\'"\'; command -v \'"\'"\'/usr/libexec/platform-python\'"\'"\'; command -v \'"\'"\'/usr/bin/python3\'"\'"\'; command -v \'"\'"\'python\'"\'"\'; echo ENDFOUND && sleep 0\'']
<awx-job-10> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-10', '--', '/bin/sh', '-c', "/bin/sh -c '/usr/bin/python && sleep 0'"]
Using module file /usr/lib/python3.6/site-packages/ansible/modules/commands/command.py
<awx-job-10> PUT /var/lib/awx/.ansible/tmp/ansible-local-463283p0k6kjv6/tmpk5femok_ TO /runner/.ansible/tmp/ansible-tmp-1608561763.26912-463289-166531840327044/AnsiballZ_command.py
<awx-job-10> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-10', '--', '/bin/sh', '-c', "/bin/sh -c 'chmod u+x /runner/.ansible/tmp/ansible-tmp-1608561763.26912-463289-166531840327044/ /runner/.ansible/tmp/ansible-tmp-1608561763.26912-463289-166531840327044/AnsiballZ_command.py && sleep 0'"]
<awx-job-10> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-10', '--', '/bin/sh', '-c', "/bin/sh -c '/usr/bin/python /runner/.ansible/tmp/ansible-tmp-1608561763.26912-463289-166531840327044/AnsiballZ_command.py && sleep 0'"]
<awx-job-10> EXEC ['/usr/bin/kubectl', 'exec', '-i', 'awx-job-10', '--', '/bin/sh', '-c', "/bin/sh -c 'rm -f -r /runner/.ansible/tmp/ansible-tmp-1608561763.26912-463289-166531840327044/ > /dev/null 2>&1 && sleep 0'"]
[DEPRECATION WARNING]: Distribution centos 8.1.1911 on host awx-job-10 should
use /usr/libexec/platform-python, but is using /usr/bin/python for backward
compatibility with prior Ansible releases. A future Ansible release will
default to using the discovered platform python for this host. See https://docs
.ansible.com/ansible/2.9/reference_appendices/interpreter_discovery.html for
more information. This feature will be removed in version 2.12. Deprecation
warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
fatal: [awx-job-10]: FAILED! => {
"ansible_facts": {
"discovered_interpreter_python": "/usr/bin/python"
},
"changed": true,
"cmd": "ansible-runner is-alive /tmp/awx_10_ewx2h344",
"delta": "0:00:00.505986",
"end": "2020-12-21 14:42:47.538571",
"invocation": {
"module_args": {
"_raw_params": "ansible-runner is-alive /tmp/awx_10_ewx2h344",
"_uses_shell": true,
"argv": null,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"stdin_add_newline": true,
"strip_empty_ends": true,
"warn": true
}
},
"msg": "non-zero return code",
"rc": 1,
"start": "2020-12-21 14:42:47.032585",
"stderr": "",
"stderr_lines": [],
"stdout": "",
"stdout_lines": []
}
...ignoring
TASK [Copy artifacts from the isolated host.] **********************************
task path: /var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/playbooks/check_isolated.yml:18
skipping: [awx-job-10] => {
"changed": false,
"skip_reason": "Conditional result was False"
}
TASK [Copy daemon log from the isolated host] **********************************
task path: /var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/playbooks/check_isolated.yml:27
skipping: [awx-job-10] => {
"changed": false,
"skip_reason": "Conditional result was False"
}
TASK [Copy artifacts from pod] *************************************************
task path: /var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/playbooks/check_isolated.yml:34
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: awx
<localhost> EXEC /bin/sh -c 'echo ~awx && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /var/lib/awx/.ansible/tmp `"&& mkdir "` echo /var/lib/awx/.ansible/tmp/ansible-tmp-1608561769.2436752-463464-76708829730735 `" && echo ansible-tmp-1608561769.2436752-463464-76708829730735="` echo /var/lib/awx/.ansible/tmp/ansible-tmp-1608561769.2436752-463464-76708829730735 `" ) && sleep 0'
Using module file /var/lib/awx/vendor/awx_ansible_collections/ansible_collections/ansible/posix/plugins/modules/synchronize.py
<localhost> PUT /var/lib/awx/.ansible/tmp/ansible-local-463283p0k6kjv6/tmpnt8khgdj TO /var/lib/awx/.ansible/tmp/ansible-tmp-1608561769.2436752-463464-76708829730735/AnsiballZ_synchronize.py
<localhost> EXEC /bin/sh -c 'chmod u+x /var/lib/awx/.ansible/tmp/ansible-tmp-1608561769.2436752-463464-76708829730735/ /var/lib/awx/.ansible/tmp/ansible-tmp-1608561769.2436752-463464-76708829730735/AnsiballZ_synchronize.py && sleep 0'
<localhost> EXEC /bin/sh -c 'RSH='"'"'oc rsh --config=/tmp/awx_10_ewx2h344/.kubeconfig_qai0nct'"'"' /usr/bin/python3.6 /var/lib/awx/.ansible/tmp/ansible-tmp-1608561769.2436752-463464-76708829730735/AnsiballZ_synchronize.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /var/lib/awx/.ansible/tmp/ansible-tmp-1608561769.2436752-463464-76708829730735/ > /dev/null 2>&1 && sleep 0'
changed: [awx-job-10] => {
"changed": true,
"cmd": "/usr/bin/rsync --delay-updates -F --compress --delete-after --archive --blocking-io --rsh=$RSH --out-format=<<CHANGED>>%i %n%L awx-job-10:/tmp/awx_10_ewx2h344/artifacts/ /tmp/awx_10_ewx2h344/artifacts/",
"invocation": {
"module_args": {
"_local_rsync_password": null,
"_local_rsync_path": "rsync",
"_substitute_controller": false,
"archive": true,
"checksum": false,
"compress": true,
"copy_links": false,
"delete": true,
"dest": "/tmp/awx_10_ewx2h344/artifacts/",
"dest_port": null,
"dirs": false,
"existing_only": false,
"group": null,
"link_dest": null,
"links": null,
"mode": "pull",
"owner": null,
"partial": false,
"perms": null,
"private_key": null,
"recursive": true,
"rsync_opts": [
"--blocking-io",
"--rsh=$RSH"
],
"rsync_path": null,
"rsync_timeout": 0,
"set_remote_user": false,
"src": "awx-job-10:/tmp/awx_10_ewx2h344/artifacts/",
"ssh_args": null,
"times": null,
"verify_host": false
}
},
"msg": "cd+++++++++ ./\ncd+++++++++ 10/\n>f+++++++++ 10/command\n>f+++++++++ 10/rc\n>f+++++++++ 10/status\n>f+++++++++ 10/stdout\ncd+++++++++ 10/fact_cache/\n>f+++++++++ 10/fact_cache/localhost\ncd+++++++++ 10/job_events/\n>f+++++++++ 10/job_events/1-c1db96e5-1566-47ae-a543-a8361577e66b.json\n>f+++++++++ 10/job_events/10-d8bf934a-0fff-4ed3-9cc7-f6c041813de6.json\n>f+++++++++ 10/job_events/11-d436d6b2-3591-4148-8e11-ff5e4e557bf7.json\n>f+++++++++ 10/job_events/12-f1815fd9-d25f-421b-a5b2-b96ebc7a2cda.json\n>f+++++++++ 10/job_events/13-ecd15715-bf81-4c25-8877-b869f95e58a2.json\n>f+++++++++ 10/job_events/14-640dfd75-f0cb-4f72-ad5c-6b171ab86b8c.json\n>f+++++++++ 10/job_events/15-6c8ee9d5-9dd4-436b-8dcc-00013791faa7.json\n>f+++++++++ 10/job_events/16-e753e5c6-4648-4cf1-9f6a-c4f26dbd3bf1.json\n>f+++++++++ 10/job_events/17-bd26fd7a-255f-45a3-8083-07c7a51f28c6.json\n>f+++++++++ 10/job_events/18-f0d9f940-1b5b-4dcb-9b64-4c9c3925d091.json\n>f+++++++++ 10/job_events/19-cc2722a1-bc28-4e80-92e5-55e02d94bf2a.json\n>f+++++++++ 10/job_events/2-0f9791a8-f608-4058-8991-4bd69a143257.json\n>f+++++++++ 10/job_events/20-01f11bc1-da1c-4548-a0e5-50c585602b59.json\n>f+++++++++ 10/job_events/21-a50dda5c-6a38-4c56-a424-694e1a27cc3b.json\n>f+++++++++ 10/job_events/22-f64f2181-d570-4218-a680-63c8793dbf6a.json\n>f+++++++++ 10/job_events/23-8cddb386-e5e0-43df-970b-dfc556b6ff02.json\n>f+++++++++ 10/job_events/24-2b8bc2cd-21f3-48f0-baaf-30437b8c9f62.json\n>f+++++++++ 10/job_events/25-e80d1d71-f1b6-4f95-8d91-609a9c1029f5.json\n>f+++++++++ 10/job_events/26-aa3cdbc2-21f1-4ed5-88f4-4076090217d3.json\n>f+++++++++ 10/job_events/27-b1ebe45b-efa1-49c2-849a-c3c6626910e8.json\n>f+++++++++ 10/job_events/28-747e6a62-91e7-4a3c-9223-923616bc434c.json\n>f+++++++++ 10/job_events/29-3df54aad-d63c-41a2-863c-029716896b5c.json\n>f+++++++++ 10/job_events/3-5502c9e2-d2b3-4cf8-8aeb-3dd15285c3a6.json\n>f+++++++++ 10/job_events/30-3a21f0fb-fff6-4867-b82b-c678d913a4e0.json\n>f+++++++++ 10/job_events/31-c93223fc-0bda-46ec-a7ac-faf4454f84d9.json\n>f+++++++++ 10/job_events/32-21e43499-18c1-4bcd-a514-5498364aaeea.json\n>f+++++++++ 10/job_events/33-f4c4f51e-0ef1-4d89-8246-403e61633b6d.json\n>f+++++++++ 10/job_events/34-167644bc-e460-489d-a0e1-5aab8baea45f.json\n>f+++++++++ 10/job_events/35-3a98a6b3-20b7-4fb2-9b70-497c5cb247c8.json\n>f+++++++++ 10/job_events/36-092ab2dd-093a-40f8-b66f-b59ef29d2c3e.json\n>f+++++++++ 10/job_events/37-0a4683f7-b300-451c-8052-20759f155280.json\n>f+++++++++ 10/job_events/38-caf9f8dc-dbfe-4b4a-9da5-9fec9ffba6b8.json\n>f+++++++++ 10/job_events/39-b9a2f43b-62db-4d10-a93a-8f5e4d6d592a.json\n>f+++++++++ 10/job_events/4-9b01deb6-7f8d-4d0c-9b31-4c0daa696e29.json\n>f+++++++++ 10/job_events/40-87a46845-606d-4de6-960e-854dbd271bf3.json\n>f+++++++++ 10/job_events/41-0d0b3c5c-4afb-48ac-80f4-57fe9013370c.json\n>f+++++++++ 10/job_events/42-e6448903-1aac-462c-bad4-9434bd6ee2dc.json\n>f+++++++++ 10/job_events/43-82174046-e37b-436a-8d10-61ec291284b4.json\n>f+++++++++ 10/job_events/44-e0e5637d-1422-4e56-8d23-414602238574.json\n>f+++++++++ 10/job_events/45-6b18194c-1511-4286-ab72-9b85f9701f78.json\n>f+++++++++ 10/job_events/46-da9d2d99-c132-4778-8fcc-f2259ad6c644.json\n>f+++++++++ 10/job_events/47-5dca7a03-7bbd-43af-a071-924695aca873.json\n>f+++++++++ 10/job_events/48-9c32d463-b064-4ac5-a325-f0a3d9bfe344.json\n>f+++++++++ 10/job_events/49-4eea1f5f-f541-4215-8174-481151b4310a.json\n>f+++++++++ 10/job_events/5-0e76d8bf-3ddd-4709-9b4d-5703a65c7974.json\n>f+++++++++ 10/job_events/50-573eba8e-6850-484a-b6a2-558678586410.json\n>f+++++++++ 
10/job_events/51-569d2db1-6827-4352-936c-19cdcd8106e3.json\n>f+++++++++ 10/job_events/52-bc3160d3-c22e-4e78-82e5-e69b8ddbc6c1.json\n>f+++++++++ 10/job_events/53-ceadb52d-5f98-40ac-b405-a28749d5945d.json\n>f+++++++++ 10/job_events/54-4015a98a-e6e7-40a5-8d3a-f7d0517565e6.json\n>f+++++++++ 10/job_events/55-cb970ac3-7cbe-4e5d-83ce-0ca81e2464b5.json\n>f+++++++++ 10/job_events/56-16ea0bd8-9e98-4b7d-8f17-bb2315c1972d.json\n>f+++++++++ 10/job_events/57-d9fa5d0e-11b2-4db8-9485-623a7641b422.json\n>f+++++++++ 10/job_events/58-0dae698d-6b47-4346-a38d-e64d76034e96.json\n>f+++++++++ 10/job_events/6-e1079f1b-1120-45c6-bdd7-8a156e1acf29.json\n>f+++++++++ 10/job_events/7-561c7351-202a-4d9a-a627-fc23498a371d.json\n>f+++++++++ 10/job_events/8-fb732a0c-1f8f-4efb-81bf-66fb57d3628c.json\n>f+++++++++ 10/job_events/9-44958c36-75f0-48d9-b750-fe514f69abff.json\n",
"rc": 0,
"stdout_lines": [
"cd+++++++++ ./",
"cd+++++++++ 10/",
">f+++++++++ 10/command",
">f+++++++++ 10/rc",
">f+++++++++ 10/status",
">f+++++++++ 10/stdout",
"cd+++++++++ 10/fact_cache/",
">f+++++++++ 10/fact_cache/localhost",
"cd+++++++++ 10/job_events/",
">f+++++++++ 10/job_events/1-c1db96e5-1566-47ae-a543-a8361577e66b.json",
">f+++++++++ 10/job_events/10-d8bf934a-0fff-4ed3-9cc7-f6c041813de6.json",
">f+++++++++ 10/job_events/11-d436d6b2-3591-4148-8e11-ff5e4e557bf7.json",
">f+++++++++ 10/job_events/12-f1815fd9-d25f-421b-a5b2-b96ebc7a2cda.json",
">f+++++++++ 10/job_events/13-ecd15715-bf81-4c25-8877-b869f95e58a2.json",
">f+++++++++ 10/job_events/14-640dfd75-f0cb-4f72-ad5c-6b171ab86b8c.json",
">f+++++++++ 10/job_events/15-6c8ee9d5-9dd4-436b-8dcc-00013791faa7.json",
">f+++++++++ 10/job_events/16-e753e5c6-4648-4cf1-9f6a-c4f26dbd3bf1.json",
">f+++++++++ 10/job_events/17-bd26fd7a-255f-45a3-8083-07c7a51f28c6.json",
">f+++++++++ 10/job_events/18-f0d9f940-1b5b-4dcb-9b64-4c9c3925d091.json",
">f+++++++++ 10/job_events/19-cc2722a1-bc28-4e80-92e5-55e02d94bf2a.json",
">f+++++++++ 10/job_events/2-0f9791a8-f608-4058-8991-4bd69a143257.json",
">f+++++++++ 10/job_events/20-01f11bc1-da1c-4548-a0e5-50c585602b59.json",
">f+++++++++ 10/job_events/21-a50dda5c-6a38-4c56-a424-694e1a27cc3b.json",
">f+++++++++ 10/job_events/22-f64f2181-d570-4218-a680-63c8793dbf6a.json",
">f+++++++++ 10/job_events/23-8cddb386-e5e0-43df-970b-dfc556b6ff02.json",
">f+++++++++ 10/job_events/24-2b8bc2cd-21f3-48f0-baaf-30437b8c9f62.json",
">f+++++++++ 10/job_events/25-e80d1d71-f1b6-4f95-8d91-609a9c1029f5.json",
">f+++++++++ 10/job_events/26-aa3cdbc2-21f1-4ed5-88f4-4076090217d3.json",
">f+++++++++ 10/job_events/27-b1ebe45b-efa1-49c2-849a-c3c6626910e8.json",
">f+++++++++ 10/job_events/28-747e6a62-91e7-4a3c-9223-923616bc434c.json",
">f+++++++++ 10/job_events/29-3df54aad-d63c-41a2-863c-029716896b5c.json",
">f+++++++++ 10/job_events/3-5502c9e2-d2b3-4cf8-8aeb-3dd15285c3a6.json",
">f+++++++++ 10/job_events/30-3a21f0fb-fff6-4867-b82b-c678d913a4e0.json",
">f+++++++++ 10/job_events/31-c93223fc-0bda-46ec-a7ac-faf4454f84d9.json",
">f+++++++++ 10/job_events/32-21e43499-18c1-4bcd-a514-5498364aaeea.json",
">f+++++++++ 10/job_events/33-f4c4f51e-0ef1-4d89-8246-403e61633b6d.json",
">f+++++++++ 10/job_events/34-167644bc-e460-489d-a0e1-5aab8baea45f.json",
">f+++++++++ 10/job_events/35-3a98a6b3-20b7-4fb2-9b70-497c5cb247c8.json",
">f+++++++++ 10/job_events/36-092ab2dd-093a-40f8-b66f-b59ef29d2c3e.json",
">f+++++++++ 10/job_events/37-0a4683f7-b300-451c-8052-20759f155280.json",
">f+++++++++ 10/job_events/38-caf9f8dc-dbfe-4b4a-9da5-9fec9ffba6b8.json",
">f+++++++++ 10/job_events/39-b9a2f43b-62db-4d10-a93a-8f5e4d6d592a.json",
">f+++++++++ 10/job_events/4-9b01deb6-7f8d-4d0c-9b31-4c0daa696e29.json",
">f+++++++++ 10/job_events/40-87a46845-606d-4de6-960e-854dbd271bf3.json",
">f+++++++++ 10/job_events/41-0d0b3c5c-4afb-48ac-80f4-57fe9013370c.json",
">f+++++++++ 10/job_events/42-e6448903-1aac-462c-bad4-9434bd6ee2dc.json",
">f+++++++++ 10/job_events/43-82174046-e37b-436a-8d10-61ec291284b4.json",
">f+++++++++ 10/job_events/44-e0e5637d-1422-4e56-8d23-414602238574.json",
">f+++++++++ 10/job_events/45-6b18194c-1511-4286-ab72-9b85f9701f78.json",
">f+++++++++ 10/job_events/46-da9d2d99-c132-4778-8fcc-f2259ad6c644.json",
">f+++++++++ 10/job_events/47-5dca7a03-7bbd-43af-a071-924695aca873.json",
">f+++++++++ 10/job_events/48-9c32d463-b064-4ac5-a325-f0a3d9bfe344.json",
">f+++++++++ 10/job_events/49-4eea1f5f-f541-4215-8174-481151b4310a.json",
">f+++++++++ 10/job_events/5-0e76d8bf-3ddd-4709-9b4d-5703a65c7974.json",
">f+++++++++ 10/job_events/50-573eba8e-6850-484a-b6a2-558678586410.json",
">f+++++++++ 10/job_events/51-569d2db1-6827-4352-936c-19cdcd8106e3.json",
">f+++++++++ 10/job_events/52-bc3160d3-c22e-4e78-82e5-e69b8ddbc6c1.json",
">f+++++++++ 10/job_events/53-ceadb52d-5f98-40ac-b405-a28749d5945d.json",
">f+++++++++ 10/job_events/54-4015a98a-e6e7-40a5-8d3a-f7d0517565e6.json",
">f+++++++++ 10/job_events/55-cb970ac3-7cbe-4e5d-83ce-0ca81e2464b5.json",
">f+++++++++ 10/job_events/56-16ea0bd8-9e98-4b7d-8f17-bb2315c1972d.json",
">f+++++++++ 10/job_events/57-d9fa5d0e-11b2-4db8-9485-623a7641b422.json",
">f+++++++++ 10/job_events/58-0dae698d-6b47-4346-a38d-e64d76034e96.json",
">f+++++++++ 10/job_events/6-e1079f1b-1120-45c6-bdd7-8a156e1acf29.json",
">f+++++++++ 10/job_events/7-561c7351-202a-4d9a-a627-fc23498a371d.json",
">f+++++++++ 10/job_events/8-fb732a0c-1f8f-4efb-81bf-66fb57d3628c.json",
">f+++++++++ 10/job_events/9-44958c36-75f0-48d9-b750-fe514f69abff.json"
]
}
TASK [Copy daemon log from pod] ************************************************
task path: /var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/playbooks/check_isolated.yml:50
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: awx
<localhost> EXEC /bin/sh -c 'echo ~awx && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /var/lib/awx/.ansible/tmp `"&& mkdir "` echo /var/lib/awx/.ansible/tmp/ansible-tmp-1608561770.6993527-463507-200195490025535 `" && echo ansible-tmp-1608561770.6993527-463507-200195490025535="` echo /var/lib/awx/.ansible/tmp/ansible-tmp-1608561770.6993527-463507-200195490025535 `" ) && sleep 0'
Using module file /var/lib/awx/vendor/awx_ansible_collections/ansible_collections/ansible/posix/plugins/modules/synchronize.py
<localhost> PUT /var/lib/awx/.ansible/tmp/ansible-local-463283p0k6kjv6/tmp4odr9dni TO /var/lib/awx/.ansible/tmp/ansible-tmp-1608561770.6993527-463507-200195490025535/AnsiballZ_synchronize.py
<localhost> EXEC /bin/sh -c 'chmod u+x /var/lib/awx/.ansible/tmp/ansible-tmp-1608561770.6993527-463507-200195490025535/ /var/lib/awx/.ansible/tmp/ansible-tmp-1608561770.6993527-463507-200195490025535/AnsiballZ_synchronize.py && sleep 0'
<localhost> EXEC /bin/sh -c 'RSH='"'"'oc rsh --config=/tmp/awx_10_ewx2h344/.kubeconfig_qai0nct'"'"' /usr/bin/python3.6 /var/lib/awx/.ansible/tmp/ansible-tmp-1608561770.6993527-463507-200195490025535/AnsiballZ_synchronize.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /var/lib/awx/.ansible/tmp/ansible-tmp-1608561770.6993527-463507-200195490025535/ > /dev/null 2>&1 && sleep 0'
changed: [awx-job-10] => {
"changed": true,
"cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --blocking-io --rsh=$RSH --out-format=<<CHANGED>>%i %n%L awx-job-10:/tmp/awx_10_ewx2h344/daemon.log /tmp/awx_10_ewx2h344/daemon.log",
"invocation": {
"module_args": {
"_local_rsync_password": null,
"_local_rsync_path": "rsync",
"_substitute_controller": false,
"archive": true,
"checksum": false,
"compress": true,
"copy_links": false,
"delete": false,
"dest": "/tmp/awx_10_ewx2h344/daemon.log",
"dest_port": null,
"dirs": false,
"existing_only": false,
"group": null,
"link_dest": null,
"links": null,
"mode": "pull",
"owner": null,
"partial": false,
"perms": null,
"private_key": null,
"recursive": null,
"rsync_opts": [
"--blocking-io",
"--rsh=$RSH"
],
"rsync_path": null,
"rsync_timeout": 0,
"set_remote_user": false,
"src": "awx-job-10:/tmp/awx_10_ewx2h344/daemon.log",
"ssh_args": null,
"times": null,
"verify_host": false
}
},
"msg": ">f+++++++++ daemon.log\n",
"rc": 0,
"stdout_lines": [
">f+++++++++ daemon.log"
]
}
TASK [Fail if previous check determined that process is not alive.] ************
task path: /var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/playbooks/check_isolated.yml:64
skipping: [awx-job-10] => {
"changed": false,
"skip_reason": "Conditional result was False"
}
META: ran handlers
META: ran handlers
PLAY RECAP *********************************************************************
awx-job-10 : ok=3 changed=3 unreachable=0 failed=0 skipped=3 rescued=0 ignored=1
2020-12-21 14:42:52,369 DEBUG awx.main.tasks ad_hoc_command 10 (running) finished running, producing 58 events.
(Please note that this is for job 10 as I've been starting/stopping jobs during debugging.)
I'm now thinking the problem is with the ansible-runner library or docker image or something. Any help would be greatly appreciated.
What an adventure... look what I found in dmesg:
[493642.521612] Memory cgroup stats for /kubepods/burstable/podbe9f3a11-7fa1-4f50-85d8-c02a8facb612: cache:0KB rss:0KB rss_huge:0KB shmem:0KB mapped_file:0KB dirty:0KB writeback:0KB swap:0KB inactive_anon:0KB active_anon:0KB inactive_file:0KB active_file:0KB unevictable:0KB
[493642.545735] Memory cgroup stats for /kubepods/burstable/podbe9f3a11-7fa1-4f50-85d8-c02a8facb612/966f76b0cd20920bc5314517e9ddaa86d9d5b535bf0c0f01f654438af9a1e86c: cache:0KB rss:52KB rss_huge:0KB shmem:0KB mapped_file:0KB dirty:0KB writeback:0KB swap:0KB inactive_anon:0KB active_anon:40KB inactive_file:0KB active_file:0KB unevictable:0KB
[493642.576139] Memory cgroup stats for /kubepods/burstable/podbe9f3a11-7fa1-4f50-85d8-c02a8facb612/51a8ef26578354eb4ee32774153d869a77646c52cd9f0c0b78b839226752fd08: cache:0KB rss:115228KB rss_huge:0KB shmem:0KB mapped_file:0KB dirty:0KB writeback:0KB swap:0KB inactive_anon:0KB active_anon:115192KB inactive_file:0KB active_file:0KB unevictable:0KB
[493642.606860] Tasks state (memory values in pages):
[493642.611806] [ pid ] uid tgid total_vm rss pgtables_bytes swapents oom_score_adj name
[493642.620972] [ 809956] 0 809956 256 1 32768 0 -998 pause
[493642.629492] [ 810086] 0 810086 1085 201 57344 0 999 tini
[493642.637911] [ 810112] 0 810112 6489 352 94208 0 999 sleep
[493642.646419] [ 810471] 0 810471 3767 926 77824 0 999 bash
[493642.654832] [ 826184] 0 826184 62510 29233 466944 0 999 ansible-runner
[493642.664114] Memory cgroup out of memory: Kill process 826184 (ansible-runner) score 1871 or sacrifice child
[493642.674081] Killed process 826184 (ansible-runner) total-vm:250040kB, anon-rss:114072kB, file-rss:2860kB, shmem-rss:0kB
So I need more memory... how silly.
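For the record, the fix is to give the job Pod more memory via the container group's custom pod spec. Something like the sketch below (illustrative only: the resource numbers and namespace are assumptions, and the rest is based on what I remember of the default container group pod spec):

```yaml
# Illustrative container-group custom pod spec with explicit memory settings.
# The namespace and the request/limit values are assumptions -- size them to
# whatever your jobs actually need.
apiVersion: v1
kind: Pod
metadata:
  namespace: awx
spec:
  containers:
    - image: ansible/ansible-runner
      name: worker
      args:
        - sleep
        - infinity
      resources:
        requests:
          memory: 256Mi
        limits:
          memory: 1Gi
```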
Wow! Thanks for following up. Nice hunting 😅
I'm like super embarrassed it ended up being a resource limiting problem 🤦