AWX: Inventory update using "Sourced from a Project" not working - AWX 7.0.0

Created on 14 Sep 2019 · 17 comments · Source: ansible/awx

ISSUE TYPE
  • Bug Report
SUMMARY
ENVIRONMENT
  • AWX version: 7.0.0
  • AWX install method: docker on linux
  • Ansible version: 2.8.4
  • Operating System: Debian
  • Web Browser: Chrome
STEPS TO REPRODUCE

Try to sync an inventory using a "Sourced from a Project" source.
The job fails with no clear output. Visiting the API, I found the issue and traced it to a specific job that runs before the sync.
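
For reference, the same triage can be done straight from the API; a minimal sketch, assuming admin credentials and an AWX host at https://awx.example.com (both placeholders), using the job ids found below:

# inspect the failed inventory update and the hidden project update it spawned
curl -sk -u admin:PASSWORD https://awx.example.com/api/v2/inventory_updates/980/
curl -sk -u admin:PASSWORD https://awx.example.com/api/v2/project_updates/981/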

EXPECTED RESULTS

Inventory sync to happen

ACTUAL RESULTS

A failed Task

ADDITIONAL INFORMATION

Checking inventory_update job number 980, I found this error:

{
    "id": 980,
    "type": "inventory_update",
    "url": "/api/v2/inventory_updates/980/",
    "related": {
        "created_by": "/api/v2/users/1/",
        "unified_job_template": "/api/v2/inventory_sources/28/",
        "stdout": "/api/v2/inventory_updates/980/stdout/",
        "inventory_source": "/api/v2/inventory_sources/28/",
        "cancel": "/api/v2/inventory_updates/980/cancel/",
        "notifications": "/api/v2/inventory_updates/980/notifications/",
        "events": "/api/v2/inventory_updates/980/events/",
        "source_project_update": "/api/v2/project_updates/981/",
        "inventory": "/api/v2/inventories/4/",
        "credentials": "/api/v2/inventory_updates/980/credentials/",
        "source_project": "/api/v2/projects/24/"
    },
    "summary_fields": {
        "inventory": {
            "id": 4,
            "name": "kubernetes",
            "description": "",
            "has_active_failures": false,
            "total_hosts": 0,
            "hosts_with_active_failures": 0,
            "total_groups": 0,
            "groups_with_active_failures": 0,
            "has_inventory_sources": true,
            "total_inventory_sources": 1,
            "inventory_sources_with_failures": 1,
            "organization_id": 1,
            "kind": ""
        },
        "unified_job_template": {
            "id": 28,
            "name": "Update inventory",
            "description": "",
            "unified_job_type": "inventory_update"
        },
        "inventory_source": {
            "source": "scm",
            "last_updated": "2019-09-14T13:42:08.653158Z",
            "status": "failed"
        },
        "instance_group": {
            "name": "tower",
            "id": 1
        },
        "created_by": {
            "id": 1,
            "username": "admin",
            "first_name": "",
            "last_name": ""
        },
        "user_capabilities": {
            "delete": true,
            "start": true
        },
        "credentials": [],
        "source_project": {
            "id": 24,
            "name": "Kubernetes",
            "description": "",
            "status": "successful",
            "scm_type": "git"
        }
    },
    "created": "2019-09-14T13:36:34.515351Z",
    "modified": "2019-09-14T13:36:34.714877Z",
    "name": "kubernetes - Update inventory",
    "description": "",
    "source": "scm",
    "source_path": "inventory",
    "source_script": null,
    "source_vars": "",
    "credential": null,
    "source_regions": "",
    "instance_filters": "",
    "group_by": "",
    "overwrite": false,
    "overwrite_vars": false,
    "custom_virtualenv": null,
    "timeout": 0,
    "verbosity": 1,
    "unified_job_template": 28,
    "launch_type": "manual",
    "status": "error",
    "failed": true,
    "started": "2019-09-14T13:36:34.822591Z",
    "finished": "2019-09-14T13:36:40.780413Z",
    "elapsed": 0.0,
    "job_args": "",
    "job_cwd": "",
    "job_env": {},
    "job_explanation": "Previous Task Failed: {\"job_type\": \"project_update\", \"job_name\": \"Kubernetes\", \"job_id\": \"981\"}",
    "execution_node": "ansibleworker1",
    "result_traceback": "Traceback (most recent call last):\n  File \"/usr/local/lib/python3.7/dist-packages/awx-7.0.0-py3.7.egg/awx/main/tasks.py\", line 1174, in run\n    self.pre_run_hook(self.instance, private_data_dir)\n  File \"/usr/local/lib/python3.7/dist-packages/awx-7.0.0-py3.7.egg/awx/main/tasks.py\", line 2330, in pre_run_hook\n    project_update_task().run(local_project_sync.id)\n  File \"/usr/local/lib/python3.7/dist-packages/awx-7.0.0-py3.7.egg/awx/main/tasks.py\", line 685, in _wrapped\n    return f(self, *args, **kwargs)\n  File \"/usr/local/lib/python3.7/dist-packages/awx-7.0.0-py3.7.egg/awx/main/tasks.py\", line 1330, in run\n    raise AwxTaskError.TaskError(self.instance, rc)\nException: project_update 981 (failed) encountered an error (rc=2), please see task stdout for details.\n",
    "event_processing_finished": true,
    "inventory": 4,
    "inventory_source": 28,
    "license_error": false,
    "org_host_limit_error": false,
    "source_project_update": 981,
    "source_project": 24
}

Looking for job 981, which never appears in my UI, I queried the API and got:

{
    "id": 981,
    "type": "project_update",
    "url": "/api/v2/project_updates/981/",
    "summary_fields": {
        "project": {
            "id": 24,
            "name": "Kubernetes",
            "description": "",
            "status": "successful",
            "scm_type": "git"
        },
        "credential": {
            "id": 4,
            "name": "**",
            "description": "",
            "kind": "scm",
            "cloud": false,
            "credential_type_id": 2
        },
        "unified_job_template": {
            "id": 24,
            "name": "Kubernetes",
            "description": "",
            "unified_job_type": "project_update"
        },
        "instance_group": {
            "name": "tower",
            "id": 1
        },
        "user_capabilities": {
            "delete": true,
            "start": true
        }
    },
    "created": "2019-09-14T13:36:34.882981Z",
    "modified": "2019-09-14T13:36:34.882989Z",
    "name": "Kubernetes",
    "description": "",
    "local_path": "_24__kubernetes",
    "scm_type": "git",
    "scm_url": "*******",
    "scm_branch": "",
    "scm_refspec": "",
    "scm_clean": true,
    "scm_delete_on_update": true,
    "credential": 4,
    "timeout": 0,
    "scm_revision": "a79609e829f4908c94008e52ba9a9f73e4450ab2",
    "unified_job_template": 24,
    "launch_type": "sync",
    "status": "failed",
    "failed": true,
    "started": "2019-09-14T13:36:34.882822Z",
    "finished": "2019-09-14T13:36:40.760367Z",
    "elapsed": 5.878,
    "job_explanation": "",
    "execution_node": "ansibleworker1",
    "result_traceback": "",
    "event_processing_finished": true,
    "project": 24,
    "job_type": "run",
    "host_status_counts": {
        "failures": 1
    },
    "playbook_counts": {
        "play_count": 2,
        "task_count": 15
    }
}

Finally, the error in the playbook run:

Identity added: /tmp/awx_981_u15xqles/artifacts/981/ssh_key_data ()
No config file found; using defaults

PLAY [all] *********************************************************************

TASK [delete project directory before update] **********************************
skipping: [localhost,] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [update project using git] ************************************************
ok: [localhost, -> localhost] => {"after": "a79609e829f4908c94008e52ba9a9f73e4450ab2", "before": "a79609e829f4908c94008e52ba9a9f73e4450ab2", "changed": false, "remote_url_changed": false}

TASK [Set the git repository version] ******************************************
ok: [localhost, -> localhost] => {"ansible_facts": {"scm_version": "a79609e829f4908c94008e52ba9a9f73e4450ab2"}, "changed": false}

TASK [update project using hg] *************************************************
skipping: [localhost,] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set the hg repository version] *******************************************
skipping: [localhost,] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse hg version string properly] ****************************************
skipping: [localhost,] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [update project using svn] ************************************************
skipping: [localhost,] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set the svn repository version] ******************************************
skipping: [localhost,] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [parse subversion version string properly] ********************************
skipping: [localhost,] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Ensure the project directory is present] *********************************
skipping: [localhost,] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Fetch Insights Playbook(s)] **********************************************
skipping: [localhost,] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Save Insights Version] ***************************************************
skipping: [localhost,] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Repository Version] ******************************************************
ok: [localhost,] => {
    "msg": "Repository Version a79609e829f4908c94008e52ba9a9f73e4450ab2"
}

PLAY [all] *********************************************************************

TASK [detect requirements.yml] *************************************************
ok: [localhost, -> localhost] => {"changed": false, "stat": {"atime": 1568468115.431545, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 9, "charset": "us-ascii", "checksum": "78b666b9f4be6f267727d5e851e21569c251f2a7", "ctime": 1568468115.419545, "dev": 173, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 287311, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1568468115.419545, "nlink": 1, "path": "/var/lib/awx/projects/_24__kubernetes/roles/requirements.yml", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 258, "uid": 0, "version": null, "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [fetch galaxy roles from requirements.yml] ********************************
fatal: [localhost,]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'roles_destination' is undefined\\n\\nThe error appears to be in '/usr/local/lib/python3.7/dist-packages/awx-7.0.0-py3.7.egg/awx/playbooks/project_update.yml': line 128, column 9, but may\\nbe elsewhere in the file depending on the exact syntax problem.\\n\\nThe offending line appears to be:\\n\\n\\n      - name: fetch galaxy roles from requirements.yml\\n        ^ here\\n"}

PLAY RECAP *********************************************************************
localhost,                 : ok=4    changed=0    unreachable=0    failed=1    skipped=10   rescued=0    ignored=0   

Labels: api, high, regression, bug

All 17 comments

does your project have a roles/requirements.yml file?

I just hit this, and the answer is yes for me.

Busy days.
I have a roles/requirements.yml, and the error only occurs when the inventory sync is run manually.

If the inventory sync is triggered by the project update, it seems to work.

I was having the same problem, but I was trying to upgrade while keeping data from Ansible AWX 4.0.0, which isn't supported (see the FAQ on upgrades). Removing the Docker volumes, i.e. doing a clean installation, fixed the problem.

I'm not using requirements.yml. Hope it helps.

We're also having this issue on 7.0.0 with a roles/requirements.yml file in place.

TASK [fetch galaxy roles from requirements.yml] ********************************
fatal: [localhost,]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'roles_destination' is undefined\\n\\nThe error appears to be in '/var/lib/awx/venv/awx/lib/python3.6/site-packages/awx/playbooks/project_update.yml': line 128, column 9, but may\\nbe elsewhere in the file depending on the exact syntax problem.\\n\\nThe offending line appears to be:\\n\\n\\n      - name: fetch galaxy roles from requirements.yml\\n        ^ here\\n"}

I have a user from the AWX-RPM project who reports that the following change in
"/opt/rh/rh-python36/root/usr/lib/python3.6/site-packages/awx/playbooks/project_update.yml" fixes this issue. Can anyone confirm?

Changing
command: ansible-galaxy install -r requirements.yml -p {{roles_destination|quote}}
to:
command: ansible-galaxy install -r requirements.yml -p {{roles_destination|default('.')|quote}}
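
If you're not sure where that playbook lives on your install (the path differs between the RPM, pip, and container installs quoted in this thread), something like the following locates it and shows the offending task; the grep path below is the one from the traceback earlier in this issue, so adjust it for your setup:

# locate AWX's bundled project_update.yml, then inspect the galaxy task
find / -path "*awx/playbooks/project_update.yml" 2>/dev/null
grep -n "roles_destination" /usr/local/lib/python3.7/dist-packages/awx-7.0.0-py3.7.egg/awx/playbooks/project_update.yml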

I can confirm this. I was using the containerized version and fixed it by running:
$> docker exec -it awx_task
The path inside the running container is:
/var/lib/awx/venv/awx/lib64/python3.6/site-packages/awx/playbooks
Updating the project_update.yml playbook to default to the cwd (as noted above) seems to resolve the issue.
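
Spelled out, the containerized workaround looks roughly like this; the container name awx_task and the path are taken from the comment above, and the sed pattern assumes the filter is spelled exactly as quoted earlier, so check your copy of the file first:

# open a shell in the task container, then patch the bundled playbook
docker exec -it awx_task bash
cd /var/lib/awx/venv/awx/lib64/python3.6/site-packages/awx/playbooks
sed -i "s/{{roles_destination|quote}}/{{roles_destination|default('.')|quote}}/" project_update.yml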

Still happens after 8.0.0 upgrade.

@fidanf Did you try the workaround?

Yes I did, and it works. Basically nothing has changed between 7.0.0 and 8.0.0 regarding this issue.

Cool... I just opened a pull request for the change.

I've pulled the changes for the aforementioned fix and did the following:

1) Created a project to sync https://github.com/ansible/test-playbooks.git and the with_requirements_inventory branch
2) Created a new source of type Sourced from a Project and selected the inventory file to be https://github.com/ansible/test-playbooks/blob/with_requirements_inventory/inventories/inventory.ini
3) Tried to sync the inventory and it worked fine. Here is the output I got when checking the job:

    2.833 INFO     Loaded 0 groups, 1 hosts
    3.045 INFO     Inventory import completed for Inventory in 0.9s

I will leave this open until the changes are merged into devel, which should happen soon.
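
For anyone who would rather drive the same check through the API instead of the UI, creating and syncing an SCM inventory source looks roughly like this; the host, credentials, and ids are placeholders, and the field names mirror the inventory_update JSON earlier in this issue:

# create an inventory source pointing at the already-synced project
curl -sk -u admin:PASSWORD -H "Content-Type: application/json" -X POST \
  https://awx.example.com/api/v2/inventory_sources/ \
  -d '{"name": "test-playbooks inventory", "inventory": 4, "source": "scm", "source_project": 24, "source_path": "inventories/inventory.ini"}'
# then trigger the sync, using the id returned by the call above
curl -sk -u admin:PASSWORD -X POST https://awx.example.com/api/v2/inventory_sources/<id>/update/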

@elyezer (and others in this thread) a fix for this is present in devel now:

https://github.com/ansible/awx/pull/5140

Anybody here who's been encountering this issue mind giving awx devel a shot?

@aretakisv (and others here)

We believe we've finally squashed this issue in the just-released version of AWX, 9.0.0.

I'm going to close this - if anybody else is still encountering this issue on the latest version of AWX, please let us know and we'll re-open and investigate.

@ryanpetrello

I have the same issue after upgrading to 8.0.0.
The workaround didn't work, and the inventory update job fails either way: whether triggered manually or automatically on job launch.
Is there anything else I can do about it?

N.B. I can't move to a later version (9.0.x) right now for several reasons, but I will if there is no other way...

Narrowing down @ryanpetrello's link, the fix is in the commit:

acba5306c69edc2493826b0bfad891702ff3ce53

GitHub will helpfully show you which tags contain a commit. Here, those are 9.0.0 and 9.0.1.
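
If you have a clone of the awx repo handy, the same check is a single plain-git command:

# list the tags that already contain the fix commit
git tag --contains acba5306c69edc2493826b0bfad891702ff3ce53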

Is there an official upgrade path between versions yet?
