I ran into an error like this:
TASK [kubernetes/preinstall : set_fact] ************
fatal: [lk-k8s-minion-1]: FAILED! => {"failed": true, "msg": "the field 'args' has an invalid value, which appears to include a variable that is undefined. The error was: 'dict object' has no attribute 'ansible_default_ipv4'\n\nThe error appears to have been in '/etc/ansible/kargo/roles/kubernetes/preinstall/tasks/set_facts.yml': line 14, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n- set_fact:\n ^ here\n"}
Is this a BUG REPORT or FEATURE REQUEST? (choose one): BUG REPORT
Environment:

OS (printf "$(uname -srm)\n$(cat /etc/os-release)\n"):
CENTOS_MANTISBT_PROJECT="CentOS-7"
CENTOS_MANTISBT_PROJECT_VERSION="7"
REDHAT_SUPPORT_PRODUCT="centos"
REDHAT_SUPPORT_PRODUCT_VERSION="7"
Version of Ansible (ansible --version):

Kargo version (commit) (git rev-parse --short HEAD):
acae0fe
Network plugin used:
Copy of your inventory file:
Command used to invoke ansible:
/usr/bin/ansible-playbook -i /etc/ansible/kargo/inventory/hosts /etc/ansible/kargo/cluster.yml --limit k8s-minion-1
Output of ansible run:
Anything else we need to know:
I reproduced the same issue. Did you manage to fix or work around it?
I can't add new nodes to a cluster created by Kubespray due to an Ansible error:
TASK [kubernetes/preinstall : set_fact] *****************************************************************************************************************************************************
fatal: [node7]: FAILED! => {"failed": true, "msg": "the field 'args' has an invalid value, which appears to include a variable that is undefined. The error was: 'unicode object' has no attribute 'address'\n\nThe error appears to have been in '/home/user/.kargo/roles/kubernetes/preinstall/tasks/set_facts.yml': line 14, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n- set_fact:\n ^ here\n"}
The command used to add a node:
ansible-playbook -u root --become-user=root -i ~/.kargo/inventory/inventory.cfg ~/.kargo/cluster.yml -l node7
Inventory:
[kube-master]
node4
node5
[all]
node4 ansible_ssh_host=192.168.56.104
node5 ansible_ssh_host=192.168.56.105
node6 ansible_ssh_host=192.168.56.106
node7 ansible_ssh_host=192.168.56.107
[k8s-cluster:children]
kube-node
kube-master
[kube-node]
node4
node5
node6
node7
[etcd]
node4
node5
node6
Environment:
ansible 2.3.1.0
kargo 0.4.8 bddee7c38e386740b453430fe1c4c69266b08580
I also ran into this problem when I tried to add nodes to my Kubernetes cluster using kubespray with --limit. The solution was:
Check ansible.cfg for fact-caching options, delete the cache from /tmp if it exists, and then run ansible -i {{inventory}} all -m setup to recreate the cache.
After running the setup command, rerun kubespray with the --limit option and it works.
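The steps above can be sketched as shell commands. The cache directory shown is an assumption; use whatever fact_caching_connection is set to in your own ansible.cfg, and substitute your real inventory path:

```shell
# Check ansible.cfg for fact-caching settings, typically something like:
#   [defaults]
#   gathering = smart
#   fact_caching = jsonfile
#   fact_caching_connection = /tmp/facts_cache   # assumed path; yours may differ

# Delete the stale fact cache (path is an assumption taken from the
# fact_caching_connection value above)
rm -rf /tmp/facts_cache

# Re-gather facts for ALL hosts, so that hostvars such as
# ansible_default_ipv4 are cached even for hosts outside the --limit
ansible -i ~/.kargo/inventory/inventory.cfg all -m setup

# Now the limited run can resolve facts for the unlimited hosts
ansible-playbook -u root --become-user=root \
  -i ~/.kargo/inventory/inventory.cfg ~/.kargo/cluster.yml -l node7
```

The reason this works: with --limit, Ansible only gathers facts for the limited hosts, so set_fact tasks that read facts of other cluster members find them undefined unless those facts are already in the cache.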
thank you for the solution @alvinhom