I'm trying to use the Ansible provisioner in combination with AWS Session Manager (SSM). While the "shell" provisioner works over SSM, the Ansible provisioner keeps failing. I'm not 100% sure this is a bug, but right now it looks like one.
Packer v1.6.4
{
  "variables": {
    "aws_access_key": "",
    "aws_secret_key": "",
    "timestamp": "{{timestamp}}",
    "iam_profile": "ssm",
    "sg_id": "sg-fdsa",
    "subnet_id": "subnet-asdf"
  },
  "sensitive-variables": ["aws_access_key", "aws_secret_key"],
  "builders": [
    {
      "type": "amazon-ebs",
      "ssh_pty": true,
      "access_key": "{{user `aws_access_key`}}",
      "secret_key": "{{user `aws_secret_key`}}",
      "region": "eu-central-1",
      "subnet_id": "{{user `subnet_id`}}",
      "security_group_id": "{{user `sg_id`}}",
      "instance_type": "t3a.small",
      "source_ami_filter": {
        "filters": {
          "virtualization-type": "hvm",
          "name": "ubuntu/images/hvm-ssd/ubuntu*20.04*amd64*server*",
          "root-device-type": "ebs"
        },
        "owners": ["099720109477"],
        "most_recent": true
      },
      "ssh_username": "ubuntu",
      "ssh_timeout": "5m",
      "ssh_interface": "session_manager",
      "ssh_keypair_name": "pipestack-test",
      "ssh_private_key_file": "~/.ssh/pipestack-test.pem",
      "session_manager_port": "8122",
      "communicator": "ssh",
      "iam_instance_profile": "{{user `iam_profile`}}",
      "ami_name": "pipestack/repo {{timestamp}}"
    }
  ],
  "provisioners": [
    {
      "type": "shell",
      "inline": [
        "echo Connected via SSM at '{{build `User`}}@{{build `Host`}}:{{build `Port`}}'"
      ]
    },
    {
      "type": "ansible",
      "groups": ["artifactory"],
      "user": "ubuntu",
      "use_proxy": false,
      "extra_arguments": ["-l artifactory", "-vvvv"],
      "sftp_command": "/usr/libexec/openssh/sftp-server -e",
      "playbook_file": "../../test_playbook.yml"
    }
  ]
}
macOS 10.15.4 (19E287)
The shell provisioner works:
==> amazon-ebs: Provisioning with shell script: /var/folders/b7/5jfym81d41qgc7g9h7f9tlgh0000gp/T/packer-shell130649063
2020/10/02 18:55:15 packer-provisioner-shell plugin: Opening /var/folders/b7/5jfym81d41qgc7g9h7f9tlgh0000gp/T/packer-shell130649063 for reading
2020/10/02 18:55:15 packer-provisioner-shell plugin: [INFO] 73 bytes written for 'uploadData'
2020/10/02 18:55:15 [INFO] 73 bytes written for 'uploadData'
2020/10/02 18:55:15 packer-builder-amazon-ebs plugin: [DEBUG] Opening new ssh session
2020/10/02 18:55:15 packer-builder-amazon-ebs plugin: [DEBUG] Starting remote scp process: scp -vt /tmp
2020/10/02 18:55:15 packer-builder-amazon-ebs plugin: [DEBUG] Started SCP session, beginning transfers...
2020/10/02 18:55:15 packer-builder-amazon-ebs plugin: [DEBUG] Copying input data into temporary file so we can read the length
2020/10/02 18:55:15 packer-builder-amazon-ebs plugin: [DEBUG] scp: Uploading script_9661.sh: perms=C0644 size=73
2020/10/02 18:55:15 packer-builder-amazon-ebs plugin: [DEBUG] SCP session complete, closing stdin pipe.
2020/10/02 18:55:15 packer-builder-amazon-ebs plugin: [DEBUG] Waiting for SSH session to complete.
2020/10/02 18:55:15 packer-builder-amazon-ebs plugin: [DEBUG] scp stderr (length 30): Sink: C0644 73 script_9661.sh
2020/10/02 18:55:15 packer-builder-amazon-ebs plugin: [DEBUG] Opening new ssh session
2020/10/02 18:55:15 packer-builder-amazon-ebs plugin: [DEBUG] starting remote command: chmod 0755 /tmp/script_9661.sh
2020/10/02 18:55:15 packer-builder-amazon-ebs plugin: [INFO] RPC endpoint: Communicator ended with: 0
2020/10/02 18:55:15 [INFO] RPC client: Communicator ended with: 0
2020/10/02 18:55:15 [INFO] RPC endpoint: Communicator ended with: 0
2020/10/02 18:55:15 packer-provisioner-shell plugin: [INFO] RPC client: Communicator ended with: 0
2020/10/02 18:55:15 packer-builder-amazon-ebs plugin: [DEBUG] Opening new ssh session
2020/10/02 18:55:15 packer-builder-amazon-ebs plugin: [DEBUG] starting remote command: chmod +x /tmp/script_9661.sh; PACKER_BUILDER_TYPE='amazon-ebs' PACKER_BUILD_NAME='amazon-ebs' /tmp/script_9661.sh
amazon-ebs: Connected via SSM at ubuntu@localhost:8122
The Ansible provisioner doesn't:
==> amazon-ebs: Provisioning with Ansible...
amazon-ebs: Using ssh keys from Packer communicator...
amazon-ebs: Not using Proxy adapter for Ansible run:
amazon-ebs: Using ssh keys from Packer communicator...
2020/10/02 18:57:15 packer-provisioner-ansible plugin: Creating inventory file for Ansible run...
==> amazon-ebs: Executing Ansible: ansible-playbook -e packer_build_name="amazon-ebs" -e packer_builder_type=amazon-ebs --ssh-extra-args '-o IdentitiesOnly=yes' -l artifactory -vvvv -e ansible_ssh_private_key_file=~/.ssh/pipestack-test.pem -i /var/folders/b7/5jfym81d41qgc7g9h7f9tlgh0000gp/T/packer-provisioner-ansible721145347 /Users/nell/projects/ansible/test_playbook.yml
amazon-ebs: ansible-playbook 2.9.9
amazon-ebs: config file = None
amazon-ebs: configured module search path = ['/Users/nell/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
amazon-ebs: ansible python module location = /Users/nell/Library/Python/3.7/lib/python/site-packages/ansible
amazon-ebs: executable location = /Users/nell/bin/python/ansible-playbook
amazon-ebs: python version = 3.7.3 (default, Apr 24 2020, 18:51:23) [Clang 11.0.3 (clang-1103.0.32.62)]
amazon-ebs: No config file found; using defaults
amazon-ebs: setting up inventory plugins
amazon-ebs: host_list declined parsing /var/folders/b7/5jfym81d41qgc7g9h7f9tlgh0000gp/T/packer-provisioner-ansible721145347 as it did not pass its verify_file() method
amazon-ebs: script declined parsing /var/folders/b7/5jfym81d41qgc7g9h7f9tlgh0000gp/T/packer-provisioner-ansible721145347 as it did not pass its verify_file() method
amazon-ebs: auto declined parsing /var/folders/b7/5jfym81d41qgc7g9h7f9tlgh0000gp/T/packer-provisioner-ansible721145347 as it did not pass its verify_file() method
amazon-ebs: Parsed /var/folders/b7/5jfym81d41qgc7g9h7f9tlgh0000gp/T/packer-provisioner-ansible721145347 inventory source with ini plugin
amazon-ebs: Loading callback plugin default of type stdout, v2.0 from /Users/nell/Library/Python/3.7/lib/python/site-packages/ansible/plugins/callback/default.py
amazon-ebs:
amazon-ebs: PLAYBOOK: test_playbook.yml ****************************************************
amazon-ebs: Positional arguments: /Users/nell/projects/ansible/test_playbook.yml
amazon-ebs: verbosity: 4
amazon-ebs: connection: smart
amazon-ebs: timeout: 10
amazon-ebs: ssh_extra_args: '-o IdentitiesOnly=yes'
amazon-ebs: become_method: sudo
amazon-ebs: tags: ('all',)
amazon-ebs: inventory: ('/var/folders/b7/5jfym81d41qgc7g9h7f9tlgh0000gp/T/packer-provisioner-ansible721145347',)
amazon-ebs: subset: artifactory
amazon-ebs: extra_vars: ('packer_build_name="amazon-ebs"', 'packer_builder_type=amazon-ebs', 'ansible_ssh_private_key_file=~/.ssh/pipestack-test.pem')
amazon-ebs: forks: 5
amazon-ebs: 1 plays in /Users/nell/projects/ansible/test_playbook.yml
amazon-ebs:
amazon-ebs: PLAY [artifactory] *************************************************************
amazon-ebs:
amazon-ebs: TASK [Gathering Facts] *********************************************************
amazon-ebs: task path: /Users/nell/projects/ansible/test_playbook.yml:10
amazon-ebs: <localhost> ESTABLISH SSH CONNECTION FOR USER: ubuntu
amazon-ebs: <localhost> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o Port=8122 -o 'IdentityFile="/Users/nell/.ssh/pipestack-test.pem"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="ubuntu"' -o ConnectTimeout=10 '-o IdentitiesOnly=yes' -o ControlPath=/Users/nell/.ansible/cp/e79c318a61 localhost '/bin/sh -c '"'"'echo ~ubuntu && sleep 0'"'"''
amazon-ebs: <localhost> (255, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/nell/.ssh/config\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug1: Control socket "/Users/nell/.ansible/cp/e79c318a61" does not exist\r\ndebug2: resolving "localhost" port 8122\r\ndebug2: ssh_connect_direct\r\ndebug1: Connecting to localhost [::1] port 8122.\r\ndebug2: fd 5 setting O_NONBLOCK\r\ndebug1: connect to address ::1 port 8122: Connection refused\r\ndebug1: Connecting to localhost [127.0.0.1] port 8122.\r\ndebug2: fd 5 setting O_NONBLOCK\r\ndebug1: connect to address 127.0.0.1 port 8122: Operation timed out\r\nssh: connect to host localhost port 8122: Operation timed out\r\n')
amazon-ebs: fatal: [default]: UNREACHABLE! => {
amazon-ebs: "changed": false,
amazon-ebs: "msg": "Failed to connect to the host via ssh: OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/nell/.ssh/config\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug1: Control socket \"/Users/nell/.ansible/cp/e79c318a61\" does not exist\r\ndebug2: resolving \"localhost\" port 8122\r\ndebug2: ssh_connect_direct\r\ndebug1: Connecting to localhost [::1] port 8122.\r\ndebug2: fd 5 setting O_NONBLOCK\r\ndebug1: connect to address ::1 port 8122: Connection refused\r\ndebug1: Connecting to localhost [127.0.0.1] port 8122.\r\ndebug2: fd 5 setting O_NONBLOCK\r\ndebug1: connect to address 127.0.0.1 port 8122: Operation timed out\r\nssh: connect to host localhost port 8122: Operation timed out",
amazon-ebs: "unreachable": true
amazon-ebs: }
amazon-ebs:
amazon-ebs: PLAY RECAP *********************************************************************
amazon-ebs: default : ok=0 changed=0 unreachable=1 failed=0 skipped=0 rescued=0 ignored=0
amazon-ebs:
2020/10/02 18:57:28 [INFO] (telemetry) ending ansible
I'm not 100% sure, but you may need to set use_proxy: true when using SSM, because there's no IP address that Ansible can reach directly over SSH.
Are you able to run Ansible against the instance _outside of Packer_ through your SSM client?
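For clarity, the toggle in question is a provisioner-level option; a minimal sketch with the other settings elided (the playbook path is just the one from the template above):

{
  "type": "ansible",
  "use_proxy": true,
  "user": "ubuntu",
  "playbook_file": "../../test_playbook.yml"
}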
Hello @SwampDragons, thank you for your quick reply! I'm also not sure whether use_proxy: true should be set. As far as I remember, it does not work with Ansible yet (https://github.com/hashicorp/packer/issues/9034#issuecomment-624753068). But anyway... I uploaded the logs with use_proxy: true here: https://gist.github.com/odbaeu/737d115449f8643203378d429f542d00
I continued testing... there's one thing that does not seem right. For testing purposes I configured the shell provisioner to pause for a while, but I'm not able to establish an SSH connection through the tunnel which Packer opens for its provisioners.
Here's the config:
"builders": [
{
...
"ssh_interface": "session_manager",
"session_manager_port": "8122",
...
}
]
"provisioners": [
{
"type": "shell",
"inline": [
"echo Connected via SSM at '{{build `User`}}@{{build `Host`}}:{{build `Port`}}'",
"sleep 520"
]
}
]
I try to open a connection with ssh -i some_key.pem -p 8122 ubuntu@localhost, which results in a timeout. I can telnet to port 8122 successfully, but SSH does not work. If I understood the mechanism correctly, this should work.
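One way to check what's on the other end of the tunnel (my own suggestion, not something tried in this thread): a healthy sshd sends its version banner as soon as the TCP connection is made, so netcat should print it immediately:

nc 127.0.0.1 8122
# expected from a working tunnel: something like
# SSH-2.0-OpenSSH_8.2p1 Ubuntu-4ubuntu0.1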
I don't quite understand why we need to establish an SSH connection in the first place (builders block). SSM is publicly reachable, and Ansible could just SSH into the AWS instance via SSM.
It's an architectural thing. Configuring that SSH connection is how most of the provisioners are able to connect to the builder. Ansible is a bit of a special snowflake, since it manages its own SSH connection internally. However, because of Packer's plugin architecture, Packer still expects a "communicator" (in this case the SSH communicator) to be defined regardless of what plugin is used.
If use_proxy is false, the Ansible plugin pulls information out of that communicator to figure out how to set up its own connection settings. If use_proxy is true, the Ansible plugin sets up a proxy to send Ansible requests through, and the proxy pipes commands through the SSH communicator. This is useful for instances that don't actually have an IP address, but it adds complexity, and it has broken behavior for Ansible > 2.8 because of some kind of issue with the proxy and Ansible's pipelining behavior.
If you want, you can always set "communicator": "none", launch the SSM service yourself, and run Ansible via the shell-local provisioner. Various pieces of connection info can be found using the "build" template engine: https://www.packer.io/docs/templates/engine#build
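A rough sketch of that alternative, assuming you start the SSM port-forwarding session yourself before the build reaches this provisioner (key path, local port, and playbook name are illustrative, not from this thread):

"provisioners": [
  {
    "type": "shell-local",
    "inline": [
      "ansible-playbook -i 'localhost,' -u ubuntu -e ansible_port=8122 -e ansible_ssh_private_key_file=~/.ssh/your-key.pem your_playbook.yml"
    ]
  }
]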
Hope that context helps! @nywilken said he'd take a closer look at this on Monday for you :)
Thank you for your detailed explanations! I think the real problem is that the tunneled port (here: 8122) does not work.
The communicator uses SSM to establish a connection to the EC2 instance and opens port 8122. The provisioner should then connect to localhost:8122. (Just to verify that I understood this correctly.)
I can see the open port 8122 with netstat, but I cannot connect to it, neither with ssh nor with Ansible. I think this is the reason why Ansible reports "UNREACHABLE".
I don't quite understand why the shell provisioner can connect. Maybe that's because it is some sort of internal Go SSH thing and not an external process like Ansible or OpenSSH.
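For reference, the tunnel the communicator opens should be equivalent to starting a port-forwarding session by hand with the AWS CLI (instance ID illustrative):

aws ssm start-session --target i-0123456789abcdef0 \
  --document-name AWS-StartPortForwardingSession \
  --parameters '{"portNumber":["22"],"localPortNumber":["8122"]}'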
Meanwhile I'll test your workaround with "communicator": "none".
It seems to me that "communicator": "none" does not work, because amazon-ebs wants to connect to the instance before starting any provisioner. The SSH connection established by the communicator won't be used by Ansible, but that's fine for me. Maybe I have to use the shell provisioner to do things before or after Ansible, like printing build information, etc.
The following solution works and is quite convenient.
Here is my configuration, I'll comment it below:
{
  "variables": {
    "aws_access_key": "",
    "aws_secret_key": "",
    "timestamp": "{{isotime \"2006-01-02\"}}",
    "iam_profile": "nell-ssm"
  },
  "sensitive-variables": ["aws_access_key", "aws_secret_key"],
  "builders": [
    {
      "type": "amazon-ebs",
      "ssh_pty": true,
      "access_key": "{{user `aws_access_key`}}",
      "secret_key": "{{user `aws_secret_key`}}",
      "region": "eu-central-1",
      "instance_type": "t3a.nano",
      "source_ami_filter": {
        "filters": {
          "virtualization-type": "hvm",
          "name": "ubuntu/images/hvm-ssd/ubuntu*20.04*amd64*server*",
          "root-device-type": "ebs"
        },
        "owners": ["099720109477"],
        "most_recent": true
      },
      "communicator": "ssh",
      "ssh_interface": "session_manager",
      "ssh_username": "ubuntu",
      "ssh_timeout": "5m",
      "iam_instance_profile": "{{user `iam_profile`}}",
      "ami_name": "dummy {{timestamp}}"
    }
  ],
  "provisioners": [
    {
      "type": "ansible",
      "groups": ["dummy_group"],
      "user": "ubuntu",
      "use_proxy": false,
      "sftp_command": "/usr/libexec/openssh/sftp-server -e",
      "playbook_file": "../../site.yml",
      "inventory_file_template": "{{ .HostAlias }} ansible_host={{ .ID }} ansible_user={{ .User }}\n"
    }
  ]
}
inventory_file_template: I used the default template and replaced ansible_host={{ .Host }} with ansible_host={{ .ID }}. This writes the instance ID (like i-08bf4a7ccd2657d38) into the inventory file. It requires ~/.ssh/config to be configured correctly. Alternatively you can use:
"inventory_file_template": "{{ .HostAlias }} ansible_host={{ .ID }} ansible_user={{ .User }} ansible_ssh_common_args='-o StrictHostKeyChecking=no -o ProxyCommand=\"sh -c \\\"aws ssm start-session --target %h --document-name AWS-StartSSHSession --parameters portNumber=%p\\\"\"'\n"
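For completeness, the ~/.ssh/config piece this relies on is the SSM ProxyCommand pattern from the AWS documentation, which proxies SSH to instance IDs through Session Manager:

host i-* mi-*
    ProxyCommand sh -c "aws ssm start-session --target %h --document-name AWS-StartSSHSession --parameters 'portNumber=%p'"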
This is great! We'll tinker with this, and if we can reproduce it we'll make sure to document this as a way to use Ansible with SSM. Thanks for updating.
Hi there @odbaeu thanks for walking us through the steps you've taken to try and resolve this issue. I'm looking into this a little deeper to see what might be going on when using Ansible with SSM.
You mentioned not being able to connect via SSH outside of Packer via ssh -i <key> -p #### ubuntu@localhost. This should work as you expect, because all Packer is doing is setting up a PortForwarding session, similar to the workaround you posted with Ansible.
My first thought on why SSH might be failing is that localhost is not set to 127.0.0.1.
What entry do you have for localhost in your local hosts file? Could you try ssh -v -i <key> -p #### ubuntu@localhost to see what address SSH is trying to connect to, please?
Looking at the Ansible log output, it looks like localhost might be configured for an IPv6 address as well as 127.0.0.1, with the IPv6 entry for localhost listed first.
Connecting to localhost [::1] port 8122.\r\ndebug2: fd 5 setting O_NONBLOCK\r\ndebug1: connect to address ::1 port 8122: Connection refused\r\ndebug1: Connecting to localhost [127.0.0.1] port 8122.\r\ndebug2: fd 5 setting O_NONBLOCK\r\ndebug1: connect to address 127.0.0.1 port 8122: Operation timed out\r\nssh: connect to host localhost port 8122: Operation timed out\r\n')
As an experiment I updated my host file from
⇶ cat /etc/hosts
127.0.0.1 localhost
127.0.1.1 <redacted>
# The following lines are desirable for IPv6 capable hosts
::1 localhost ip6-localhost ip6-loopback
ff02::1 ip6-allnodes
to
⇶ cat /etc/hosts
::1 localhost ip6-localhost ip6-loopback
127.0.0.1 localhost
127.0.1.1 <redacted>
# The following lines are desirable for IPv6 capable hosts
ff02::1 ip6-allnodes
And confirmed that Packer is still able to establish a connection, but that I am no longer able to connect via SSH outside of Packer. If your hosts file is set up so that the IPv6 address comes first, could you check whether changing the order in the hosts file changes the connection for Ansible, please?
Hello again, I spent some more time looking into this issue and found that Ansible will create an implicit target for localhost if localhost is not defined in the inventory file.
https://docs.ansible.com/ansible/latest/inventory/implicit_localhost.html
I believe the implicit localhost target might be the issue here, as setting the transport to local for the Ansible provisioner allows my test build to make the connection via SSM with "use_proxy": false.
Can you tell me if adding "--connection=local" as an extra argument to the provisioner fixes the issue for you?
"provisioners": [
{
"type": "ansible",
"groups": ["dummy_group"],
"user": "ubuntu",
"extra_arguments": ["--connection=local"],
"use_proxy": false,
"sftp_command": "/usr/libexec/openssh/sftp-server -e",
"playbook_file": "../../site.yml",
"inventory_file_template": "{{ .HostAlias }} ansible_host={{ .ID }} ansible_user={{ .User }}\n"
}
]
Thanks!
> This is great! We'll tinker with this, and if we can reproduce it we'll make sure to document this as a way to use Ansible with SSM. Thanks for updating.
Just want to point out that I do intend to test this and document it as an alternative option. But I would like to understand why the connection is failing in the first place.
If setting the transport to local, as specified in my previous comment, works, I will make sure to include that in the updated documentation as well.
Thanks again.
Hi @nywilken! I'm sorry, I had too many things going on yesterday...
I used the following provisioner (rest is the same as above):
"provisioners": [
{
"type": "shell",
"inline": [
"echo Connected via SSM at '{{build `User`}}@{{build `Host`}}:{{build `Port`}}'",
"sleep 5000"
]
}
]
The sleep 5000 gives me time for testing.
Once the Packer build instance is up and running...
==> amazon-ebs: Prevalidating any provided VPC information
==> amazon-ebs: Prevalidating AMI Name: dummy 2020-10-07_06-39-01
amazon-ebs: Found Image ID: ami-00caf1798495a2300
==> amazon-ebs: Creating temporary keypair: packer_5f7d6285-5270-dc81-e74a-51e6ef97ce41
==> amazon-ebs: Creating temporary security group for this instance: packer_5f7d6287-802b-4e04-cfd4-855fc0ee3c55
==> amazon-ebs: Creating temporary instance profile for this instance: packer-5f7d6287-369f-693e-192a-6277c76327aa
==> amazon-ebs: Creating temporary role for this instance: packer-5f7d6287-369f-693e-192a-6277c76327aa
==> amazon-ebs: Attaching policy to the temporary role: packer-5f7d6287-369f-693e-192a-6277c76327aa
==> amazon-ebs: Launching a source AWS instance...
==> amazon-ebs: Adding tags to source instance
amazon-ebs: Adding tag: "Name": "Packer Builder"
amazon-ebs: Instance ID: i-0aeb68bb12f2cb7ae
==> amazon-ebs: Waiting for instance (i-0aeb68bb12f2cb7ae) to become ready...
==> amazon-ebs: Waiting 10s for establishing the SSM session...
amazon-ebs: PortForwarding session "i-0aeb68bb12f2cb7ae" has been started
==> amazon-ebs: Using ssh communicator to connect: localhost
==> amazon-ebs: Waiting for SSH to become available...
==> amazon-ebs: Connected to SSH!
==> amazon-ebs: Provisioning with shell script: /var/folders/b7/5jfym81d41qgc7g9h7f9tlgh0000gp/T/packer-shell297502544
amazon-ebs: Connected via SSM at ubuntu@localhost:8122
I try to open an SSH connection via port 8122:
ssh -p 8122 127.0.0.1
(no response)
I used `tcpdump` to investigate the network traffic (started listening before the ssh command):
sudo tcpdump -n -i lo0 port 8122
08:44:48.275429 IP 127.0.0.1.51394 > 127.0.0.1.8122: Flags [S], seq 515758059, win 65535, options [mss 16344,nop,wscale 6,nop,nop,TS val 868596415 ecr 0,sackOK,eol], length 0
08:44:48.275585 IP 127.0.0.1.8122 > 127.0.0.1.51394: Flags [S.], seq 3776112844, ack 515758060, win 65535, options [mss 16344,nop,wscale 6,nop,nop,TS val 868596415 ecr 868596415,sackOK,eol], length 0
08:44:48.275599 IP 127.0.0.1.51394 > 127.0.0.1.8122: Flags [.], ack 1, win 6379, options [nop,nop,TS val 868596415 ecr 868596415], length 0
08:44:48.275609 IP 127.0.0.1.8122 > 127.0.0.1.51394: Flags [.], ack 1, win 6379, options [nop,nop,TS val 868596415 ecr 868596415], length 0
08:44:48.276435 IP 127.0.0.1.51394 > 127.0.0.1.8122: Flags [P.], seq 1:22, ack 1, win 6379, options [nop,nop,TS val 868596416 ecr 868596415], length 21
08:44:48.276454 IP 127.0.0.1.8122 > 127.0.0.1.51394: Flags [.], ack 22, win 6379, options [nop,nop,TS val 868596416 ecr 868596416], length 0
As already mentioned in a previous post, I can establish a telnet session:
telnet 127.0.0.1 8122
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
BTW: IPv6 does not work either:
ssh -p 8122 ::1
ssh: connect to host ::1 port 8122: Connection refused
Hi @odbaeu, no worries. Thank you for the follow-up and for your help here. I believe I figured out the issue with Ansible not being able to connect.
Could you tell me if you still run into errors if you update your ansible provisioner to look like the following?
"provisioners": [
{
"type": "ansible",
"groups": ["dummy_group"],
"user": "ubuntu",
"extra_arguments": ["--connection=local"],
"use_proxy": false,
"sftp_command": "/usr/libexec/openssh/sftp-server -e",
"playbook_file": "../../site.yml",
"inventory_file_template": "{{ .HostAlias }} ansible_host={{ .ID }} ansible_user={{ .User }}\n"
}
]
Hey @nywilken, emmm... :-) no, that does not really solve the problem :-). The ansible playbook is being executed on my local machine.
- hosts: dummy_group
  tasks:
    - name: debug message
      debug:
        msg: "asdf"
    - name: touch file
      file:
        path: /tmp/asdf
        state: touch
Afterwards I have a new file on my Mac:
ls /tmp/asdf
/tmp/asdf
I think the key to this issue is that I can't connect to the local port 8122. Maybe that's a Mac issue, or an issue with the Go library. I've never before had the problem that I can't use a local port which was opened via SSH tunneling.
UPDATE: It does not seem to be a Mac issue. I reproduced the error in my Ubuntu 20.04 VM.
I did some more digging... I tested whether AWS Session Manager may be the cause, but I was able to open an SSH session through all of the following tunnels (port 8222):
# private hostname of the packer instance:
ssh -L 8222:ip-172-31-3-48.eu-central-1.compute.internal:22 -i ~/.ssh/pipestack-test.pem ubuntu@i-073b779f7b612369e
ssh -L 8222:127.0.0.1:22 -i ~/.ssh/pipestack-test.pem ubuntu@i-073b779f7b612369e
ssh -L 8222:localhost:22 -i ~/.ssh/pipestack-test.pem ubuntu@i-073b779f7b612369e
Tunnel test:
ssh -i ~/.ssh/test.pem -p 8222 ubuntu@127.0.0.1
Hi @odbaeu, thanks for the details. I now understand the local connection setting better. It probably would've helped if I had a task that actually modified the instance to confirm it was working :facepalm:. Thanks for checking it.
Given your new information, coupled with your reports of it failing on Mac and Ubuntu, I thought that maybe SSM is choking on multiple connections. So I revisited my test and found that I am able to connect because I am connecting to the remote instance before Packer actually makes the SSH connection to the remote host; I'm using packer build -debug template.json, which allows me to step through the build and make the connection once the SSM tunnel is open.
I tested your sleep and found that I too am not able to connect, which suggests that SSM is not allowing the connection. I retested and found that SSH hangs when there is already an active connection; specifying a timeout, like the Ansible provisioner does, will eventually time out the SSH attempt. (Screenshot omitted.)
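Something like the following, mirroring the ConnectTimeout the Ansible provisioner passes, eventually gives up instead of hanging indefinitely (key path and port are the ones from this thread; the flag is standard OpenSSH):

ssh -o ConnectTimeout=10 -i ~/.ssh/pipestack-test.pem -p 8122 ubuntu@127.0.0.1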
I then found this post on the community forums: https://forums.aws.amazon.com/thread.jspa?threadID=314882
For the record, I am using v1.1.54.0 of the AWS session-manager-plugin.
With all that being said, your workaround of using an interactive session for SSM with Ansible seems to be the way to go here. I'm going to update the documentation to include your workaround for using Ansible with SSM and use_proxy = false.
Thanks for taking the time to answer my questions as I tried to debug the situation, and for your help in finding the path forward.
Do you use SSM's default port forwarding? As stated in the forum thread you posted, it works when the tunnel is opened by ssh. I could reproduce this statement:
Open ssh connection via SSM and forward local port 8222 to 22 of the packer instance:
ssh -L 8222:localhost:22 -i ~/.ssh/pipestack-test.pem ubuntu@i-0c62b510ce8e26d77
Afterwards I can use two sessions that connect to 127.0.0.1:8222 at the same time.
But anyway... the workaround is pretty good. When Amazon fixes this problem, we can start using the communicator's tunnel again.
Thank you for your quick responses! I don't know any Git project that responds as quickly as you do.