Packer: SSH timeout for azure build using `custom_managed_image_name`

Created on 11 Jan 2018  ·  7 comments  ·  Source: hashicorp/packer

  • Packer v1.1.3
  • Azure

Building from an Azure Marketplace image works:

{
    "type": "azure-arm",

    "client_id": "{{user `azure_client_id`}}",
    "client_secret": "{{user `azure_client_secret`}}",
    "subscription_id": "{{user `azure_subscription_id`}}",

    "managed_image_resource_group_name": "production-eastus",
    "managed_image_name": "{{user `stack`}}-{{user `image_tag`}}",

    "os_type": "Linux",
    "image_publisher": "OpenLogic",
    "image_offer": "CentOS",
    "image_sku": "6.9",

    "azure_tags": {
        "OS_Version": "Centos",
        "Release": "Latest",
        "Version": "{{user `version`}}",
        "Created": "{{isotime}}",
        "Vendor": "{{user `vendor`}}",
        "Vendor_Url": "{{user `vendor_url`}}",
        "Product": "{{user `product`}}",
        "Stack": "{{user `stack`}}",
        "Commit_ID": "{{user `commit_id`}}"
    },

    "location": "East US",
    "vm_size": "Standard_A2"
}
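
The `{{user ...}}` template functions above assume a matching variables block or `-var-file`. A minimal sketch of what that might look like (all values are hypothetical placeholders, not from the original issue):

```json
{
    "variables": {
        "azure_client_id": "00000000-0000-0000-0000-000000000000",
        "azure_client_secret": "REPLACE_ME",
        "azure_subscription_id": "00000000-0000-0000-0000-000000000000",
        "stack": "base",
        "image_tag": "2018-01-09",
        "version": "1.0.0",
        "vendor": "example",
        "vendor_url": "https://example.com",
        "product": "example-product",
        "commit_id": "abc1234"
    }
}
```

Passed on the command line, this would be something like `packer build -var-file=variables.json template.json`.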

I'm using the artifact created above (`managed_image_name`) as the base image for a new build:

_base-2018-01-09-1515504716 is the artifact created by the first build:_

{
    "type": "azure-arm",

    "client_id": "{{user `azure_client_id`}}",
    "client_secret": "{{user `azure_client_secret`}}",
    "subscription_id": "{{user `azure_subscription_id`}}",

    "managed_image_resource_group_name": "production-eastus",
    "managed_image_name": "{{user `stack`}}-{{user `image_tag`}}",
    "os_type": "Linux",
    "custom_managed_image_name": "base-2018-01-09-1515504716",
    "custom_managed_image_resource_group_name": "production-eastus",

    "azure_tags": {
        "OS_Version": "Centos",
        "Release": "Latest",
        "Version": "{{user `version`}}",
        "Created": "{{isotime}}",
        "Vendor": "{{user `vendor`}}",
        "Vendor_Url": "{{user `vendor_url`}}",
        "Product": "{{user `product`}}",
        "Stack": "{{user `stack`}}",
        "Commit_ID": "{{user `commit_id`}}"
    },

    "location": "East US",
    "vm_size": "Standard_A2"
}

I get this error:

==> azure-arm: Error waiting for SSH: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain

Running in debug mode:

packer: [INFO] Attempting SSH connection...
packer: reconnecting to TCP connection for SSH
packer: handshaking with SSH
packer: handshake error: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
packer: [DEBUG] SSH handshake err: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
packer: [DEBUG] Detected authentication error. Increasing handshake attempts.
packer: [INFO] Attempting SSH connection...

Please assist.

builder/azure docs question

All 7 comments

This is better directed towards the support community, as this isn't a bug in Packer. The community can help with general questions.

Packer needs to be able to communicate with the VM, and it does so using SSH public key authentication. Packer is telling you that it failed to authenticate using the key it generated at build time. This is a custom image, so you will need to determine why that key is not usable.

The debugging docs are probably a good start.

For anyone else who wanders here from Google with this issue - the problem had to do with how Azure does initial boot configuration for its Linux VMs (cloud-init).

I solved it by adding this provisioner to the end of each build, which deprovisions the VM so Azure can re-run its initial configuration on the next boot:

{
  "type": "shell",
  "execute_command": "chmod +x {{ .Path }}; {{ .Vars }} sudo -E sh '{{ .Path }}'",
  "inline": [
    "/usr/sbin/waagent -force -deprovision+user && export HISTSIZE=0 && sync"
  ],
  "inline_shebang": "/bin/sh -x"
}

This still fails for me with Packer 1.2.2:

23:45:13 ==> azure-arm: Getting the VM's IP address ...
23:45:13 ==> azure-arm: -> ResourceGroupName : 'packerrg'
23:45:13 ==> azure-arm: -> PublicIPAddressName : 'pkripb3hlqt7gvk'
23:45:13 ==> azure-arm: -> NicName : 'pkrnib3hlqt7gvk'
23:45:13 ==> azure-arm: -> Network Connection : 'PrivateEndpoint'
23:45:13 ==> azure-arm: -> IP Address : '10.82.72.22'
23:45:13 ==> azure-arm: Waiting for SSH to become available...
23:46:22 ==> azure-arm: Error waiting for SSH: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
23:46:22 ==> azure-arm:
23:46:22 ==> azure-arm: The resource group was not created by Packer, deleting individual resources ...
23:46:22 ==> azure-arm: -> Deployment: pkrdpb3hlqt7gvk
23:46:25 ==> azure-arm: -> Microsoft.Compute/virtualMachines : 'pkrvmb3hlqt7gvk'
23:48:44 ==> azure-arm: -> Microsoft.Network/networkInterfaces : 'pkrnib3hlqt7gvk'

Refer to the _Packer communicator_ documentation.

To solve this problem, you can add two parameters (`ssh_username` and `ssh_private_key_file`) to the builder block:

{
    "type": "azure-arm",

    "client_id": "{{user `azure_client_id`}}",
    "client_secret": "{{user `azure_client_secret`}}",
    "subscription_id": "{{user `azure_subscription_id`}}",

    "managed_image_resource_group_name": "production-eastus",
    "managed_image_name": "{{user `stack`}}-{{user `image_tag`}}",
    "os_type": "Linux",
    "custom_managed_image_name": "base-2018-01-09-1515504716",
    "custom_managed_image_resource_group_name": "production-eastus",

    "ssh_username": "{{user `ssh_user`}}",
    "ssh_private_key_file": "{{user `ssh_private_key_file`}}",

    "azure_tags": {
        "OS_Version": "Centos",
        "Release": "Latest",
        "Version": "{{user `version`}}",
        "Created": "{{isotime}}",
        "Vendor": "{{user `vendor`}}",
        "Vendor_Url": "{{user `vendor_url`}}",
        "Product": "{{user `product`}}",
        "Stack": "{{user `stack`}}",
        "Commit_ID": "{{user `commit_id`}}"
    },

    "location": "East US",
    "vm_size": "Standard_A2"
}

@youhong316 did your image _base-2018-01-09-1515504716_ already have the user you define in _ssh_user_, with its public key added to _authorized_keys_?

I had to create a new VM and add the public key to _authorized_keys_ for one of the accounts. Then the private key entry in the Packer JSON file worked.
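
For anyone following along, the key-pair setup described above can be sketched like this (paths and the key comment are illustrative, not from the issue; the `.ssh` layout is shown against a scratch directory standing in for the build user's home):

```shell
set -e

# Generate a dedicated, passphrase-less key pair for the Packer build.
KEYDIR="$(mktemp -d)"
ssh-keygen -t rsa -b 4096 -f "$KEYDIR/packer_build_key" -N "" -C "packer-build" -q

# On the base VM, the public half goes into the build user's
# authorized_keys with the permissions sshd requires:
mkdir -p "$KEYDIR/.ssh"
chmod 700 "$KEYDIR/.ssh"
cat "$KEYDIR/packer_build_key.pub" >> "$KEYDIR/.ssh/authorized_keys"
chmod 600 "$KEYDIR/.ssh/authorized_keys"
```

The private half (`packer_build_key`) is then what you point `ssh_private_key_file` at, with `ssh_username` set to the account whose `authorized_keys` received the public half.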

I'm going to lock this issue because it has been closed for _30 days_ ⏳. This helps our maintainers find and focus on the active issues.

If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

