Packer: vmware-iso build remote on ESXi Cannot mount VMware tools ISO

Created on 18 Oct 2016  ·  15 Comments  ·  Source: hashicorp/packer

Using the vmware-iso builder with remote_type esx5, VMware Tools installation fails. The builder tries to install it using vim-cmd vmsvc/tools.install <vmId>, but this fails because the OS installation CD is already attached/inserted.

If you run Packer in debug mode, log into the VM, and unmount the attached ISO, then the VMware Tools installation works. This led me to #1795, which I re-opened, and I was directed to open a new bug.

If I instead install VMware Tools using a provisioning shell script, from an uploaded ISO mounted on a loopback device, the build still fails at the end because the driver's ToolsInstall step fails.
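For reference, that provisioner-based workaround looks roughly like the sketch below, assuming a Linux guest for illustration; the source path, ISO name, and mount point are placeholders, and the installer steps differ for other guest OS types:

{
  "provisioners": [
    {
      "type": "file",
      "source": "iso/vmware-tools-linux.iso",
      "destination": "/tmp/vmware-tools.iso"
    },
    {
      "type": "shell",
      "inline": [
        "sudo mkdir -p /mnt/vmware-tools",
        "sudo mount -o loop /tmp/vmware-tools.iso /mnt/vmware-tools",
        "tar xzf /mnt/vmware-tools/VMwareTools-*.tar.gz -C /tmp",
        "sudo /tmp/vmware-tools-distrib/vmware-install.pl -d",
        "sudo umount /mnt/vmware-tools"
      ]
    }
  ]
}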

If I modify the Packer source so that ToolsInstall returns nil rather than executing the vim-cmd, and have the provisioning shell script install VMware Tools, then the build fails due to another, different issue (#4006, where a script shuts down the VM and the SSH connection terminates abruptly):

starting remote command: echo 'vagrant'|sudo -S shutdown -h now
Remote command exited without exit status or exit signal
...
Build 'vmware-iso' errored: Shutdown command has non-zero exit status

Most helpful comment

I am using Packer 0.12.3 and it seems that it was able to connect the VMware Tools ISO for my Windows guest.

I had to set up a __33m__ __boot_wait__ since that is what it takes to run the __Autounattend.xml__ in my case with a static-IP configuration.

[screenshot: 2017-03-17 1:19:22 pm]

All 15 comments

This is hard as the machine is not networked.

It seems unfair that I should spend hours on reproducing this if you can't even copy the log onto a USB thumb drive etc. and gist it with a link here.

Master is currently somewhat broken due to #4006, but applying the following patch will get you past that:

diff --git a/communicator/ssh/communicator.go b/communicator/ssh/communicator.go
index 2895434..2c95f0e 100644
--- a/communicator/ssh/communicator.go
+++ b/communicator/ssh/communicator.go
@@ -118,7 +118,7 @@ func (c *comm) Start(cmd *packer.RemoteCmd) (err error) {
                                log.Printf("Remote command exited with '%d': %s", exitStatus, cmd.Command)
                        case *ssh.ExitMissingError:
                                log.Printf("Remote command exited without exit status or exit signal.")
-                               exitStatus = -1
+                               exitStatus = 0
                        default:
                                log.Printf("Error occurred waiting for ssh session: %s", err.Error())
                                exitStatus = -1

I'm going to edit the issue with the data when I get it.

I've already made the same patch to work around #4006, but the build failed again because the vagrant post-processor tried to rm -rf /vmfs/volumes/datastore1/output-macos1012-vmware-iso, which failed with: Post-processor failed: open /vmfs/volumes/datastore1/output-macos1012-vmware-iso/disk-flat.vmdk: no such file or directory. I've removed the vagrant post-processor and I'm trying again. It takes about 30 minutes per attempt to build a VM. Once this attempt finishes, whether or not it works, I'll revert my ToolsInstall return nil patch, capture the logs for you, and upload gists of the templates etc.

vagrant post-processor is not supported for VMware ESXi until #3967

I've edited the issue description with links to a gist I uploaded that has the logs from a stock packer v0.10.2.

The main parts are: -

2016/10/17 06:47:26 packer.orig: 2016/10/17 06:47:26 starting remote command: vim-cmd vmsvc/tools.install 128
2016/10/17 06:47:27 packer.orig: 2016/10/17 06:47:27 remote command exited with '1': vim-cmd vmsvc/tools.install 128
Build 'vmware-iso' errored: Couldn't mount VMware tools ISO. Please check the 'guest_os_type' in your template.json.

==> Some builds didn't complete successfully and had errors:
--> vmware-iso: Couldn't mount VMware tools ISO. Please check the 'guest_os_type' in your template.json.

==> Builds finished but no artifacts were created.
2016/10/17 07:00:44 ui error: Build 'vmware-iso' errored: Couldn't mount VMware tools ISO. Please check the 'guest_os_type' in your template.json.
2016/10/17 07:00:44 Builds completed. Waiting on interrupt barrier...
2016/10/17 07:00:44 machine readable: error-count []string{"1"}
2016/10/17 07:00:44 ui error:
==> Some builds didn't complete successfully and had errors:
2016/10/17 07:00:44 machine readable: vmware-iso,error []string{"Couldn't mount VMware tools ISO. Please check the 'guest_os_type' in your template.json."}
2016/10/17 07:00:44 ui error: --> vmware-iso: Couldn't mount VMware tools ISO. Please check the 'guest_os_type' in your template.json.
2016/10/17 07:00:44 ui:
==> Builds finished but no artifacts were created.

This is the main issue. I worked around it by patching the driver: -

diff --git a/builder/vmware/iso/driver_esx5.go b/builder/vmware/iso/driver_esx5.go
index 389f812..5e3d930 100644
--- a/builder/vmware/iso/driver_esx5.go
+++ b/builder/vmware/iso/driver_esx5.go
@@ -140,7 +140,8 @@ func (d *ESX5Driver) ToolsIsoPath(string) string {
 }

 func (d *ESX5Driver) ToolsInstall() error {
-       return d.sh("vim-cmd", "vmsvc/tools.install", d.vmId)
+       //return d.sh("vim-cmd", "vmsvc/tools.install", d.vmId)
+       return nil
 }

With my modified Packer, 0.11.0.dev, after removing the vagrant post-processor step I was able to successfully build a VMware VM for the first time. Thanks for the reference to #3967. I checked, and #3967 is in the source my patches were applied against. Maybe in macos.json I need to change the post-processor type from vagrant to vmware-esx or vmware. I would like to fix this, but it's a separate issue; if need be I'll log a second bug. I've only just hit it and haven't looked into it closely, so I'm not sure whether I'm doing something wrong yet.

I believe I've provided all the requested information.
Can you please remove the need-more-info label.

I logged #4056 for the Vagrant Post-Processor issue I referenced above.

👍 Is there a workaround without recompiling? I guess for now VMware Tools for vmware-iso on esx5 needs to be installed separately (shell provisioner?).

Yes, VMware Tools needs to be installed separately; for Linux-based distros you can install open-vm-tools.
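A rough sketch of such a shell provisioner, assuming a Debian/Ubuntu guest (use yum or dnf on RHEL-family distros):

{
  "provisioners": [
    {
      "type": "shell",
      "inline": [
        "sudo apt-get update",
        "sudo apt-get install -y open-vm-tools"
      ]
    }
  ]
}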

I am using Packer 0.12.3 and it seems that it was able to connect the VMware Tools ISO for my Windows guest.

I had to set up a __33m__ __boot_wait__ since that is what it takes to run the __Autounattend.xml__ in my case with a static-IP configuration.

[screenshot: 2017-03-17 1:19:22 pm]

Thanks for the information, Luis. I'm not sure the same will work for Linux distros; I'll need to do more testing/experimenting.


Created an isolated ESXi 5.5 environment independent from vCenter and will run tests for both Windows and Linux. Will provide more info soon.

Is this still an issue? I'm planning an esxi push for myself next week and want to revisit some of these old issues if necessary.

In my case, setting "tools_upload_flavor": "windows" got the VMware Tools ISO to mount after the Windows installation completes, on Packer 1.5.1 with a remote ESXi host. However, it doesn't automatically install VMware Tools.

Details to complete installation are here: https://github.com/chef/bento/issues/1222#issuecomment-585932633
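The relevant builder fragment is roughly the sketch below; the remote_* values and guest_os_type are placeholders for my environment, and tools_upload_flavor is the setting that mattered:

{
  "builders": [
    {
      "type": "vmware-iso",
      "remote_type": "esx5",
      "remote_host": "esxi.example.com",
      "remote_username": "root",
      "remote_password": "REPLACE_ME",
      "guest_os_type": "windows9srv-64",
      "tools_upload_flavor": "windows"
    }
  ]
}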

Ok, if the tools seem to mount, that's good enough for me. Closing this.

I'm going to lock this issue because it has been closed for _30 days_ ⏳. This helps our maintainers find and focus on the active issues.

If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.
