Tested with Terraform v0.9.2 and v0.9.3.
```hcl
resource "digitalocean_droplet" "xxx" {
  name     = "${var.do_name}"
  image    = "${var.do_image}"
  region   = "${var.do_region}"
  size     = "${var.do_size}"
  ssh_keys = "${var.do_ssh_keys}"

  provisioner "remote-exec" {
    inline = [...]

    connection {
      type        = "ssh"
      user        = "${var.ssh_user}"
      private_key = "${file(var.ssh_key)}"
      agent       = true
      timeout     = "1m"
    }
  }
}
```
`tail -23 infra.log` outputs:
```
....
2017/04/18 06:26:03 [DEBUG] root: eval: *terraform.EvalWriteState
2017/04/18 06:26:03 [DEBUG] root: eval: *terraform.EvalApplyProvisioners
2017/04/18 06:26:03 [DEBUG] root: eval: *terraform.EvalIf
2017/04/18 06:26:03 [DEBUG] root: eval: *terraform.EvalWriteState
2017/04/18 06:26:03 [DEBUG] root: eval: *terraform.EvalWriteDiff
2017/04/18 06:26:03 [DEBUG] root: eval: *terraform.EvalApplyPost
2017/04/18 06:26:03 [ERROR] root: eval: *terraform.EvalApplyPost, err: 1 error(s) occurred:
* Failed to read key "-----BEGIN RSA PRIVATE KEY-----\n ...............\n-----END RSA PRIVATE KEY-----\n": password protected keys are not supported. Please decrypt the key prior to use.
2017/04/18 06:26:03 [ERROR] root: eval: *terraform.EvalSequence, err: 1 error(s) occurred:
* Failed to read key "-----BEGIN RSA PRIVATE KEY-----\n.............
2017/04/18 06:26:03 [DEBUG] dag/walk: upstream errored, not walking "digitalocean_record.owner"
2017/04/18 06:26:03 [DEBUG] dag/walk: upstream errored, not walking "digitalocean_record.admin"
2017/04/18 06:26:03 [DEBUG] dag/walk: upstream errored, not walking "digitalocean_record.XXXX"
2017/04/18 06:26:03 [DEBUG] dag/walk: upstream errored, not walking "meta.count-boundary (count boundary fixup)"
2017/04/18 06:26:03 [DEBUG] plugin: waiting for all plugin processes to complete...
2017/04/18 06:26:03 [DEBUG] plugin: terraform: local-exec-provisioner (internal) 2017/04/18 06:26:03 [DEBUG] plugin: waiting for all plugin processes to complete...
2017/04/18 06:26:03 [DEBUG] plugin: terraform: remote-exec-provisioner (internal) 2017/04/18 06:26:03 [DEBUG] plugin: waiting for all plugin processes to complete...
2017/04/18 06:26:03 [DEBUG] plugin: terraform: digitalocean-provider (internal) 2017/04/18 06:26:03 [DEBUG] plugin: waiting for all plugin processes to complete...
2017/04/18 06:26:03 [DEBUG] plugin: /home/linuxer/compilethegoodness/terraform/terraform: plugin process exited
2017/04/18 06:26:03 [DEBUG] plugin: /home/linuxer/compilethegoodness/terraform/terraform: plugin process exited
2017/04/18 06:26:03 [DEBUG] plugin: /home/linuxer/compilethegoodness/terraform/terraform: plugin process exited
```
It should have connected to the remote host after `terraform apply`.
It did not.

@IOAyman You need to use the SSH key ID of your public key:
https://www.terraform.io/docs/providers/do/r/droplet.html#ssh_keys
Upload your public key using this resource:
https://www.terraform.io/docs/providers/do/r/ssh_key.html
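For reference, uploading the public key and passing its ID along could look roughly like this (a sketch only; resource and file names are illustrative, using the 0.9-era interpolation syntax):

```hcl
# Register the local public key with DigitalOcean (names are illustrative)
resource "digitalocean_ssh_key" "default" {
  name       = "my-key"
  public_key = "${file("~/.ssh/id_rsa.pub")}"
}

resource "digitalocean_droplet" "example" {
  # ...
  # Reference the uploaded key's ID rather than the key material itself
  ssh_keys = ["${digitalocean_ssh_key.default.id}"]
}
```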
@willejs, yes, I have already done that. As you can see in the code snippet, I am passing the key ID in a variable:
`ssh_keys = "${var.do_ssh_keys}"`
Thanks though :)
Hi @IOAyman,
The actual failure here is shown in the log messages:

> password protected keys are not supported. Please decrypt the key prior to use
While you could decrypt the private key before loading it, it is preferable to let your ssh-agent handle this for you.
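With an agent-based setup, the `connection` block can omit `private_key` entirely. A minimal sketch, reusing the variable names from the snippet above:

```hcl
connection {
  type    = "ssh"
  user    = "${var.ssh_user}"
  agent   = true   # delegate authentication to the running ssh-agent
  timeout = "1m"
  # no private_key here: Terraform won't try to load the encrypted key file
}
```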
Hello @jbardin,
Actually, I forgot to mention that my key is already loaded (decrypted) into ssh-agent; I have already used it to log in elsewhere.
In that case, does it work if you remove the private_key field from the configuration? The private_key failure may be causing all auth to fail, preventing the key from being loaded from the agent.
@IOAyman specifying `private_key` means that Terraform will attempt to load that file in the SSH invocation and try this key first; since it can't use an encrypted key, it fails before it ever gets to your agent.
As @jbardin says, if you want to use encrypted keys then you'll have to rely on your agent; otherwise provide a path to a key that is decrypted, e.g.:

```shell
openssl rsa -in ~/.ssh/_KEY_ -out ~/.ssh/_KEY_.insecure && chmod 0400 ~/.ssh/_KEY_.insecure
```
@jbardin, You were right! I did as you said; I commented out the private_key field and it worked. Thanks!
Thank you as well @mengesb :)
But it would've been better if it had tried the agent before failing all authentication. Try all the options before failing everything :/
@IOAyman,
Thanks for the feedback! I'll look into whether this can be avoided in the code, or if we just need to better document the behavior.
I think the private_key and agent options could be treated as mutually exclusive (conflicting) here, since there's not really a good reason to provide them both at once. I expect that's not done already just because the way connection blocks are parsed is a bit complex, spread over a few different layers of Terraform's internal architecture. But I think it should be doable.
I'm getting the same `password protected keys are not supported. Please decrypt the key prior to use.` error with the following (with Scaleway as the provider):
```hcl
resource "scaleway_ip" "ip" {
  server = "${scaleway_server.mywebserver.id}"
}

resource "scaleway_server" "mywebserver" {
  name  = "mywebserver"
  image = "${data.scaleway_image.ubuntu.id}"
  type  = "VC1S"

  provisioner "remote-exec" {
    inline = [...]

    connection {
      type        = "ssh"
      private_key = "${file("~/.ssh/id_rsa")}"
      user        = "root"
      timeout     = "2m"
    }
  }
}
```
Commenting out the private_key field does not work; the SSH connection fails.
`ssh-add -l` shows the correct key added to the authentication agent.
I am using version 0.9.8.
Am I doing anything wrong there?
> Am I doing anything wrong there?

Yes... :smile:
Looking at your code block, I don't see a reference to a `host`, so I don't think it knows what it's connecting to. Also, just as a point: `type` defaults to `ssh` and `user` defaults to `root`, so those lines are unnecessary.
Is it reasonable, @apparentlymart, to update the connection-block code to require the `host` attribute?
You could always remove the passphrase from the SSH key (refer to `ssh-keygen` for that syntax) and protect the resulting key via other methods, such as a LUKS-encrypted store. I tend to do it at the time and destroy the unprotected key once done, or simply use ssh-agent (I have circumstances where ssh-agent is disabled, which is why I do this).
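Removing the passphrase with `ssh-keygen -p` might look like the following. This is a self-contained demonstration with a throwaway key; the `/tmp/demo_key` path and `old-passphrase` value are illustrative, and in practice you would operate on a copy of your real key:

```shell
# Start clean, then generate a passphrase-protected demo key
rm -f /tmp/demo_key /tmp/demo_key.pub
ssh-keygen -q -t rsa -b 2048 -N "old-passphrase" -f /tmp/demo_key

# -p changes the passphrase in place: -P supplies the old one, -N "" sets none
ssh-keygen -q -p -P "old-passphrase" -N "" -f /tmp/demo_key

# The key now loads without prompting; derive its public half to confirm
ssh-keygen -y -f /tmp/demo_key > /dev/null && echo "key is now unencrypted"
```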
@mengesb Thanks for the feedback. However, when I comment out the private_key field it tries to connect to the host but times out, so somehow it seems to know which host to connect to?
I also noticed that I forgot to add `agent = true`. Is that a problem?
Adding `agent = true` did not solve the issue, but I looked at the Scaleway control panel and noticed that the server does not have a public IP attached to it, hence why it fails to connect via SSH.
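One possible fix, sketched under the assumption that the 0.9-era Scaleway provider supports a `dynamic_ip_required` argument and a `public_ip` attribute on `scaleway_server` (unverified here), is to request a public IP at creation time and point the connection at it explicitly:

```hcl
resource "scaleway_server" "mywebserver" {
  name                = "mywebserver"
  image               = "${data.scaleway_image.ubuntu.id}"
  type                = "VC1S"
  dynamic_ip_required = true   # assumption: asks for a public IP at creation

  provisioner "remote-exec" {
    inline = [...]

    connection {
      host  = "${self.public_ip}"  # tell the provisioner where to connect
      agent = true
    }
  }
}
```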