Eksctl: ssh key not attached to ec2 worker node

Created on 26 Sep 2018 · 8 comments · Source: weaveworks/eksctl

What happened?
eksctl does not attach the requested ssh key to the worker nodes.

What you expected to happen?
eksctl correctly attaches the requested ssh key to the worker nodes.

How to reproduce it?

  • create a key pair in EC2 named 'eks'
  • run eksctl -v5 create cluster --name=eks-cluster --region=eu-west-1 --ssh-public-key=eks
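
One way to confirm that the key was not attached (a sketch; it assumes eksctl's default Name tag on worker instances contains the cluster name, so adjust the filter to match your setup) is to query the workers' KeyName via the AWS CLI:

$ aws ec2 describe-instances --region eu-west-1 \
    --filters "Name=tag:Name,Values=*eks-cluster*" \
    --query "Reservations[].Instances[].KeyName"

With this bug, the workers come back with null instead of "eks".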

Versions
Please paste in the output of these commands:

$ eksctl version
2018-09-26T14:58:33+01:00 [ℹ]  versionInfo = map[string]string{"builtAt":"2018-09-19T16:12:09Z", "gitCommit":"e577962c2a3d25af5dcc7995340339c4cdfee804", "gitTag":"0.1.3"}

$ uname -a
Linux luigi-vm 4.15.0-34-generic #37-Ubuntu SMP Mon Aug 27 15:21:48 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux

$ kubectl version
Client Version: version.Info{Major:"1", Minor:"11", GitVersion:"v1.11.3", GitCommit:"a4529464e4629c21224b3d52edfe0ea91b072862", GitTreeState:"clean", BuildDate:"2018-09-09T18:02:47Z", GoVersion:"go1.10.3", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"10+", GitVersion:"v1.10.3-eks", GitCommit:"58c199a59046dbf0a13a387d3491a39213be53df", GitTreeState:"clean", BuildDate:"2018-09-21T21:00:04Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"linux/amd64"}
Labels: good first issue · help wanted · kind/bug · kind/docs

All 8 comments

The current behaviour is like this:

By default, --ssh-access is false. If you enable it (--ssh-access=true), your ~/.ssh/id_rsa.pub gets imported and attached to the instances, and port 22 is opened to any source address.

You can supply a custom key path, e.g. --ssh-public-key=~/.ssh/eks.pub, or the name of an existing key (as you showed above, --ssh-public-key=eks). But setting --ssh-public-key doesn't automatically toggle --ssh-access, so you just need to add --ssh-access=true.
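
For example, the command from the original report would become (same settings as above, plus the access toggle):

$ eksctl create cluster --name=eks-cluster --region=eu-west-1 \
    --ssh-access=true --ssh-public-key=eks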

However, we should discuss whether setting --ssh-public-key should be expected to enable --ssh-access as well.

@richardcase @karinnainiguez @kschumy @christopherhein what are your thoughts on this?

Thanks @errordeveloper, but please let's always update the online documentation as well.

@errordeveloper I don't necessarily think that specifying --ssh-public-key should automatically enable or imply --ssh-access=true.

This may be a crazy idea but could we:

  • Display a warning/info message if --ssh-public-key is set and if --ssh-access=false.
  • Add a command so that SSH access can be enabled and disabled (a hypothetical sketch follows this list).
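
To make the second point concrete, a purely hypothetical command surface (neither update-ssh-access nor its flags exist in eksctl; this is only an illustration of the idea):

$ eksctl utils update-ssh-access --name=eks-cluster --enable    # hypothetical subcommand
$ eksctl utils update-ssh-access --name=eks-cluster --disable   # hypothetical subcommand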

This tripped me up as well; I was just following the examples on the eksctl homepage. Also, I feel the help output might need a bit more information, such as noting that --ssh-access is a boolean value.

eksctl version                                                                                                                                   
2018-10-05T19:23:06-05:00 [i]  versionInfo = map[string]string{"builtAt":"2018-09-12T15:01:34Z", "gitCommit":"2f553a2c54eb1390a7eb6746ccdc5da106fe518b", "gitTag":"0.1.2"}


eksctl create cluster --help
...
      --ssh-access               control SSH access for nodes
      --ssh-public-key string    SSH public key to use for nodes (import from local path, or use existing EC2 key pair) (default "~/.ssh/id_rsa.pub")
...

Display a warning/info message if --ssh-public-key is set and if --ssh-access=false.

Sounds like a good idea to me! We should also update the docs (per @kylesloan's comment, it sounds like the docs are a little misleading at the moment).

Add a command so that ssh access can be enabled and disabled.

I am not sure, perhaps eventually we could do this, but it's not very easy for an existing nodegroup, as nodes won't just start using a key you've added. I think we should allow adding SSH for separate nodegroups, when we have support for more than one. Also, we should tackle #148 first :)

This default behaviour can be euphemistically described as "misleading". It should be obvious that I need SSH access when I specify an SSH key.

As we have a default value and SSH is not enabled by default, we will have to trigger enabling SSH when the flag is set to a value that differs from the default.

What I mean is:

  1. eksctl create cluster|nodegroup has SSH disabled
  2. eksctl create cluster|nodegroup --allow-ssh enables SSH and uses the default key (~/.ssh/id_rsa.pub)
  3. eksctl create cluster|nodegroup --ssh-public-key=~/.ssh/eks_id_rsa.pub enables SSH and uses the given key
  4. eksctl create cluster|nodegroup --ssh-public-key=~/.ssh/eks_id_rsa.pub --allow-ssh (same as above)

Alternatively, as @richardcase suggested, we could at least warn in case 3 that access won't be enabled.

Add a command so that ssh access can be enabled and disabled.

@richardcase I think that's not as useful now, as one can add a new nodegroup with SSH access when they need to (e.g. to debug an issue with a custom AMI, or to install/run some ad-hoc software on their nodes).
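
For instance, assuming the nodegroup command accepts the same SSH flags as cluster creation (the nodegroup name below is made up for illustration), a throwaway debugging nodegroup might look like:

$ eksctl create nodegroup --cluster=eks-cluster --name=debug-ng \
    --ssh-access=true --ssh-public-key=eks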

This was fixed via #657.
