Granting IAM User Access to Existing EKS Cluster: A Step-by-Step Guide

Abhimanyubajaj
3 min read · Apr 3, 2023

Setting up a Kubernetes cluster using Terraform is a relatively easy task that we discussed in our previous article, which can be found here: https://medium.com/@abhimanyubajaj98/deploying-kubernetes-from-scratch-with-terraform-a-step-by-step-guide-7d628910efd0

However, as you scale, multiple users will need access to the cluster. By default, only the IAM identity that created the EKS cluster has access to it. Creating a new user and granting them EKS permissions, or even full administrator permissions, in IAM is not enough for them to access the already created cluster, because EKS authorizes users through the aws-auth ConfigMap inside the cluster, not through IAM policies alone. To give a new user access to the cluster, follow the steps below:

  1. Create a new IAM user and provide them with an access key and secret key. Make sure they add the AccessKey and SecretKey to their ~/.aws/credentials file; see the sketch after this list.
  2. Access the cluster as the user who created it. To confirm you are pointed at the right cluster, run
kubectl config current-context
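
Regarding step 1, here is a minimal sketch of the credentials entry; the profile name new-eks-user is just a hypothetical placeholder:

# ~/.aws/credentials
# "new-eks-user" is a hypothetical profile name
[new-eks-user]
aws_access_key_id = <AccessKey>
aws_secret_access_key = <SecretKey>

The same entry can also be generated by running aws configure --profile new-eks-user.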

If you are the creator of the cluster and are accessing it for the first time, run the following command:

aws eks update-kubeconfig --region <region> --name <name-of-your-cluster> --profile <aws-profile>

This will add the cluster to your kubeconfig context. Run kubectl config current-context again afterwards, and you will see the region, account ID, and cluster name.
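
By default, update-kubeconfig names the context after the cluster's ARN, so for a cluster in us-east-1 (region used here only as an example) the output will look something like:

arn:aws:eks:us-east-1:<Account-id>:cluster/<name-of-your-cluster>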

3. Modify the aws-auth ConfigMap in the kube-system namespace. To do that, first run the command:

kubectl -n kube-system get configmap aws-auth -o yaml > aws-auth.yaml

This will create a file called aws-auth.yaml in your current working directory. The file will look similar to the sketch below.
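
For reference, on a freshly created cluster the exported ConfigMap typically contains only a mapRoles entry for the worker nodes' IAM role; the role name below is a placeholder, and kubectl will also emit some extra metadata fields that are omitted here:

apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapRoles: |
    - rolearn: arn:aws:iam::<Account-id>:role/<node-instance-role>
      username: system:node:{{EC2PrivateDNSName}}
      groups:
        - system:bootstrappers
        - system:nodes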

To add the new user, add the following mapUsers section under data in this YAML:

mapUsers: |
  - userarn: arn:aws:iam::<Account-id>:user/<User-name>
    username: <User-name>
    groups:
      - system:masters

Once you add this, your aws-auth.yaml file will look something like the sketch below.
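
A minimal sketch of the combined file, reusing the placeholder names from above:

apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapRoles: |
    - rolearn: arn:aws:iam::<Account-id>:role/<node-instance-role>
      username: system:node:{{EC2PrivateDNSName}}
      groups:
        - system:bootstrappers
        - system:nodes
  mapUsers: |
    - userarn: arn:aws:iam::<Account-id>:user/<User-name>
      username: <User-name>
      groups:
        - system:masters

Keep in mind that system:masters maps the user to cluster-admin, so grant it only to users who genuinely need full control of the cluster.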

Once you’ve made these changes to the aws-auth.yaml file, apply it back to the cluster. Editing the local file alone has no effect; the ConfigMap inside the cluster must be updated:

kubectl apply -f aws-auth.yaml
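
To confirm the change landed, you can re-read the ConfigMap from the cluster:

kubectl -n kube-system get configmap aws-auth -o yaml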

That's it! Have your new user run the same update-kubeconfig command with their own AWS profile:

aws eks update-kubeconfig --region <region> --name <name-of-your-cluster> --profile <aws-profile>

This will add the cluster to their kubectl config, located at ~/.kube/config.
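
They can then verify their access with any kubectl command, for example:

kubectl get nodes

Since the user was mapped to system:masters, this should list the cluster's worker nodes without an authorization error.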

For more information, or in case you get stuck, feel free to connect with me on LinkedIn:

https://www.linkedin.com/in/theabhibajaj/

