I'm using Bitbucket Pipelines to deploy EKS and other infrastructure into AWS using Terraform. When I run these steps locally in the exact same Docker image, everything works fine:
$ docker run -it -v $PWD:/code --entrypoint sh --memory=4g --memory-swap=4g --memory-swappiness=0 hashicorp/terraform@sha256:5e19b9bab0b6d079cae8822be22cd7010f65177356600154b77fc4fc81bdde31
$ export AWS_ACCESS_KEY_ID=<...>
$ export AWS_SECRET_ACCESS_KEY=<...>
$ cd /code/envs/dev
$ terraform init
$ terraform plan --var-file default.tfvars
However, when running this in the pipeline with the following step I get the error
Error: Failed to configure: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
step:
  name: Plan for dev
  image: hashicorp/terraform@sha256:5e19b9bab0b6d079cae8822be22cd7010f65177356600154b77fc4fc81bdde31 # 0.12.20
  script:
    - cd envs/dev
    - terraform init
    - terraform plan --var-file default.tfvars
  services:
    - docker
Note: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are set in the pipeline variables.
Any ideas on how to reproduce this exactly locally, or how to solve the issue?
---
So I've figured out what the issue was, and found a temporary workaround. Bitbucket Pipelines builds must be running in Kubernetes pods, because the Kubernetes service links are being injected into the build environment as environment variables.
The service-link variables, e.g. KUBERNETES_PORT, KUBERNETES_SERVICE_PORT, KUBERNETES_PORT_443_TCP_ADDR, etc., must trigger a different code path in the Terraform EKS module: the Kubernetes provider switches to in-cluster configuration, which tries to read the service account token from /var/run/secrets/kubernetes.io/serviceaccount/token. That file doesn't exist in the build container, hence the error.
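To reproduce this locally, it should be enough to export the same service-link variables inside the container from the question before planning. The host/port values below are made-up placeholders; it appears to be the presence of the variables, not their values, that triggers the in-cluster path:

```shell
# Simulate the service-link environment a Kubernetes pod would see
# (values are placeholders, not real cluster addresses).
export KUBERNETES_SERVICE_HOST=10.100.0.1
export KUBERNETES_SERVICE_PORT=443
export KUBERNETES_PORT=tcp://10.100.0.1:443
export KUBERNETES_PORT_443_TCP_ADDR=10.100.0.1

# Re-running the plan inside the container should now fail with the
# same "serviceaccount/token: no such file or directory" error:
# terraform plan --var-file default.tfvars
```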
I would suggest that the Bitbucket Pipelines team switch this off in the PodSpec by setting `enableServiceLinks: false`. See the documentation: https://kubernetes.io/docs/reference/generated/kubernetes-api/v1.14/#pod-v1-core
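Until then, one temporary workaround (a sketch, assuming the provider only checks for these variables and falls back to the AWS credentials once they're gone) is to unset the injected variables at the start of the pipeline script:

```shell
# Clear the Kubernetes service-link variables injected into the build pod
# so the Terraform Kubernetes provider doesn't attempt in-cluster config.
# (List assumed from the standard service links for the `kubernetes` service.)
unset KUBERNETES_PORT \
      KUBERNETES_PORT_443_TCP \
      KUBERNETES_PORT_443_TCP_ADDR \
      KUBERNETES_PORT_443_TCP_PORT \
      KUBERNETES_PORT_443_TCP_PROTO \
      KUBERNETES_SERVICE_HOST \
      KUBERNETES_SERVICE_PORT \
      KUBERNETES_SERVICE_PORT_HTTPS

# then run terraform init / terraform plan as before
```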