Terraform v0.7.0
Probably all of AWS, observed with S3.
variable "region" {
default = "us-west-2"
}
provider "aws" {
region = "${var.region}"
profile = "fake_profile"
}
resource "aws_s3_bucket" "bucket" {
bucket = "fakebucket-something-test-1"
acl = "private"
}
https://gist.github.com/boompig/f05871140b928ae02b8f835d745158ac
Should successfully log in, then give "noop" text.
Does not read the correct profile from the AWS_PROFILE environment variable. It works if you provide the profile name in the file, though.
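A minimal sketch of the two invocations being compared (assuming the profile line is dropped from the provider block for the first case):
$ AWS_PROFILE=fake_profile terraform plan   # expected: use this profile; actual: it is ignored
$ terraform plan                            # works when the profile is set in the provider block instead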
Try running AWS_PROFILE=your_real_profile terraform apply.
It's working for me under 0.7.1 with no profile set in the provider block. When AWS_PROFILE is not set, it seems to pick a 'random' profile from ~/.aws/credentials.
~/.aws/credentials:
[prod]
aws_access_key_id = ....
aws_secret_access_key = ...
[dev]
aws_access_key_id = ....
aws_secret_access_key = ...
prod has AWS account id 999888666555
dev has AWS account id 123456789012
provider "aws" {
allowed_account_ids = ["123456789012"]
}
$ AWS_PROFILE=dev terraform plan
$ echo $?
0
$ AWS_PROFILE=prod terraform plan
$ echo $?
1
$ terraform plan
.... Same output as with AWS_PROFILE=prod
$ echo $?
1
Same issue with v0.7.7; it just uses the default profile.
File ~/.aws/config:
[default]
region = us-west-2
aws_access_key_id = ...
aws_secret_access_key = ...
[au]
region = ap-southeast-2
Sample .tf:
provider "aws" {
shared_credentials_file = "~/.aws/config"
profile = "au"
}
resource "aws_vpc" "ozzieVPC" {
...
}
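For what it's worth, the usual AWS SDK layout splits these settings across two files: access keys live in ~/.aws/credentials under a plain [au] header (which is also Terraform's default shared_credentials_file), while the region lives in ~/.aws/config under a [profile au] header. A sketch of that layout with the same profile name:
~/.aws/credentials:
[au]
aws_access_key_id = ...
aws_secret_access_key = ...
~/.aws/config:
[profile au]
region = ap-southeast-2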
Same problem here with 0.8.0-beta2.
still an issue, any update on a fix?
Same problem here with the latest 0.8.7
Also having this issue with v0.9.2.
Can't use TF with an assume role configured via environment variables because of this, which makes Kubernetes deployments with kops difficult when using TF as the provisioner (kops outputs provider "aws" {} and expects env-var configuration).
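For setups where the provider block can be edited (unlike the kops-generated one), one possible workaround is to declare the role in the provider configuration rather than via environment variables; a minimal sketch, assuming the provider version supports the assume_role block, with a placeholder role ARN:
provider "aws" {
  region = "us-west-2"

  assume_role {
    role_arn = "arn:aws:iam::123456789012:role/terraform"
  }
}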
Having same issue with 0.9.6
Having same issue with 0.9.8
Having the same issue with 0.9.8. In fact, how do we read environment variables in a tfvars file?
foo = "${env.FOO}"
does not work
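Terraform doesn't interpolate environment variables inside a tfvars file, but it does read input variables from environment variables prefixed with TF_VAR_, which covers the same need. A minimal sketch with the variable name foo from above:
variable "foo" {}
$ export TF_VAR_foo="some value"
$ terraform plan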
Uhh, for now I have a workaround like the one below in the tfvars file:
foo = "@JENKINS_PARAM_FOO@"
doo = "@JENKINS_PARAM_DOO@"
Then, just before we run terraform plan, sed commands like the ones below replace the defined keys with the environment variables set by Jenkins:
sed -i "s;@JENKINS_PARAM_FOO@;$JENKINS_PARAM_FOO;g" ./terraform/dev.tfvars
sed -i "s;@JENKINS_PARAM_DOO@;$JENKINS_PARAM_DOO;g" ./terraform/dev.tfvars
# once the tfvars file has been updated with the needed vars, run plan
terraform plan -state="dev.tfstate" -var-file="dev.tfvars"
But this file with the variables substituted can't stay in the Jenkins workspace (especially if we are injecting AWS keys or RDS passwords; it has to be deleted once the plan is successful), so I added:
#!/bin/bash
cp dev.tfvars jenkins_dev.tfvars

# replace tfvars placeholders with environment variables
sed -i "s;@JENKINS_PARAM_FOO@;$JENKINS_PARAM_FOO;g" ./terraform/jenkins_dev.tfvars
sed -i "s;@JENKINS_PARAM_DOO@;$JENKINS_PARAM_DOO;g" ./terraform/jenkins_dev.tfvars

# use the new file with the substituted vars to plan and apply
terraform plan -state="dev.tfstate" -var-file="jenkins_dev.tfvars"

# remove the var file with passwords once done with plan and apply
rm -f ./terraform/jenkins_dev.tfvars
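Since the final rm won't run if the plan step fails partway, a bash trap placed right after the cp guarantees the temporary file is removed on exit either way; a small sketch using the same path as above:
# clean up the temporary var file on exit, even if plan fails
trap 'rm -f ./terraform/jenkins_dev.tfvars' EXIT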
@usowmyas there are a couple of ways to handle different credential pairs in Jenkins. One option is the CloudBees plugin, combined with Terraform code something like this:
variable "access_key" {
description = "access key for subaccount"
}
variable "secret_key" {
description = "secret key for subaccount"
}
variable "region" {
default = "eu-west-1"
}
provider "aws" {
access_key = "${var.access_key}"
secret_key = "${var.secret_key}"
region = "${var.region}"
forbidden_account_ids = ["940226765273"]
}
And pass them in during the Terraform operation on Jenkins like this:
terraform plan -var access_key="${TEST_AWS_ACCESS_KEY_ID}" -var secret_key="${TEST_SECRET_ACCESS_KEY}"
But if I have several variables passed with -var (say more than 12), we get the error below:
Too many command line arguments. Configuration path expected.
and it looks like we can't use a combination of -var and -var-file.
Also, I tried using the Jenkins credentials plugin, but then we are in fact saving the password. The idea is to have the user enter their AWS credentials/passwords when a Jenkins task is triggered,
well, at least for the more secure stages (prod), if not for the dev and test stages. I guess the Jenkins credentials plugin can be used for dev and test.
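For what it's worth, -var and -var-file can generally be mixed on one command line; in my experience the "Too many command line arguments. Configuration path expected." error usually comes from an unquoted value or a stray positional argument rather than from the number of -var flags. A sketch combining the two, reusing the file and variables from above:
$ terraform plan -var-file="dev.tfvars" -var access_key="${TEST_AWS_ACCESS_KEY_ID}" -var secret_key="${TEST_SECRET_ACCESS_KEY}"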
Hi all,
This issue has been closed and migrated to the AWS provider repository and is no longer being monitored. You should find the link to the new issue above.
I'm going to lock this issue because it has been closed for _30 days_ ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.