Terraform v0.7.1
resource "aws_s3_bucket" "bucket_logs" {
  bucket        = "${lower(var.platform)}-${lower(var.environment)}-logs"
  acl           = "log-delivery-write"
  force_destroy = true

  tags {
    Name        = "${var.platform}_${var.environment}_Bucket_Logs"
    Platform    = "${var.platform}"
    Owner       = "${var.owner}"
    Environment = "${var.environment}"
  }
}
resource "aws_s3_bucket" "bucket_file_upload" {
  bucket        = "${lower(var.platform)}-${lower(var.environment)}-file-upload"
  acl           = "private"
  force_destroy = true

  logging {
    target_bucket = "${aws_s3_bucket.bucket_logs.id}"
    target_prefix = "s3-file-upload/"
  }

  tags {
    Name        = "${var.platform}_${var.environment}_Bucket_File_Upload"
    Platform    = "${var.platform}"
    Owner       = "${var.owner}"
    Environment = "${var.environment}"
  }
}
I added the logging block to an existing bucket and expected the bucket to be updated in place. Instead, the bucket would not pick up its logging options until I destroyed it and recreated it via terraform apply.
@elmundio87 Hi, I am unable to reproduce this. I am using the following configuration.
Before adding logging:
resource "aws_s3_bucket" "log_bucket" {
  bucket = "my_tf_log_bucket_8988E221ABEE6AA4"
  acl    = "log-delivery-write"
}

resource "aws_s3_bucket" "b" {
  bucket = "my_tf_test_bucket_8988E221ABEE6AA4"
  acl    = "private"
}
After adding logging:
resource "aws_s3_bucket" "log_bucket" {
  bucket = "my_tf_log_bucket_8988E221ABEE6AA4"
  acl    = "log-delivery-write"
}

resource "aws_s3_bucket" "b" {
  bucket = "my_tf_test_bucket_8988E221ABEE6AA4"
  acl    = "private"

  logging {
    target_bucket = "${aws_s3_bucket.log_bucket.id}"
    target_prefix = "log/"
  }
}
and it's working as expected.
➜ terraform-debug $GOPATH/bin/terraform apply
2016/09/02 10:56:10 [INFO] Terraform version: 0.7.3 dev 96f1aff69365a5ff2d2f56f8f9260e495992d9b5
2016/09/02 10:56:10 [INFO] CLI args: []string{"/Users/anssharma/Code/go/bin/terraform", "apply"}
2016/09/02 10:56:10 [DEBUG] Detected home directory from env var: /Users/anssharma
aws_s3_bucket.log_bucket: Refreshing state... (ID: my_tf_log_bucket_8988E221ABEE6AA4)
aws_s3_bucket.b: Refreshing state... (ID: my_tf_test_bucket_8988E221ABEE6AA4)
aws_s3_bucket.b: Modifying...
  logging.#:                       "0" => "1"
  logging.4077926422.target_bucket: "" => "my_tf_log_bucket_8988E221ABEE6AA4"
  logging.4077926422.target_prefix: "" => "log/"
aws_s3_bucket.b: Still modifying... (10s elapsed)
aws_s3_bucket.b: Modifications complete
I might be having the same problem, or a similar one.
I can confirm the same exact issue that @steveh had. I forgot to set acl = "log-delivery-write" on the target bucket, and terraform apply failed, but Terraform still saved a successful logging.#: "1" entry in the state file.
Any follow-on plan or apply reported no changes, yet the source bucket was not actually configured for logging.
After cleaning up the state and running apply again with the proper ACL, it works as expected.
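For reference, the failure described above stems from the target bucket lacking the log-delivery ACL. A minimal sketch of a working pair, with illustrative bucket names (not the ones from this thread):

```hcl
# Target bucket: S3 server access logging requires the S3 log-delivery
# group to have write access on the target bucket, granted here via the
# canned "log-delivery-write" ACL. Omitting it causes the S3 API to
# reject the PutBucketLogging call.
resource "aws_s3_bucket" "log_bucket" {
  bucket = "example-log-bucket"
  acl    = "log-delivery-write"
}

# Source bucket: this logging block only applies successfully once the
# target bucket above carries the log-delivery-write ACL.
resource "aws_s3_bucket" "b" {
  bucket = "example-source-bucket"
  acl    = "private"

  logging {
    target_bucket = "${aws_s3_bucket.log_bucket.id}"
    target_prefix = "log/"
  }
}
```

The bug in this issue was that when the API call failed, the provider still recorded the logging block in state, so subsequent plans saw no diff.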
Terraform v0.7.13
I can confirm this is still a problem in terraform 0.9.2
I ran into this recently as well and updated my local fork to address it. I've submitted PR #13281 with the fix; I hope that helps.
I merged #13281, thanks!
I'm going to lock this issue because it has been closed for _30 days_ ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.