0.7.13
resource "aws_kms_key" "my_kms_key" {
  description         = "My KMS Key"
  enable_key_rotation = true
  policy              = <<HEREDOC
{
[my policy]
}
HEREDOC
}
Expected behavior: the resource is created on the first run, with nothing to do on subsequent runs.
Actual behavior: the resource is modified on every run, even though no changes have been made to it.
module.core-main.aws_kms_key.my_kms_key: Modifying...
policy: "{[my policy]\n}\n"
module.core-main.aws_kms_key.my_kms_key: Modifications complete
terraform apply

Hi @FransUrbo
Sorry for the trouble here. We have a nightly test that uses this as the test code:
var kmsTimestamp = time.Now().Format(time.RFC1123)
var testAccAWSKmsKey = fmt.Sprintf(`
resource "aws_kms_key" "foo" {
    description = "Terraform acc test %s"
    deletion_window_in_days = 7
    policy = <<POLICY
{
  "Version": "2012-10-17",
  "Id": "kms-tf-1",
  "Statement": [
    {
      "Sid": "Enable IAM User Permissions",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": "kms:*",
      "Resource": "*"
    }
  ]
}
POLICY
}`, kmsTimestamp)
This doesn't produce a perpetual diff for us. Any chance you could share your policy (minus secrets) so we can try to reproduce it?
Thanks,
Paul
I'm experiencing the same issue on v0.7.13. Here is my resource with its policy:
resource "aws_kms_key" "s3_bucket" {
  description             = "bucket encryption key"
  deletion_window_in_days = 30
  policy                  = <<EOF
{
  "Version": "2012-10-17",
  "Id": "key-consolepolicy-2",
  "Statement": [
    {
      "Sid": "Enable IAM User Permissions",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::123456789000:root"
      },
      "Action": "kms:*",
      "Resource": "*"
    },
    {
      "Sid": "Allow access for Key Administrators",
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "arn:aws:iam::123456789000:user/user1",
          "arn:aws:iam::123456789000:user/user2"
        ]
      },
      "Action": [
        "kms:Create*",
        "kms:Describe*",
        "kms:Enable*",
        "kms:List*",
        "kms:Put*",
        "kms:Update*",
        "kms:Revoke*",
        "kms:Disable*",
        "kms:Get*",
        "kms:Delete*",
        "kms:ScheduleKeyDeletion",
        "kms:CancelKeyDeletion"
      ],
      "Resource": "*"
    },
    {
      "Sid": "Allow use of the key",
      "Effect": "Allow",
      "Principal": {
        "AWS": []
      },
      "Action": [
        "kms:Encrypt",
        "kms:Decrypt",
        "kms:ReEncrypt*",
        "kms:GenerateDataKey*",
        "kms:DescribeKey"
      ],
      "Resource": "*"
    },
    {
      "Sid": "Allow attachment of persistent resources",
      "Effect": "Allow",
      "Principal": {
        "AWS": []
      },
      "Action": [
        "kms:CreateGrant",
        "kms:ListGrants",
        "kms:RevokeGrant"
      ],
      "Resource": "*",
      "Condition": {
        "Bool": {
          "kms:GrantIsForAWSResource": "true"
        }
      }
    }
  ]
}
EOF
}
output "s3_bucket_arn" {
  value = "${aws_kms_key.s3_bucket.arn}"
}

output "s3_bucket_key_id" {
  value = "${aws_kms_key.s3_bucket.key_id}"
}
The policy makes it into the state file but forces a change on every apply. Thanks.
We may be seeing this as well on TF 0.8.1: keys are placed in the Pending Deletion state on some runs, seemingly at random. We do not have key rotation enabled either.
Mine looks almost identical to @makered's example.
I'm seeing this same issue on 0.8.7:
~ module.vault.aws_kms_key.vault_root
policy: "{\"Statement\":[{\"Action\":[\"kms:ReEncrypt*\",\"kms:GenerateDataKey*\",\"kms:Encrypt\",\"kms:DescribeKey\",\"kms:Decrypt\"],\"Effect\":\"Allow\",\"Principal\":{\"AWS\":\"arn:aws:iam::[redacted]:role/outreach-vault-role-staging2\"},\"Resource\":\"*\",\"Sid\":\"1\"},{\"Action\":\"kms:*\",\"Effect\":\"Allow\",\"Principal\":{\"AWS\":\"arn:aws:iam::[redacted]:root\"},\"Resource\":\"*\",\"Sid\":\"AllowAdministration\"}],\"Version\":\"2012-10-17\"}"
=> "{\n \"Version\": \"2012-10-17\",\n \"Statement\": [\n {\n \"Sid\": \"1\",\n \"Effect\": \"Allow\",\n \"Action\": [\n \"kms:ReEncrypt*\",\n \"kms:GenerateDataKey*\",\n \"kms:Encrypt\",\n \"kms:DescribeKey\",\n \"kms:Decrypt\"\n ],\n \"Resource\": \"*\",\n \"Principal\": {\n \"AWS\": \"arn:aws:iam::[redacted]:role/outreach-vault-role-staging2\"\n }\n },\n {\n \"Sid\": \"AllowAdministration\",\n \"Effect\": \"Allow\",\n \"Action\": \"kms:*\",\n \"Resource\": \"*\",\n \"Principal\": {\n \"AWS\": \"arn:aws:sts::[redacted]:root\"\n }\n }\n ]\n}"
It seems that the AWS CLI pulls down the policy JSON with the Version at the end, and then Terraform's JSON marshalling of the policy document adds newlines and puts the Version at the top, which taints the diff.
I'm running into this issue and it seems like the underlying problem is that arrays of principals are being returned in a different order every time, which leads to a diff.
I can apply, then re-run plan, and get output like the following:
Original
--------
"AWS": [
  "arn:aws:iam::123456789012:user/1",
  "arn:aws:iam::123456789012:user/2"
]

New
---
"AWS": [
  "arn:aws:iam::123456789012:user/2",
  "arn:aws:iam::123456789012:user/1"
]
I'm going to lock this issue because it has been closed for _30 days_ ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.