_This issue was originally opened by @rjinski as hashicorp/terraform#10373. It was migrated here as part of the provider split. The original body of the issue is below._
Terraform Version: Terraform v0.7.11

Affected Resource(s): `aws_s3_bucket`

Terraform Configuration Files:
provider "aws" {
region = "eu-west-1"
}
provider "aws" {
alias = "central"
region = "eu-central-1"
}
data "aws_iam_policy_document" "foo_policy" {
statement {
effect = "Allow"
principals = {
type = "Service"
identifiers = ["ec2.amazonaws.com"]
}
actions = [
"s3:*"
]
resources = [
"${aws_s3_bucket.foo.arn}"
]
}
}
resource "aws_s3_bucket" "foo" {
provider = "aws.central"
bucket = "foobar-was-once-a-soldier"
}
resource "aws_s3_bucket_policy" "bar" {
provider = "aws.central"
bucket = "${aws_s3_bucket.foo.bucket}"
policy = "${data.aws_iam_policy_document.foo_policy.json}"
}
```
$ terraform import aws_s3_bucket.foo foobar-was-once-a-soldier
aws_s3_bucket.foo: Importing from ID "foobar-was-once-a-soldier"...
aws_s3_bucket.foo: Import complete!
  Imported aws_s3_bucket (ID: foobar-was-once-a-soldier)
  Imported aws_s3_bucket_policy (ID: foobar-was-once-a-soldier)
aws_s3_bucket_policy.foo: Refreshing state... (ID: foobar-was-once-a-soldier)
aws_s3_bucket.foo: Refreshing state... (ID: foobar-was-once-a-soldier)

Import success! The resources imported are shown above. These are
now in your Terraform state. Import does not currently generate
configuration, so you must do this next. If you do not create configuration
for the above resources, then the next `terraform plan` will mark
them for destruction.
```
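Note that the auto-imported policy lands in state under the bucket resource's name, `aws_s3_bucket_policy.foo`, while the configuration declares `aws_s3_bucket_policy.bar`, which is part of why the later plan destroys one and creates the other. A minimal sketch of reconciling the two before planning, assuming a Terraform release that has the `terraform state` subcommands (0.9 and later); the addresses are the ones from this report:

```
# Move the auto-imported policy to the address the configuration actually declares.
$ terraform state mv aws_s3_bucket_policy.foo aws_s3_bucket_policy.bar
```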
```
$ terraform apply
aws_s3_bucket.foo: Refreshing state... (ID: foobar-was-once-a-soldier)
aws_s3_bucket_policy.foo: Refreshing state... (ID: foobar-was-once-a-soldier)
data.aws_iam_policy_document.foo_policy: Refreshing state...
aws_s3_bucket_policy.foo: Destroying...
aws_s3_bucket.foo: Modifying...
  acl:           "" => "private"
  force_destroy: "" => "false"
aws_s3_bucket.foo: Modifications complete
aws_s3_bucket_policy.bar: Creating...
  bucket: "" => "foobar-was-once-a-soldier"
  policy: "" => "{\n \"Version\": \"2012-10-17\",\n \"Statement\": [\n {\n \"Sid\": \"\",\n \"Effect\": \"Allow\",\n \"Action\": \"s3:*\",\n \"Resource\": \"arn:aws:s3:::foobar-was-once-a-soldier\",\n \"Principal\": {\n \"Service\": \"ec2.amazonaws.com\"\n }\n }\n ]\n}"
aws_s3_bucket_policy.bar: Creation complete

Error applying plan:

1 error(s) occurred:

* aws_s3_bucket_policy.foo: Error deleting S3 policy: BucketRegionError: incorrect region, the bucket is not in 'eu-west-1' region
  status code: 301, request id: 797D7E50FDF0A308

Terraform does not automatically rollback in the face of errors.
Instead, your Terraform state file has been partially updated with
any resources that successfully completed. Please address the error
above and apply again to incrementally change your infrastructure.
```
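The 301 BucketRegionError above suggests the delete call for the imported policy went to the default provider's region (eu-west-1) rather than eu-central-1, where the bucket lives. A quick way to double-check which region a bucket is actually in, assuming the AWS CLI is installed (the bucket name is the one from this report):

```
# Should report eu-central-1 as the LocationConstraint for this bucket.
$ aws s3api get-bucket-location --bucket foobar-was-once-a-soldier
```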
I expect that, when importing an S3 bucket in a different region, the provider alias and the import system will work in harmony. The `terraform apply` appears to try to use the imported resource with the default provider rather than the aliased provider defined in the configuration.
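If the Terraform release in use supports the `-provider` flag on `terraform import`, pointing the import at the aliased provider may be a workaround; this is only a sketch under that assumption, not something verified against v0.7.11:

```
# Import the bucket through the eu-central-1 alias instead of the default provider.
$ terraform import -provider=aws.central aws_s3_bucket.foo foobar-was-once-a-soldier
```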
The bucket `foobar-was-once-a-soldier` is in eu-central-1.

Steps to reproduce:
1. `terraform import aws_s3_bucket.foo foobar-was-once-a-soldier`
2. `terraform apply`

Still happening in 0.9.8: when I try to import an S3 bucket with a policy in a different region/alias, I get

* import aws_s3_bucket.BUCKET (id: BUCKET): Error importing AWS S3 bucket policy: BucketRegionError: incorrect region, the bucket is not in 'us-west-1' region
The removal of the automatic aws_s3_bucket_policy resource import during aws_s3_bucket resource import has been merged and will be released in version 3.0.0 of the Terraform AWS Provider, likely in two weeks. Please follow the v3.0.0 milestone to track the progress of that release. After the provider upgrade, you can use the aws_s3_bucket_policy resource's import support to import that resource directly, as with all other Terraform resources that support import. 👍
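After upgrading to v3.0.0, the two resources would presumably be imported separately, roughly as follows (the `aws_s3_bucket_policy` import ID is the bucket name; resource names are taken from the configuration above):

```
$ terraform import aws_s3_bucket.foo foobar-was-once-a-soldier
$ terraform import aws_s3_bucket_policy.bar foobar-was-once-a-soldier
```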
This has been released in version 3.0.0 of the Terraform AWS provider. Please see the Terraform documentation on provider versioning or reach out if you need any assistance upgrading.
For further feature requests or bug reports with this functionality, please create a new GitHub issue following the template for triage. Thanks!
I'm going to lock this issue because it has been closed for _30 days_ ⏳. This helps our maintainers find and focus on the active issues.
If you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. Thanks!