_This issue was originally opened by @mlimaloureiro as hashicorp/terraform#11723. It was migrated here as part of the provider split. The original body of the issue is below._
Hi Guys,
I would like to request that Terraform support Kinesis Firehose data transformation.
More info here
Thanks!
Even with extended_s3_configuration, processing_configuration should be available on each destination type (elasticsearch / redshift / s3).
http://docs.aws.amazon.com/firehose/latest/APIReference/API_CreateDeliveryStream.html
This is kind of messed up at the moment because redshift doesn't support extended_s3_configuration, only s3_configuration.
The Elasticsearch destination requires the legacy s3_configuration as well.
Judging by the output of the AWS CLI's "aws firehose describe-delivery-stream", the processing configuration should apply to the delivery stream's destination rather than only to "extended_s3_configuration".
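For reference, this is roughly the shape the extended S3 destination already supports. A sketch only; the resource names and ARNs are placeholders, and the Lambda function referenced is hypothetical:

```hcl
# Sketch: delivery stream whose extended S3 destination runs a
# Lambda data-transformation processor. All ARNs are placeholders.
resource "aws_kinesis_firehose_delivery_stream" "example" {
  name        = "example-stream"
  destination = "extended_s3"

  extended_s3_configuration {
    role_arn   = "${aws_iam_role.firehose.arn}"
    bucket_arn = "arn:aws:s3:::example-bucket"

    processing_configuration {
      enabled = "true"

      processors {
        type = "Lambda"

        parameters {
          parameter_name  = "LambdaArn"
          parameter_value = "${aws_lambda_function.transform.arn}"
        }
      }
    }
  }
}
```

The ask in this issue is for the same processing_configuration block to be accepted inside the elasticsearch and redshift destination configurations as well.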
Would love to have this for my ElasticSearch firehose.
Having to manually enable data transformation on our redshift firehoses is a bummer, I'd love to see support for extended_s3_configuration in Redshift firehoses.
I'd also like to see this for redshift. From the API it appears to be set on the redshift destination configuration, so it should be completely independent of any s3 config.
I'm reading through the docs and this issue, and I'm still not clear. I have an Elasticsearch Firehose that I need to recreate with Terraform, one that uses a Lambda to convert syslogs to JSON.
Is there no way to do this in Terraform?
OK, confirming that this doesn't work.
When the destination is elasticsearch, s3_configuration is required.
Hi, is there a way (with Terraform) to configure Firehose so that my data is streamed into a bucket while a Lambda function simultaneously parses the data and stores it in a second bucket? I can't seem to make it work.
Just landed here after hitting a conflict between extended_s3_configuration and s3_configuration while trying the elasticsearch destination:
Error: aws_kinesis_firehose_delivery_stream.test_stream: "extended_s3_configuration": conflicts with s3_configuration ([]map[string]interface {}{map[string]interface {}{"compression_format":"GZIP", "role_arn":"${aws_iam_role.main.arn}", "bucket_arn":"arn:aws:s3:::somebucket/folder", "buffer_size":10, "buffer_interval":400}})
We have also stumbled upon this issue while trying to configure a Firehose delivery stream to transform data before pushing it to a Redshift cluster. When does Terraform plan to support extended_s3_configuration for the Redshift destination?
Is there someone who is available to confirm I'm looking in the correct places in the code? I'd like to add support for this
Is there anything else that we can provide on the commits above (e.g. #3621) that would help to get them reviewed and hopefully merged?
Elasticsearch and Splunk processing configuration support is now merged in and will release with v1.14.0, likely before the end of the week. I need to get a crash prevention PR in for #4033, but can also submit the PR for Redshift afterwards.
Elasticsearch and Splunk processing configuration support has been released in version 1.14.0 of the AWS provider. Please see the Terraform documentation on provider versioning or reach out if you need any assistance upgrading.
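For anyone upgrading, a sketch of how the Elasticsearch destination looks with the newly supported processing_configuration. Note the legacy s3_configuration block is still required for the backup bucket; all resource names and ARNs below are placeholders:

```hcl
# Sketch: Elasticsearch destination with a Lambda processor, as
# supported since v1.14.0 of the AWS provider. Placeholders throughout.
resource "aws_kinesis_firehose_delivery_stream" "es_example" {
  name        = "es-example-stream"
  destination = "elasticsearch"

  # Still required alongside the elasticsearch destination.
  s3_configuration {
    role_arn   = "${aws_iam_role.firehose.arn}"
    bucket_arn = "arn:aws:s3:::example-backup-bucket"
  }

  elasticsearch_configuration {
    domain_arn = "${aws_elasticsearch_domain.example.arn}"
    role_arn   = "${aws_iam_role.firehose.arn}"
    index_name = "example"

    processing_configuration {
      enabled = "true"

      processors {
        type = "Lambda"

        parameters {
          parameter_name  = "LambdaArn"
          parameter_value = "${aws_lambda_function.transform.arn}"
        }
      }
    }
  }
}
```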
Hi,
Is processing configuration planned for the Redshift destination?
Redshift PR submitted: #4251
Redshift support for processing_configuration has been merged and will release with v1.16.0 of the AWS provider, likely in a few days. Thanks everyone!
The last bits to support Redshift have been released in version 1.16.0 of the AWS provider. Please see the Terraform documentation on provider versioning or reach out if you need any assistance upgrading.
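To pick up the Redshift support, the provider can be pinned per the provider-versioning docs. A minimal sketch:

```hcl
# Constrain the AWS provider to v1.16 or later within the 1.x series
# so processing_configuration is available on the Redshift destination.
provider "aws" {
  version = "~> 1.16"
}
```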
I'm going to lock this issue because it has been closed for _30 days_ ⏳. This helps our maintainers find and focus on the active issues.
If you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. Thanks!