Terraform-provider-azurerm: New Resource request: Azure Data Factory

Created on 3 Aug 2018 · 8 comments · Source: terraform-providers/terraform-provider-azurerm

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Description

Add a new Terraform resource to create an Azure Data Factory.

New or Affected Resource(s)

  • new: azurerm_data_factory

Potential Terraform Configuration

resource "azurerm_data_factory" "data_factory" {
  name                = "df1"
  resource_group_name = "rg1"
  location            = "eastus"
  version             = "v2"

  tags = {
    desc = "I am a little data factory"
  }
}

References

It appears that the Azure Go SDK already has support for Data Factory.

Labels: new-resource, service/data-factory


All 8 comments

The workflows within a data factory are really part of the configuration as well. While I would really like to see a data factory resource, it's of limited value to me without being able to configure the ingest pipelines within it. So if this task is done, it would be really helpful to also have resources for the input and output datasets, as well as for the activity types.
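To illustrate what the comment above is asking for, companion resources for datasets and pipelines might look something like the following. This is purely a hypothetical sketch: neither these resource types nor their arguments exist in the provider at the time of this issue.

```hcl
# Hypothetical sketch only -- azurerm_data_factory_dataset and
# azurerm_data_factory_pipeline are illustrative names, not real
# provider resources at the time of this request.
resource "azurerm_data_factory_dataset" "input" {
  name                = "input-dataset"
  data_factory_name   = "df1"
  resource_group_name = "rg1"
}

resource "azurerm_data_factory_pipeline" "ingest" {
  name                = "ingest-pipeline"
  data_factory_name   = "df1"
  resource_group_name = "rg1"
}
```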

Any idea when this might be picked up by someone?

Need this ASAP!
๐Ÿ™๐Ÿ™๐Ÿ™๐Ÿ™๐Ÿ™

Any update on whether this will be implemented soon, or whether it's being looked at at all?

Another use case: I'm looking to migrate Data Lake Gen1 to Gen2, and the recommended way seems to be using Data Factory. I'd like to be able to spin up and configure the data factory via Terraform so the factories can be easily cleaned up afterwards (they will be short-lived, just for the migrations), and so the process is easily repeatable for the many different Data Lake stores in scope over time.

Closing this in favor of #3159. Thanks @hbuckle!

re-opening this (it should have been the other PR that got closed 😄)

I'm going to lock this issue because it has been closed for _30 days_ ⏳. This helps our maintainers find and focus on the active issues.

If you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. If you feel I made an error 🤖 🙉, please reach out to my human friends 👉 [email protected]. Thanks!
