Building on #2427, please add support for configuring a start task for a pool.
I'm a Terraform rookie, so take this suggestion with a pound of salt.
```hcl
# Configure the Microsoft Azure Provider
provider "azurerm" {
  # If you're using a Service Principal (shared account), either set the
  # environment variables or fill these in:
  # subscription_id = "..."
  # client_id       = "..."
  # client_secret   = "..."
  # tenant_id       = "..."
}

resource "azurerm_resource_group" "rg" {
  name     = "${var.resource_group_name}"
  location = "${var.location}"
}

resource "random_integer" "ri" {
  min = 10000
  max = 99999
}

resource "azurerm_storage_account" "stor" {
  name                     = "stor${random_integer.ri.result}"
  resource_group_name      = "${azurerm_resource_group.rg.name}"
  location                 = "${azurerm_resource_group.rg.location}"
  account_tier             = "${var.storage_account_tier}"
  account_replication_type = "${var.storage_replication_type}"
}

resource "azurerm_batch_account" "batch" {
  name                 = "batch${random_integer.ri.result}"
  resource_group_name  = "${azurerm_resource_group.rg.name}"
  location             = "${azurerm_resource_group.rg.location}"
  storage_account_name = "${azurerm_storage_account.stor.name}"
}

# Proposed data source (does not exist in the provider today) sketching one
# way an elevated auto-user identity could be expressed.
data "azurerm_batch_auto_user" "pool_admin" {
  scope    = "pool" # or "job"
  elevated = true
}

resource "azurerm_batch_pool" "pool" {
  name                  = "pool${random_integer.ri.result}"
  vm_size               = "${var.batch_pool_nodes_vm_size}"
  target_dedicated_node = "${var.batch_pool_nodes_count}"
  vm_image              = "${var.batch_pool_nodes_vm_image}"
  node_agent_sku_id     = "${var.batch_pool_nodes_agent_sku_id}"

  start_task {
    command_line         = "/bin/bash -c 'apt-get -y update && apt-get -y install cowsay && cowsay Hello, $person!'"
    user_identity        = "${data.azurerm_batch_auto_user.pool_admin}"
    max_task_retry_count = 1

    environment {
      person = "World"
    }
  }
}
```
@MHHenriksen @katbyte - will look at this now that the Batch pool feature (#2461) is implemented.
@MHHenriksen - it looks like @jcorioland has implemented the start_task: https://www.terraform.io/docs/providers/azurerm/d/batch_pool.html#start_task
Does that do what you need?
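In case it helps anyone landing here later, a minimal sketch of the implemented `start_task` usage against the `azurerm_batch_pool` resource (the pool name and values below are placeholders; attribute names follow the provider docs at the time of writing and may differ in later releases):

```hcl
resource "azurerm_batch_pool" "example" {
  name                = "examplepool"
  resource_group_name = "${azurerm_resource_group.rg.name}"
  account_name        = "${azurerm_batch_account.batch.name}"
  vm_size             = "Standard_A1"
  node_agent_sku_id   = "batch.node.ubuntu 16.04"

  fixed_scale {
    target_dedicated_nodes = 1
  }

  storage_image_reference {
    publisher = "Canonical"
    offer     = "UbuntuServer"
    sku       = "16.04.0-LTS"
    version   = "latest"
  }

  start_task {
    command_line         = "/bin/bash -c 'cowsay Hello, $person!'"
    max_task_retry_count = 1
    wait_for_success     = true

    environment = {
      person = "World"
    }

    # The elevated auto-user from the original suggestion maps onto the
    # user_identity/auto_user block rather than a separate data source.
    user_identity {
      auto_user {
        elevation_level = "Admin"
        scope           = "Pool"
      }
    }
  }
}
```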
Yeah, looks good! We'll let you know if it breaks via new issues.
I'm going to lock this issue because it has been closed for _30 days_ ⏳. This helps our maintainers find and focus on the active issues.
If you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. If you feel I made an error 🤖 🙉 , please reach out to my human friends 👉 [email protected]. Thanks!