I am trying to create 3 data lake filesystems with Terraform, but I get a 403 error.
I am using an admin account with the Owner role. I also tried creating a service principal and assigning it the Storage Blob Data Reader role.
My code and the error are below.
Terraform v1.2.1 on windows_amd64
- provider registry.terraform.io/hashicorp/azuread v2.22.0
- provider registry.terraform.io/hashicorp/azurerm v3.7.0
resource "azurerm_storage_data_lake_gen2_filesystem" "stg-datalake" {
  for_each = toset(["bronze", "silver", "gold"])

  name               = each.value
  storage_account_id = azurerm_storage_account.stg-datalake.id

  ace {
    scope       = "access"
    type        = "user"
    id          = azurerm_data_factory.adf.identity[0].principal_id
    permissions = "rwx"
  }
}
Error: checking for existing File System "gold" (Account "stgaclientteste"): datalakestore.Client#GetProperties: Failure responding to request: StatusCode=403 -- Original Error: autorest/azure: error response cannot be parsed: {"" '\x00' '\x00'} error: EOF
Best answer
The problem persisted for months, so I used the workaround below. ADLS Gen2 filesystems are somewhat different from regular storage containers: the identity running Terraform needs the Storage Blob Data Owner data-plane role to create or update filesystems; the control-plane Owner role alone is not enough.
data "azurerm_client_config" "current" {}

# HACK: Role assignment is needed to apply adls gen2 filesystem changes
resource "azurerm_role_assignment" "role_assignment" {
  scope                = var.storage_account_id
  role_definition_name = "Storage Blob Data Owner"
  principal_id         = data.azurerm_client_config.current.object_id
}

# HACK: Sleep is needed to wait for role assignment to propagate
resource "time_sleep" "role_assignment_sleep" {
  create_duration = "60s"

  triggers = {
    role_assignment = azurerm_role_assignment.role_assignment.id
  }
}

resource "azurerm_storage_data_lake_gen2_filesystem" "filesystem" {
  name               = var.filesystem_name
  storage_account_id = var.storage_account_id

  depends_on = [time_sleep.role_assignment_sleep]
}
Regarding "Terraform - Azure Data Lake creation error Failure responding to request: StatusCode=403", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/72382379/