Azure-docs: 403 StatusDescription=This request is not authorized to perform this operation using this permission.

Created on 19 Feb 2019 · 14 comments · Source: MicrosoftDocs/azure-docs

Hi All,

I'm getting the following error after following https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-quickstart-create-databricks-account.
The problem is very much the same as https://github.com/MicrosoftDocs/azure-docs/issues/24686, but there is no detail on how to resolve it.

HEAD https://megdpdatalakev2.dfs.core.windows.net/megdp?resource=filesystem&timeout=90

StatusCode=403
StatusDescription=This request is not authorized to perform this operation using this permission.
ErrorCode=
ErrorMessage=
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:133)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.services.AbfsClient.getFilesystemProperties(AbfsClient.java:197)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.getFilesystemProperties(AzureBlobFileSystemStore.java:214)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.fileSystemExists(AzureBlobFileSystem.java:749)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.initialize(AzureBlobFileSystem.java:110)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2669)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
at com.databricks.backend.daemon.dbutils.FSUtils$.getFS(DBUtilsCore.scala:248)
at com.databricks.backend.daemon.dbutils.FSUtils$$anonfun$ls$1.apply(DBUtilsCore.scala:83)
at com.databricks.backend.daemon.dbutils.FSUtils$$anonfun$ls$1.apply(DBUtilsCore.scala:82)
at com.databricks.backend.daemon.dbutils.FSUtils$.com$databricks$backend$daemon$dbutils$FSUtils$$withFsSafetyCheck(DBUtilsCore.scala:78)
at com.databricks.backend.daemon.dbutils.FSUtils$.ls(DBUtilsCore.scala:82)
at com.databricks.dbutils_v1.impl.DbfsUtilsImpl.ls(DbfsUtilsImpl.scala:33)
at linefe65746de81f4c338da4d9e298c52af125.$read$$iw$$iw$$iw$$iw$$iw$$iw.(command-1728851972204306:7)
at linefe65746de81f4c338da4d9e298c52af125.$read$$iw$$iw$$iw$$iw$$iw.(command-1728851972204306:60)
at linefe65746de81f4c338da4d9e298c52af125.$read$$iw$$iw$$iw$$iw.(command-1728851972204306:62)
at linefe65746de81f4c338da4d9e298c52af125.$read$$iw$$iw$$iw.(command-1728851972204306:64)
at linefe65746de81f4c338da4d9e298c52af125.$read$$iw$$iw.(command-1728851972204306:66)
at linefe65746de81f4c338da4d9e298c52af125.$read$$iw.(command-1728851972204306:68)
at linefe65746de81f4c338da4d9e298c52af125.$read.(command-1728851972204306:70)
at linefe65746de81f4c338da4d9e298c52af125.$read$.(command-1728851972204306:74)
at linefe65746de81f4c338da4d9e298c52af125.$read$.(command-1728851972204306)
at linefe65746de81f4c338da4d9e298c52af125.$eval$.$print$lzycompute(:7)
at linefe65746de81f4c338da4d9e298c52af125.$eval$.$print(:6)
at linefe65746de81f4c338da4d9e298c52af125.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:199)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:189)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:189)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:189)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:534)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:489)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:189)
at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$5.apply(DriverLocal.scala:273)
at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$5.apply(DriverLocal.scala:253)
at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:235)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:230)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:42)
at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:268)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:42)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:253)
at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:589)
at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:589)
at scala.util.Try$.apply(Try.scala:192)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:584)
at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:475)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:542)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:381)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:328)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:215)
at java.lang.Thread.run(Thread.java:748)



All 14 comments

@chantong47 Thanks for the feedback! We are currently investigating and will update you shortly.

@chantong47 Recently, I received the same error message. Once I provided the required permission, it started working as expected.

Make sure to assign the Storage Blob Data Contributor role to the service principal at the scope of the Data Lake Storage Gen2 storage account.

Hope this helps.
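For anyone scripting that assignment: a minimal sketch with the Azure SDK for Python, assuming the azure-identity and azure-mgmt-authorization packages are installed. All IDs are placeholders, and the exact parameter shape can vary slightly between SDK versions.

import uuid
from azure.identity import AzureCliCredential
from azure.mgmt.authorization import AuthorizationManagementClient

sub = "<subscription-id>"
# Assign directly at the storage-account scope so it takes effect right away.
scope = ("/subscriptions/" + sub + "/resourceGroups/<resource-group>"
         "/providers/Microsoft.Storage/storageAccounts/<storage-account>")
# GUID of the built-in "Storage Blob Data Contributor" role definition.
role_def = ("/subscriptions/" + sub + "/providers/Microsoft.Authorization/"
            "roleDefinitions/ba92f5b4-2d11-453d-a403-e96b0029c9fe")

client = AuthorizationManagementClient(AzureCliCredential(), sub)
client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # a new role assignment name must be a GUID
    {
        "role_definition_id": role_def,
        # Use the service principal's *object* ID, not its application ID.
        "principal_id": "<service-principal-object-id>",
    },
)

The same assignment can also be made in the portal under Access control (IAM) on the storage account.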

Yes, this should resolve the issue. Note that you can assign a role to the parent subscription or resource group, but it takes time for the assignment to propagate down to the storage account. If you assign the role to the subscription or resource group and then create the storage account afterwards, the assignment should be inherited immediately. In terms of completing the tutorial, if you already have a storage account that you're using, it's best to assign the role directly to the storage account so that you aren't blocked and don't have to wait.
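A quick way to confirm the assignment has propagated is a direct listing from a notebook, which should stop returning 403. A sketch with placeholder names, applying the quickstart's per-account OAuth settings to the Spark session:

# ABFS OAuth options for this account, set the same way as in the quickstart;
# <account>, <tenant-id>, <client-id>, and <client-secret> are placeholders.
spark.conf.set("fs.azure.account.auth.type.<account>.dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.<account>.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.<account>.dfs.core.windows.net", "<client-id>")
spark.conf.set("fs.azure.account.oauth2.client.secret.<account>.dfs.core.windows.net", "<client-secret>")
spark.conf.set("fs.azure.account.oauth2.client.endpoint.<account>.dfs.core.windows.net",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

# Lists the filesystem root once the role assignment is effective.
dbutils.fs.ls("abfss://<filesystem>@<account>.dfs.core.windows.net/")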

@chantong47 We will now proceed to close this thread. If there are further questions regarding this matter, please comment and we will gladly continue the discussion.

Hi all,

I have the same problem when trying to mount ADLS Gen2 to DBFS with this config (the full mount call is sketched after the stack trace below):

configs = {"fs.azure.account.auth.type": "OAuth",
"fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
"fs.azure.account.oauth2.client.id": clientID,
"fs.azure.account.oauth2.client.secret": keyID,
"fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/" + tenantID + "/oauth2/token"}

Can someone help me, please?

ExecutionError: An error occurred while calling o296.mount.
: HEAD https://...?action=getAccessControl&timeout=90
StatusCode=403
StatusDescription=This request is not authorized to perform this operation.
ErrorCode=
ErrorMessage=
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:134)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.services.AbfsClient.getAclStatus(AbfsClient.java:498)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.getIsNamespaceEnabled(AzureBlobFileSystemStore.java:164)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.getFileStatus(AzureBlobFileSystemStore.java:445)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.getFileStatus(AzureBlobFileSystem.java:362)
at com.databricks.backend.daemon.dbutils.DBUtilsCore.verifyAzureFileSystem(DBUtilsCore.scala:486)
at com.databricks.backend.daemon.dbutils.DBUtilsCore.mount(DBUtilsCore.scala:435)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:380)
at py4j.Gateway.invoke(Gateway.java:295)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:251)
at java.lang.Thread.run(Thread.java:748)
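For context, a config like the one above is what gets passed to dbutils.fs.mount. A minimal sketch with placeholder names (ideally keyID would come from dbutils.secrets.get rather than a plain variable):

# Mount call the config above feeds into; <filesystem>, <account>, and
# <mount-name> are placeholders.
dbutils.fs.mount(
    source="abfss://<filesystem>@<account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs)

The 403 on action=getAccessControl during the mount points to the same cause discussed above: the service principal behind clientID is missing a data-plane role (Storage Blob Data Contributor or Owner) on the storage account.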

I am facing the same issue on the Databricks 5.5 runtime as well, when trying to access files after mounting the ADLS Gen2 container.

ExecutionError: An error occurred while calling z:com.databricks.backend.daemon.dbutils.FSUtils.ls.
: GET https://xxxxxx.dfs.core.windows.net/bac?resource=filesystem&maxResults=5000&directory=sales&timeout=90&recursive=false
StatusCode=403
StatusDescription=This request is not authorized to perform this operation using this permission.
ErrorCode=AuthorizationPermissionMismatch
ErrorMessage=This request is not authorized to perform this operation using this permission.

dp80, were you able to solve the problem?

Do we have an answer to this question? I've gotten the same error.
Permissions have been granted on the container side of the data lake, role-based permissions have been assigned (Contributor, Storage Blob Data Owner), and AD authentication is being used. I'm still getting the access-control error: 403 "This request is not authorized to perform this operation using this permission".

Is there any way to debug this?
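One way to debug this is to enumerate the role assignments at the storage-account scope and confirm the service principal's object ID shows up with a Storage Blob Data role. A sketch with the Azure SDK for Python; all IDs are placeholders:

from azure.identity import AzureCliCredential
from azure.mgmt.authorization import AuthorizationManagementClient

sub = "<subscription-id>"
scope = ("/subscriptions/" + sub + "/resourceGroups/<resource-group>"
         "/providers/Microsoft.Storage/storageAccounts/<storage-account>")
client = AuthorizationManagementClient(AzureCliCredential(), sub)

# Print every assignment visible at this scope, including inherited ones.
# The service principal's *object* ID should appear alongside a
# Storage Blob Data role definition.
for a in client.role_assignments.list_for_scope(scope):
    print(a.principal_id, a.role_definition_id)

Also double-check that the role was granted to the service principal's object ID rather than its application (client) ID, and allow a few minutes for a fresh assignment to propagate.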

I am facing exactly this issue after mounting my data lake in Databricks.

I have the same issue too.

I was also getting the error "This request is not authorized to perform this operation using this permission" in the "Create storage account container" step, with Storage Blob Data Contributor assigned at the subscription level. I added Storage Blob Data Owner on the storage account to the service principal at the storage-account level. Then the script worked.

Storage Blob Data Owner at the storage account level works.
Subscription-level IAM is not needed.
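For completeness, scripting that fix differs from the earlier role-assignment sketch only in the role definition GUID; Storage Blob Data Owner is the built-in role b7e6dc6d-f1e8-4753-8033-0f276bb0955b:

# Same create() call as the earlier sketch, swapping in the
# "Storage Blob Data Owner" built-in role definition.
role_def = ("/subscriptions/" + sub + "/providers/Microsoft.Authorization/"
            "roleDefinitions/b7e6dc6d-f1e8-4753-8033-0f276bb0955b")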
