Trying to write data into SQL DW through a Databricks streaming DataFrame. The process is trying to delete the temp folder in Blob storage and throws the error below. The documentation says the process will not automatically clean up tempDir. Is that true? If so, why does this error occur? Using the query below in Python:
(df1.writeStream
    .format("com.databricks.spark.sqldw")
    .option("url", sqlDwUrlSmall)
    .option("tempDir", tempDir)
    .option("forwardSparkAzureStorageCredentials", "true")
    .option("dbTable", "SampleTable")
    .option("checkpointLocation", "/tmp_checkpoint_location1")
    .option("numStreamingTempDirsToKeep", -1)  # -1 should disable periodic temp-dir cleanup
    .start())
ERROR AzureNativeFileSystemStore: Encountered Storage Exception for delete on Blob: https://savupputest1.blob.core.windows.net/container1/tempDirs/2019-12-20/21-27-29-347/adca2ed6-a705-4274-8c24-0f0e3d7c64a7/batch0, Exception Details: This operation is not permitted on a non-empty directory. Error Code: DirectoryIsNotEmpty
19/12/20 21:27:32 ERROR AzureNativeFileSystemStore: Failed while attempting to delete key tempDirs/2019-12-20/21-27-29-347/adca2ed6-a705-4274-8c24-0f0e3d7c64a7/batch0
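For context, forwardSparkAzureStorageCredentials only works if the storage account key is already set in the session config before the stream starts. A minimal sketch of that prerequisite, using the account name from the error above (the secret scope and key names are assumptions):

# Make the Blob storage account key visible to the connector.
# "savupputest1" is the account from the error log; the secret scope
# and key names below are placeholders -- substitute your own.
storage_key = dbutils.secrets.get(scope="my-scope", key="savupputest1-key")
spark.conf.set(
    "fs.azure.account.key.savupputest1.blob.core.windows.net",
    storage_key)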
Duplicate of #45161
Hello, I just followed the documentation and am having issues. Do you have any other working example?
On Mon, Dec 23, 2019 at 9:23 PM, CHEEKATLAPRADEEP-MSFT wrote:
Closed #45158 (https://github.com/MicrosoftDocs/azure-docs/issues/45158).
@sandeep8530 You may check out the answer on the MSDN forum: https://social.msdn.microsoft.com/Forums/en-US/e8e6ab46-fdd0-4916-9f3c-2019e2993c3a/issue-while-writing-data-from-databricks-to-azure-dw-synapse?forum=AzureDatabricks
Hope this helps.
Whoever comes across this error and doesn't know how to solve it:
Make sure not to use Data Lake Storage Gen2 for the tempDir; use a regular Blob storage account instead.
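For example, a minimal sketch of pointing tempDir at a regular (non-hierarchical-namespace) Blob storage account via a wasbs:// URI; the account and container names here are made up:

# Hypothetical names -- replace with your own regular (non-ADLS Gen2)
# general-purpose storage account and container.
blob_account = "myregularblobacct"
blob_container = "tempdata"

# wasbs:// targets the Blob endpoint; the connector stages data here.
# (The account key must still be set in spark.conf as shown earlier in the thread.)
tempDir = f"wasbs://{blob_container}@{blob_account}.blob.core.windows.net/tempDirs"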
Thanks a lot for mentioning this. Saved me a bunch of time! And yes, this works.