Have you run into this error in Databricks?

"PermissionError: [Errno 13] Permission denied: '/dbfs/mnt/youradlsmountname'"

Problem

If a pipeline failure or monitoring alert surfaces this message, the cause is most likely an expired ADLS SAS key.

It is crucial to monitor your keys regularly, or to store their expiry dates in a system that alerts you as expiration approaches. This proactive approach ensures timely action and prevents disruptions caused by key expiry.
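One way to automate that check is to parse the `se` (signed expiry) field that every SAS token carries. This is a minimal sketch: the helper name and the sample token below are illustrative assumptions, not part of any Azure SDK.

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs

def sas_days_until_expiry(sas_token: str) -> int:
    """Return days until the SAS token's 'se' (signed expiry) timestamp."""
    params = parse_qs(sas_token.lstrip("?"))
    expiry = datetime.fromisoformat(params["se"][0].replace("Z", "+00:00"))
    return (expiry - datetime.now(timezone.utc)).days

# Hypothetical token fragment; real tokens carry more fields (sv, sp, sig, ...)
token = "sv=2022-11-02&ss=b&se=2030-01-01T00:00:00Z&sig=abc"
print(sas_days_until_expiry(token))
```

A scheduled job could run a check like this and raise an alert when the returned value drops below your chosen threshold.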

Trying to List Your Mount with DBUTILS

If you run the following Python code, you may also receive this error: "com.databricks.backend.daemon.data.common.InvalidMountException: Error while using path /mnt/youradlsmountname for resolving path '/' within mount at '/mnt/youradlsmountname'."

Code sample by Cloudaen
dbutils.fs.ls("/mnt/youradlsmountname")

Solution

Unmount your Mount

1. Start by removing your mount. Don't worry: your files are safe, as they are stored in ADLS, not in Azure Databricks. Run the following command:

Code sample by Cloudaen
dbutils.fs.unmount("/mnt/youradlsmountname")

2. Next, restart the cluster you are attached to.
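Step 1 can be wrapped in a small guard so that unmounting a path that is not currently mounted does not raise an error. This is a sketch: `safe_unmount` is a hypothetical helper name, and in a real notebook `dbutils` is already available as a built-in global rather than a parameter.

```python
def safe_unmount(dbutils, mount_point):
    """Unmount mount_point only if it is currently mounted; return True if unmounted."""
    if any(m.mountPoint == mount_point for m in dbutils.fs.mounts()):
        dbutils.fs.unmount(mount_point)
        return True
    return False

# In a Databricks notebook:
# safe_unmount(dbutils, "/mnt/youradlsmountname")
```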

Generate SAS Token

3. With the mount removed, generate a new SAS key in Azure Data Lake Storage (ADLS):

a. In the ADLS storage account, go to Security + networking > Shared access signature.

b. Select your Allowed resource types, click Generate SAS and connection string, and save the Blob SAS token.

4. Take note of your ADLS storage account name, as it will be required.

5. Take note of your ADLS container name, as it will also be required: go to Containers and record the name of the container you are trying to mount.

Get your Access Key

6. You will also need your access key: go to Access keys, click Show, and copy the key to the clipboard.

7. Now, open a new Python notebook in Azure Databricks.

Remount your ADLS with new SAS Token

8. Use the following code, filling in your access key, SAS token, storage account name, and container name:

Code sample by Microsoft
storageAccountName = "nameOfYourStorageAccount"
storageAccountAccessKey = "AccessKey"
sasToken = "BlobSAStoken"
blobContainerName = "yourContainerName"
mountPoint = "/mnt/youradlsmountname"  # no trailing slash, so it matches dbutils.fs.mounts()

# Only mount if the mount point is not already in use
if not any(mount.mountPoint == mountPoint for mount in dbutils.fs.mounts()):
  try:
    dbutils.fs.mount(
      source = "wasbs://{}@{}.blob.core.windows.net".format(blobContainerName, storageAccountName),
      mount_point = mountPoint,
      extra_configs = {'fs.azure.sas.' + blobContainerName + '.' + storageAccountName + '.blob.core.windows.net': sasToken}
    )
    print("mount succeeded!")
  except Exception as e:
    print("mount exception", e)

Once you run the above code, you should see "mount succeeded!" and be able to access the files in your mount from Azure Data Lake Storage (ADLS).
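For reference, the `source` URL and the `extra_configs` key in the snippet above follow a fixed pattern built from the container and account names. This sketch, using the same placeholder names, shows the resulting strings:

```python
blobContainerName = "yourContainerName"
storageAccountName = "nameOfYourStorageAccount"

# wasbs source URL the mount points at: container@account
source = "wasbs://{}@{}.blob.core.windows.net".format(blobContainerName, storageAccountName)

# Spark config key that carries the SAS token for this container/account pair
configKey = "fs.azure.sas.{}.{}.blob.core.windows.net".format(blobContainerName, storageAccountName)

print(source)
print(configKey)
```

If the mount still fails, comparing these strings against your actual container and account names is a quick way to spot a typo.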

Summary

At times, inheriting an infrastructure or codebase can be frustrating, especially without proper documentation, and errors can be vague, providing little insight into their origin. This article aims to help you resolve the "Errno 13 Permission denied" error by guiding you through remounting your ADLS in Azure Databricks, restoring access to your data lake storage.

Related Documentation

Generate SAS Token

Mounting cloud object storage on Databricks

Mount ADLS Gen2 or Blob Storage in Azure Databricks