Databricks and Azure Files

Published 2020-04-23 08:16

I need to access Azure Files from Azure Databricks. According to the documentation, Azure Blobs are supported, but I need this code to work with Azure Files:

dbutils.fs.mount(
  source = "wasbs://<your-container-name>@<your-storage-account-name>.file.core.windows.net",
  mount_point = "/mnt/<mount-name>",
  extra_configs = {"<conf-key>":dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>")})

Or is there another way to mount/access Azure Files to/from an Azure Databricks cluster? Thanks

1 Answer
Evening l夕情丶
Answered 2020-04-23 08:39

On Azure, you can generally mount an Azure Files file share on Linux via the SMB protocol. I tried to follow the official tutorial "Use Azure Files with Linux" by creating a Python notebook and running the mount commands there, but it failed.

[Screenshot: the SMB mount commands failing when run in a Databricks notebook]
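For reference, these are the kind of commands the tutorial walks through; a minimal sketch with placeholder names (the storage account, share, and key below are not real). On a normal Linux VM they work, but inside a Databricks notebook (via `%sh`) the `mount` step is what fails, since the cluster's containers do not grant the privileges that `mount.cifs` requires.

```shell
# Install the CIFS utilities and create a mount point
# (placeholder names: mystorageacct, myshare, <storage-account-key>).
sudo apt-get update && sudo apt-get install -y cifs-utils
sudo mkdir -p /mnt/myfileshare

# Mount the Azure Files share over SMB 3.0 -- this is the step that
# fails on a Databricks cluster due to missing mount privileges.
sudo mount -t cifs //mystorageacct.file.core.windows.net/myshare /mnt/myfileshare \
    -o vers=3.0,username=mystorageacct,password=<storage-account-key>,dir_mode=0777,file_mode=0777,serverino
```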

It seems that Azure Databricks does not allow this. I also searched the Databricks community for mounting NFS, SMB, Samba, etc., and found no discussion of it.

So the only way to access files in Azure Files is to install the azure-storage package and use the Azure Files SDK for Python directly on Azure Databricks.
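As a minimal sketch of that approach, using the `FileService` client from the legacy azure-storage-file package (install it on the cluster with `pip install azure-storage-file`; the account, share, and file names below are placeholders, not values from the question):

```python
# Sketch: read Azure Files content from a Databricks notebook via the
# Azure Files SDK for Python, instead of mounting the share.
# All names in angle brackets are placeholders you must fill in.
from azure.storage.file import FileService


def download_from_share(account_name, account_key, share_name, file_name, local_path):
    """List a share's root directory and download one file to local disk."""
    file_service = FileService(account_name=account_name, account_key=account_key)

    # List directories and files at the root of the share.
    for item in file_service.list_directories_and_files(share_name):
        print(item.name)

    # Download a single file to the driver's local filesystem
    # (directory_name=None means the share root).
    file_service.get_file_to_path(share_name, None, file_name, local_path)


if __name__ == "__main__":
    download_from_share(
        "<storage-account-name>",
        "<storage-account-key>",
        "<share-name>",
        "<file-name>",
        "/tmp/<file-name>",
    )
```

The downloaded file lands on the driver node's local disk, so from there you can read it with ordinary Python or copy it into DBFS for cluster-wide access.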
