Use secrets in Databricks
List the secret scopes available to the workspace and the secrets stored in each scope:

```python
# Show the built-in help for the secrets utility.
dbutils.secrets.help()

# List all secret scopes configured for this workspace.
dbutils.secrets.listScopes()

# List the secret names (values are never shown) in each Key Vault-backed scope.
dbutils.secrets.list(scope="ADBDataLakeKeyVault")
dbutils.secrets.list(scope="DataLakeKeyVault")
```
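Reading an individual secret works the same way. As a minimal sketch, reusing the scope and key names that appear later on this page (assumed to exist in your Key Vault-backed scope):

```python
# Fetch one secret value; the scope and key names are the ones used on this
# page and are assumed to exist in your Key Vault-backed scope.
account_key = dbutils.secrets.get(scope="DataLakeKeyVault", key="ADSL-AccountKey")

# Databricks redacts secret values in notebook output, so printing the
# variable shows "[REDACTED]" rather than the actual key.
print(account_key)
```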
Use the secrets to configure access to Azure Data Lake Storage, then read a file from the raw-data container:

```python
# Retrieve a single secret value (the storage account key) from a scope.
dbutils.secrets.get(scope="ADBDataLakeKeyVault", key="ADSL-AccountKey")

# Register the storage account key with the Spark session so abfss:// paths
# on this account can be read directly.
spark.conf.set(
    "fs.azure.account.key." + dbutils.secrets.get(scope="DataLakeKeyVault", key="ADSL-AccountName") + ".dfs.core.windows.net",
    dbutils.secrets.get(scope="DataLakeKeyVault", key="ADSL-AccountKey"))

# Build the abfss root path of the raw-data container from secrets.
filePath = "abfss://" + dbutils.secrets.get(scope="DataLakeKeyVault", key="ADSL-ContainerName-RawData") \
    + "@" + dbutils.secrets.get(scope="DataLakeKeyVault", key="ADSL-AccountName") + ".dfs.core.windows.net/"

# Verify access by listing the container contents.
dbutils.fs.ls(filePath)

# Read Customer.csv into a DataFrame and expose it as a temp view for SQL.
dfCustomer = spark.read.format("csv") \
    .options(header='true', inferSchema='true') \
    .load(filePath + "Customer.csv")

dfCustomer.createOrReplaceTempView("customer")
```

Query the temp view from a SQL cell:

```sql
%sql
select * from customer
```
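As a minimal sketch, the secret lookups and Spark configuration above can be wrapped in one helper so the scope name and secret keys live in a single place. The scope and key names ("DataLakeKeyVault", "ADSL-AccountName", "ADSL-AccountKey", "ADSL-ContainerName-RawData") are the ones used on this page and are assumptions about how your Key Vault secrets are named.

```python
# A minimal sketch consolidating the cells above; scope and key names are
# assumed to match your Key Vault-backed secret scope.
def configure_raw_data_path(scope="DataLakeKeyVault"):
    account_name = dbutils.secrets.get(scope=scope, key="ADSL-AccountName")
    account_key = dbutils.secrets.get(scope=scope, key="ADSL-AccountKey")
    container = dbutils.secrets.get(scope=scope, key="ADSL-ContainerName-RawData")

    # Register the storage account key with the current Spark session.
    spark.conf.set(
        f"fs.azure.account.key.{account_name}.dfs.core.windows.net",
        account_key)

    # Return the abfss:// root of the raw-data container.
    return f"abfss://{container}@{account_name}.dfs.core.windows.net/"


rawPath = configure_raw_data_path()
dfCustomer = spark.read.format("csv") \
    .options(header='true', inferSchema='true') \
    .load(rawPath + "Customer.csv")
```

Because the account name, key, and container name all come from the secret scope, nothing sensitive is hard-coded in the notebook, and rotating the key in Azure Key Vault requires no notebook changes.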