# Create Azure Databricks Cluster - Azure Data Lake Storage Credential Passthrough

## Create Cluster w/ Azure Data Lake Storage Credential Passthrough

![](https://2555840674-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-MIIWE47MSPMOIxmLgUz%2F-MdGKuZ-zriRADaPpkEi%2F-MdGL-5SaZRHuVHEODP2%2F001-Azure%20Data%20Lake%20Storage%20Credential%20Passthrough.png?alt=media\&token=8311d127-558e-4b74-865c-f3af04d15dba)

## Test telnet connectivity to remote databases

![](https://2555840674-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-MIIWE47MSPMOIxmLgUz%2F-MdGKuZ-zriRADaPpkEi%2F-MdGL-5TeqDudb31xfLS%2F002-Azure%20Data%20Lake%20Storage%20Credential%20Passthrough.png?alt=media\&token=2911f4c5-be1d-4f35-9206-fc8cb90cad98)

> We can see that outbound network traffic has been blocked.
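The same connectivity check can also be run from a notebook cell without telnet; a minimal sketch using Python's standard `socket` module (the host and port below are placeholders, not from the original setup):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or blocked by the cluster firewall.
        return False

# Example: probe a hypothetical SQL Server endpoint on port 1433.
# can_connect("mydb.example.com", 1433)
```

On a credential-passthrough cluster with outbound traffic blocked, this returns `False` until the port is whitelisted as shown below.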

## How to allow outbound traffic on specific ports

![](https://2555840674-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-MIIWE47MSPMOIxmLgUz%2F-MdGM7Xsh-j43Q5m8KNx%2F-MdGMh5qz04EbQOsrMpj%2F003-Azure%20Data%20Lake%20Storage%20Credential%20Passthrough.png?alt=media\&token=bfa1a354-859b-45d6-8ca3-8095f99d4a75)

> Add the ***static*** config parameter below to the Spark cluster's configuration.

```
spark.databricks.pyspark.iptable.outbound.whitelisted.ports Port[,Ports][,PortBegin:PortEnd]
```
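For example, to whitelist a single port, multiple ports, and a port range in one value (the port numbers here are illustrative, not from the original cluster):

```
spark.databricks.pyspark.iptable.outbound.whitelisted.ports 1433,3306,10000:10010
```

Entries are comma-separated, and a range is written as `PortBegin:PortEnd`.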

> If it is set as a dynamic parameter instead, the following error is raised:

![](https://2555840674-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-MIIWE47MSPMOIxmLgUz%2F-MdGNi-Hp9nFbJydniTq%2F-MdGOGL9eSG6XzOl9iwO%2F004-Azure%20Data%20Lake%20Storage%20Credential%20Passthrough.png?alt=media\&token=a5a1681d-9036-439f-abe2-cfff7152b882)

## Appendix

{% embed url="<https://docs.microsoft.com/en-us/azure/databricks/security/data-governance#credential-passthrough>" %}
