AVR posted:
3 years ago
We have different ways of accessing Data Lake Storage:
A) Mount ADLS containers to DBFS using a service principal and OAuth 2.0
B) Use storage account access keys directly
Both approaches are sketched in the example below.
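As an illustration, here is a minimal PySpark sketch of both approaches as they would appear in a Databricks notebook (where `spark` and `dbutils` are predefined). The secret scope name `kv-scope`, the secret keys, and the storage account, container, application (client) ID, and tenant ID are placeholders, not values from this post.

```python
# --- A) Mount an ADLS Gen2 container to DBFS with a service principal (OAuth 2.0) ---
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-client-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="kv-scope", key="app-client-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/datalake",
    extra_configs=configs,
)

# --- B) Use a storage account access key directly (no mount) ---
spark.conf.set(
    "fs.azure.account.key.<storage-account>.dfs.core.windows.net",
    dbutils.secrets.get(scope="kv-scope", key="storage-account-key"),
)
df = spark.read.parquet("abfss://<container>@<storage-account>.dfs.core.windows.net/raw/")
```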

We should also be aware of the following (the file read/write operations are sketched after this list):
Creating an App Registration
Creating a Key Vault service
Creating an Azure Databricks workspace and spinning up a Databricks cluster
Creating a Databricks secret scope for connectivity
Creating a mount point
Reading files in different formats
Reading multiple files
Writing a DataFrame as a CSV file
Writing a DataFrame as a Parquet file
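A short sketch of the read/write items above, assuming the container has already been mounted at /mnt/datalake (the folder and file names are placeholders):

```python
# Read a single CSV file with a header row, letting Spark infer the schema
df_csv = (spark.read.format("csv")
          .option("header", "true")
          .option("inferSchema", "true")
          .load("/mnt/datalake/raw/sales.csv"))

# Read a file in a different format (Parquet carries its own schema)
df_parquet = spark.read.format("parquet").load("/mnt/datalake/raw/sales.parquet")

# Read multiple files at once using a wildcard path
df_all = (spark.read.format("csv")
          .option("header", "true")
          .load("/mnt/datalake/raw/2021/*.csv"))

# Write a DataFrame back as CSV
(df_all.write.mode("overwrite")
 .option("header", "true")
 .csv("/mnt/datalake/processed/sales_csv"))

# Write a DataFrame back as Parquet
df_all.write.mode("overwrite").parquet("/mnt/datalake/processed/sales_parquet")
```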



Assume that we have data in a data source
Considering that we have uploaded the data to Azure Data Lake
If we have to access the data in Azure Data Lake, these are the steps we need to follow (see the sketch after this list):
1) Read the data from the source
2) Apply transformations as per the business needs
3) Write the data back to the target/sink (e.g., a SQL database)
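An end-to-end sketch of these three steps, assuming the mount point from earlier and an Azure SQL Database as the sink; the column names, table name, server name, and secret keys are hypothetical and only for illustration:

```python
from pyspark.sql import functions as F

# 1) Read the data from the source (the mounted Data Lake folder)
orders = spark.read.format("parquet").load("/mnt/datalake/raw/orders")

# 2) Apply transformations as per the business needs
orders_clean = (orders
                .filter(F.col("order_status") == "COMPLETED")
                .withColumn("order_year", F.year("order_date")))

# 3) Write the data back to the target/sink (Azure SQL Database over JDBC)
jdbc_url = "jdbc:sqlserver://<server>.database.windows.net:1433;database=<database>"
(orders_clean.write
 .format("jdbc")
 .option("url", jdbc_url)
 .option("dbtable", "dbo.orders_clean")
 .option("user", dbutils.secrets.get(scope="kv-scope", key="sql-user"))
 .option("password", dbutils.secrets.get(scope="kv-scope", key="sql-password"))
 .mode("append")
 .save())
```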

How do we read the data in the Data Lake?
We need to register a new app in App Registration; after registering, we get credentials.
We cannot access Azure Data Lake with the app credentials directly.
We need to grant the app access to Azure Data Lake (for example, through a role assignment on the storage account) so the two can communicate.
We also have Key Vault to store all sensitive details, such as secrets.
We store all the app credentials as secrets in Key Vault.
Databricks connects to Key Vault through a secret scope; from there, both read and write operations can happen easily (see the sketch below).
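For illustration, retrieving those app credentials in a notebook through a Key Vault-backed secret scope might look like this; the scope and secret names are placeholders:

```python
# List the secret scopes configured in the workspace
dbutils.secrets.listScopes()

# Pull the App Registration credentials stored as Key Vault secrets
client_id     = dbutils.secrets.get(scope="kv-scope", key="app-client-id")
client_secret = dbutils.secrets.get(scope="kv-scope", key="app-client-secret")
tenant_id     = dbutils.secrets.get(scope="kv-scope", key="tenant-id")

# These values feed the OAuth mount configuration shown earlier,
# so the raw secrets never appear in the notebook itself.
```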
Posted in: Azure | ID: Q68 |
August 28, 2021, 09:46 AM | 1 Replies