
Databricks

The document discusses Azure Databricks, including creating a workspace, clusters, and notebooks. It covers using Databricks to run Spark jobs and accessing the file system. Mounting external storage and managing secrets are also covered.

Uploaded by

vasu
Copyright © All Rights Reserved

Create an Azure Databricks workspace:

Workspace in Azure Databricks:


Workspace assets in Databricks:
Working with Databricks objects:
Create and run Spark in Databricks:
ClusterComputation Resource

NotebookScript

Create a cluster first:

select gender,avg(salary) from Empdata group by gender;
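The same aggregation can be sketched in plain Python for comparison. The rows below are made-up sample data, not the real Empdata table:

```python
# A minimal local sketch of the Spark SQL query above:
#   select gender, avg(salary) from Empdata group by gender
from collections import defaultdict

emp_data = [
    {"gender": "M", "salary": 50000},
    {"gender": "F", "salary": 60000},
    {"gender": "M", "salary": 70000},
]

totals = defaultdict(lambda: [0, 0])  # gender -> [sum of salaries, row count]
for row in emp_data:
    totals[row["gender"]][0] += row["salary"]
    totals[row["gender"]][1] += 1

avg_salary = {g: s / c for g, (s, c) in totals.items()}
print(avg_salary)  # {'M': 60000.0, 'F': 60000.0}
```

In a Databricks notebook the SQL version runs directly in a `%sql` cell once Empdata is registered as a table or view.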


Azure Databricks architecture overview:

Databricks File System (DBFS):



Databricks utilities (dbutils):


cp - copies a file from one location to another:

To see the data in a file:

To create a new folder, use 'mkdirs':

rm - removes a file from a directory.

ls - lists all the files:

mv - moves a file from one location to another; the moved file is deleted
from the original path.

put - adds content to a file:
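In a notebook these commands are `dbutils.fs.cp`, `mkdirs`, `rm`, `ls`, `mv`, and `put`. Their semantics can be sketched with local-filesystem analogues in plain Python (the paths below are a throwaway temp directory, not DBFS):

```python
# Local analogues of the DBFS commands above (put, mkdirs, cp, mv, ls, rm).
import os, shutil, tempfile

root = tempfile.mkdtemp()

# "put" - add content to a file
src = os.path.join(root, "source.txt")
with open(src, "w") as f:
    f.write("hello dbfs")

# "mkdirs" - create a new folder
new_dir = os.path.join(root, "backup")
os.makedirs(new_dir, exist_ok=True)

# "cp" - copy a file from one location to another
copied = os.path.join(new_dir, "copy.txt")
shutil.copy(src, copied)

# "mv" - move a file; it disappears from the original path
moved = os.path.join(new_dir, "moved.txt")
shutil.move(src, moved)

# "ls" - list all the files
print(sorted(os.listdir(new_dir)))  # ['copy.txt', 'moved.txt']

# "rm" - remove a file from a directory
os.remove(copied)
```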


dbutils.notebook.run - used to run a notebook from another notebook.


dbutils.notebook.run('/Users/[email protected]/Run command', 60)
Drop down:
A mount point is nothing but attaching your external storage to the Databricks file
system.
# Replace these variables with your actual values
storage_account_name = "your_storage_account_name"
container_name = "your_container_name"
blob_relative_path = "path_to_your_blob_file"
mount_point = "/mnt/blob_mount_point"
storage_account_access_key = "your_storage_account_access_key"

# Mount the Azure Blob Storage container to DBFS
dbutils.fs.mount(
    source = "wasbs://" + container_name + "@" + storage_account_name
             + ".blob.core.windows.net/" + blob_relative_path,
    mount_point = mount_point,
    extra_configs = {"fs.azure.account.key." + storage_account_name
                     + ".blob.core.windows.net": storage_account_access_key}
)

How to delete (unmount) a mount point:

# Define the mount point you want to unmount

mount_point = "/mnt/blob_mount_point"

# Unmount the mount point

dbutils.fs.unmount(mount_point)
Scopesecretshere storing actual secret values

CLI - command-line interface.

Keep your secrets inside a secret scope.
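The point of a secret scope is that credentials never appear in notebook code. A minimal local sketch of the same pattern uses environment variables; in a Databricks notebook the equivalent lookup is `dbutils.secrets.get(scope="...", key="...")`. The `DEMO_STORAGE_KEY` name and its value here are made-up for illustration:

```python
import os

# Stand-in for a real secret; in practice this would be set outside the code.
os.environ["DEMO_STORAGE_KEY"] = "dummy-value"

def get_secret(name):
    """Return a secret from the environment, failing loudly if it is missing."""
    value = os.environ.get(name)
    if value is None:
        raise KeyError(f"secret {name!r} is not set")
    return value

print(get_secret("DEMO_STORAGE_KEY"))  # dummy-value
```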

How to create a service principal:

Go to Azure Active Directory.

Create a password (client secret) for the service principal ID.


*********************ALL THE BEST************************
