Course Duration: 1 Day

Microsoft Authorized Training

Course cost: £990.00

IT Certification Overview

Implement a Data Analytics Solution with Azure Databricks is a 1-day intermediate course designed for data professionals who want to strengthen their skills in distributed data processing with Spark and Databricks on Azure. By the end of the course, you'll have built solid intermediate-to-advanced skills in both Databricks and Spark on Azure. You'll be able to ingest, transform, and analyse large-scale datasets using Spark DataFrames, Spark SQL, and PySpark, giving you confidence in working with distributed data processing. Within Databricks, you'll know how to navigate the workspace, manage clusters, and build and maintain Delta tables.

You'll also be able to design and run ETL pipelines, optimize Delta tables, manage schema changes, and apply data quality rules. In addition, you'll learn how to orchestrate workloads with Lakeflow Jobs and pipelines, enabling you to move from exploration to automated workflows. Finally, you'll gain familiarity with governance and security features, including Unity Catalog, Purview integration, and access management, preparing you to operate effectively in production-ready data environments.

Prerequisites

Participants should have:

  • Working knowledge of Python and SQL fundamentals and syntax, including Python scripting and SQL filter, aggregate, and join queries
  • A basic understanding of common file formats such as JSON, CSV, and Parquet
  • Familiarity with the Azure portal and foundational storage services
  • A general awareness of data concepts such as batch versus streaming processing and structured versus unstructured data

Target audience

This course is designed for professionals who are interested in working with the Databricks platform. It is well suited to trainee or current data analysts who have prior experience managing data but limited exposure to Databricks.

Learning Objectives

By the end of this course, learners will be able to:

  • Ingest, transform, and analyse large-scale datasets using Spark DataFrames, Spark SQL, and PySpark.
  • Navigate the Databricks workspace, manage clusters, and build and maintain Delta tables.
  • Design and run ETL pipelines, manage schema changes, apply data quality rules, and optimize Delta tables.
  • Orchestrate automated workflows using Lakeflow Jobs and pipelines.
  • Gain familiarity with Unity Catalog, Purview integration, and access management to work confidently in production-ready data environments.

Implement a Data Analytics Solution with Azure Databricks Course Content

Module 1: Explore Azure Databricks

Azure Databricks is a cloud service that provides a scalable platform for data analytics using Apache Spark.

  • Introduction
  • Get started with Azure Databricks
  • Identify Azure Databricks workloads
  • Understand key concepts
  • Data governance using Unity Catalog and Microsoft Purview
  • Exercise - Explore Azure Databricks
  • Module assessment
  • Summary
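
To give a flavour of the governance topics in this module, here is a minimal, hedged sketch of Unity Catalog's three-level namespace (catalog.schema.table) being used from a notebook. The catalog, schema, table, and group names are illustrative assumptions, not part of the course materials.

    # `spark` is the SparkSession available in every Databricks notebook.
    # Catalog, schema, table, and group names below are illustrative assumptions.

    # List the schemas in a catalog governed by Unity Catalog
    spark.sql("SHOW SCHEMAS IN main").show()

    # Query a table by its full three-level name
    spark.sql("SELECT * FROM main.sales.orders LIMIT 10").show()

    # Access management: grant read access on the table to a group
    spark.sql("GRANT SELECT ON TABLE main.sales.orders TO `data-analysts`")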

Module 2: Perform Data Analysis with Azure Databricks

Learn how to perform data analysis using Azure Databricks. Explore various data ingestion methods and how to integrate data from sources like Azure Data Lake and Azure SQL Database. This module guides you through using collaborative notebooks to perform exploratory data analysis (EDA), so you can visualize, manipulate, and examine data to uncover patterns, anomalies, and correlations.

  • Introduction
  • Ingest data with Azure Databricks
  • Data exploration tools in Azure Databricks
  • Data analysis using DataFrame APIs
  • Exercise - Explore data with Azure Databricks
  • Module assessment
  • Summary
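
As a taste of the exploratory analysis practised in this module, the sketch below ingests a CSV file and runs a few basic EDA steps with the DataFrame API. The storage path and column names are assumptions for illustration; the course labs supply their own datasets.

    # `spark` is the SparkSession provided in a Databricks notebook.
    orders = (
        spark.read
        .option("header", True)
        .option("inferSchema", True)
        .csv("abfss://raw@mydatalake.dfs.core.windows.net/orders/")  # hypothetical ADLS path
    )

    orders.printSchema()       # inspect the inferred column types
    orders.describe().show()   # summary statistics for numeric columns

    # Look for patterns or skew by grouping on a categorical column
    orders.groupBy("country").count().orderBy("count", ascending=False).show()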

Module 3: Use Apache Spark in Azure Databricks

Azure Databricks is built on Apache Spark and enables data engineers and analysts to run Spark jobs to transform, analyze and visualize data at scale.

  • Introduction
  • Get to know Spark
  • Create a Spark cluster
  • Use Spark in notebooks
  • Use Spark to work with data files
  • Visualize data
  • Exercise - Use Spark in Azure Databricks
  • Module assessment
  • Summary
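
The sketch below shows the DataFrame-and-SQL workflow this module practises: read a data file, transform it with the DataFrame API, then run the same aggregation through Spark SQL. The file path and column names are illustrative assumptions.

    from pyspark.sql import functions as F

    # Read a Parquet file into a DataFrame (placeholder path)
    sales = spark.read.parquet("/mnt/demo/sales.parquet")

    # Aggregate with the DataFrame API
    monthly = (
        sales
        .withColumn("month", F.date_trunc("month", F.col("order_date")))
        .groupBy("month")
        .agg(F.sum("amount").alias("revenue"))
    )

    # The same aggregation through Spark SQL via a temporary view
    sales.createOrReplaceTempView("sales")
    monthly_sql = spark.sql("""
        SELECT date_trunc('month', order_date) AS month, SUM(amount) AS revenue
        FROM sales
        GROUP BY 1
        ORDER BY 1
    """)

    # display() renders results with built-in charting in Databricks notebooks
    display(monthly_sql)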

Module 4: Manage data with Delta Lake

Delta Lake is a data management solution in Azure Databricks that provides ACID transactions, schema enforcement, and time travel, ensuring data consistency, integrity, and versioning.

  • Introduction
  • Get started with Delta Lake
  • Create Delta tables
  • Implement schema enforcement
  • Data versioning and time travel in Delta Lake
  • Data integrity with Delta Lake
  • Exercise - Use Delta Lake in Azure Databricks
  • Module assessment
  • Summary
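
To illustrate the Delta Lake features this module works through, here is a hedged sketch of creating a Delta table, seeing schema enforcement reject an incompatible write, and reading an earlier version with time travel. The table and column names are assumptions for illustration.

    # Create a small Delta table (table name is illustrative)
    df = spark.createDataFrame(
        [(1, "alice", 42.0), (2, "bob", 17.5)],
        schema="id INT, name STRING, amount DOUBLE",
    )
    df.write.format("delta").mode("overwrite").saveAsTable("demo.events")

    # Schema enforcement: appending a mismatched schema raises an error
    bad = spark.createDataFrame(
        [(3, "carol", "not-a-number")],
        schema="id INT, name STRING, amount STRING",
    )
    try:
        bad.write.format("delta").mode("append").saveAsTable("demo.events")
    except Exception as err:
        print("Rejected by schema enforcement:", type(err).__name__)

    # Time travel: read the table as it looked at an earlier version
    spark.read.option("versionAsOf", 0).table("demo.events").show()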

Module 5: Build Lakeflow Declarative Pipelines

Building Lakeflow Declarative Pipelines enables real-time, scalable, and reliable data processing using Delta Lake's advanced features in Azure Databricks.

  • Introduction
  • Explore Lakeflow Declarative Pipelines
  • Data ingestion and integration
  • Real-time processing
  • Exercise - Create a Lakeflow Declarative Pipeline
  • Module assessment
  • Summary
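
The sketch below is a minimal declarative pipeline written with the `dlt` Python API that Lakeflow Declarative Pipelines (formerly Delta Live Tables) builds on. The landing path, table names, and quality rule are illustrative assumptions, and this code runs as pipeline source code rather than as an ordinary notebook cell.

    import dlt
    from pyspark.sql import functions as F

    @dlt.table(comment="Raw orders ingested incrementally with Auto Loader")
    def orders_raw():
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/Volumes/demo/landing/orders")  # hypothetical landing path
        )

    @dlt.table(comment="Cleaned orders with a basic data quality rule applied")
    @dlt.expect_or_drop("valid_amount", "amount > 0")  # drop rows that fail the rule
    def orders_clean():
        return (
            dlt.read_stream("orders_raw")
            .withColumn("order_date", F.to_date("order_ts"))
        )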

Module 6: Deploy workloads with Lakeflow Jobs

Deploying workloads with Lakeflow Jobs involves orchestrating and automating complex data processing pipelines, machine learning workflows, and analytics tasks. In this module, you learn how to deploy workloads with Databricks Lakeflow Jobs.

  • Introduction
  • What are Lakeflow Jobs?
  • Understand key components of Lakeflow Jobs
  • Explore the benefits of Lakeflow Jobs
  • Deploy workloads using Lakeflow Jobs
  • Exercise - Create a Lakeflow Job
  • Module assessment
  • Summary
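
As one way of deploying such a workload programmatically, the hedged sketch below creates a scheduled job with the Databricks Python SDK (databricks-sdk). The job name, notebook path, cluster id, and cron expression are placeholders; in practice jobs are also commonly defined through the workspace UI or Databricks Asset Bundles.

    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service import jobs

    w = WorkspaceClient()  # reads credentials from the environment or a config profile

    created = w.jobs.create(
        name="daily-orders-etl",  # placeholder job name
        tasks=[
            jobs.Task(
                task_key="transform_orders",
                notebook_task=jobs.NotebookTask(
                    notebook_path="/Workspace/etl/transform_orders"  # placeholder path
                ),
                existing_cluster_id="<cluster-id>",  # placeholder cluster
            )
        ],
        schedule=jobs.CronSchedule(
            quartz_cron_expression="0 0 2 * * ?",  # run daily at 02:00
            timezone_id="UTC",
        ),
    )
    print(f"Created job {created.job_id}")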

Exams and assessments

There are no formal examinations within this course. Each module closes with a review and summary following the hands-on lab, quiz, and slide-deck delivery. This reinforces learning and signposts additional resources for continued learning and development.

Hands-on learning

Within this course there are opportunities for learners to engage in hands-on labs to support module learning.

In addition, each module includes a quiz to support knowledge retention.

Upcoming Dates

Dates and locations are available on request. Please contact us for the latest schedule.

Advance Your Career with Implement a Data Analytics Solution with Azure Databricks

Gain the skills you need to succeed. Enrol in Implement a Data Analytics Solution with Azure Databricks with Newto Training today.
