DP-3011: Implementing a Data Analytics Solution with Azure Databricks

Length: 1 Day     Cost: $895 + GST

Learn how to harness Apache Spark and powerful clusters running on the Azure Databricks platform to run large-scale data engineering workloads in the cloud.


Microsoft Applied Skills

Microsoft Applied Skills are scenario-based credentials that validate targeted skills. They are an efficient and trusted way to identify and deepen proficiency in specific skill sets, with interactive training and assessment that let learners demonstrate proficiency by completing real-world tasks.

Applied Skills can help students prepare for the workforce by providing them with real-world problem-solving experience and validation of their skills.



Audience

This course is intended for data engineers.


Prerequisites

Fundamental knowledge of data analytics concepts.


After completing this course, students will be able to:

  • Provision an Azure Databricks workspace
  • Identify core workloads and personas for Azure Databricks
  • Describe key concepts of an Azure Databricks solution
  • Describe key elements of the Apache Spark architecture
  • Create and configure a Spark cluster
  • Describe use cases for Spark
  • Use Spark to process and analyse data stored in files (see the sketch after this list)
  • Use Spark to visualise data
  • Describe core features and capabilities of Delta Lake
  • Create and use Delta Lake tables in Azure Databricks
  • Create Spark catalog tables for Delta Lake data
  • Use Delta Lake tables for streaming data
  • Create and configure SQL Warehouses in Azure Databricks
  • Create databases and tables
  • Create queries and dashboards
  • Describe how Azure Databricks notebooks can be run in a pipeline
  • Create an Azure Data Factory linked service for Azure Databricks
  • Use a Notebook activity in a pipeline
  • Pass parameters to a notebook
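
To give a flavour of the hands-on labs, the sketch below is not course material; it is a minimal PySpark example of the kind of work these objectives describe, written as it would run inside an Azure Databricks notebook (where `spark` and `display()` are already provided). The file path, column names and table name are assumptions for illustration only.

```python
# Minimal sketch: read files with Spark, write a Delta Lake table, query it with Spark SQL.
# Assumes a Databricks notebook, so `spark` and `display()` already exist.
from pyspark.sql import functions as F

# Read raw CSV files from a hypothetical storage location
orders = (spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("/databricks-datasets/example/orders/*.csv"))

# A simple transformation: total sales per product
sales_by_product = (orders
                    .groupBy("product")
                    .agg(F.sum("amount").alias("total_sales")))

# Persist the result as a Delta Lake table registered in the catalog
(sales_by_product.write
 .format("delta")
 .mode("overwrite")
 .saveAsTable("sales_by_product"))

# Query the Delta table with Spark SQL and visualise the result in the notebook
display(spark.sql(
    "SELECT product, total_sales FROM sales_by_product ORDER BY total_sales DESC"))
```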

Course Outline

  • Explore Azure Databricks
  • Use Apache Spark in Azure Databricks
  • Use Delta Lake in Azure Databricks
  • Use SQL Warehouses in Azure Databricks
  • Run Azure Databricks Notebooks with Azure Data Factory
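
The final module covers running a notebook from an Azure Data Factory pipeline. On the notebook side, values passed by the Data Factory Notebook activity's base parameters are read through widgets, and a result can be returned to the pipeline. The sketch below is a minimal illustration only; the parameter name `process_date` and the table names are assumptions, not part of the course labs.

```python
# Minimal sketch of the Databricks notebook side of an ADF Notebook activity.
# Parameter and table names here are hypothetical.

# Declare a widget with a default so the notebook also runs interactively
dbutils.widgets.text("process_date", "2024-01-01")

# Read the value supplied by the pipeline's base parameters (or the default)
process_date = dbutils.widgets.get("process_date")

# Use the parameter in a query against a Delta table and append the result
daily = spark.sql(
    f"SELECT * FROM sales_by_product_daily WHERE sale_date = '{process_date}'")
daily.write.format("delta").mode("append").saveAsTable("daily_snapshot")

# Return a value to the pipeline; Data Factory surfaces it as runOutput
dbutils.notebook.exit(f"processed {daily.count()} rows for {process_date}")
```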