Course Description
This two-day introductory course is an appropriate entry point for learning Data Engineering with Databricks.
The training covers the following topics:
Data Ingestion with Delta Lake
This course helps Data Engineers deepen their understanding of Delta Lake so they can handle data ingestion, transformation, and management with ease. Using the latest features of Delta Lake, learners explore real-world applications to enhance data workflows, optimize performance, and ensure data reliability.
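For a flavor of this material (an illustrative sketch only, not official course code; the source path and table names are hypothetical), ingestion into a Delta table followed by a basic transformation looks like this in PySpark:

```python
# Minimal sketch, assuming a Databricks notebook where `spark` is already available.
# The source path and table names below are hypothetical.
from pyspark.sql.functions import col, current_timestamp

# Ingest: load raw JSON files and persist them as a Delta table
raw = spark.read.format("json").load("/Volumes/demo/raw/orders/")   # hypothetical source path
(raw.withColumn("ingested_at", current_timestamp())
    .write.format("delta")
    .mode("overwrite")
    .saveAsTable("demo.default.bronze_orders"))                     # hypothetical table name

# Transform: basic cleanup written against the Delta table
bronze = spark.table("demo.default.bronze_orders")
silver = (bronze
          .filter(col("order_id").isNotNull())
          .dropDuplicates(["order_id"]))
silver.write.format("delta").mode("overwrite").saveAsTable("demo.default.silver_orders")
```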
Deploy Workloads with Databricks Workflows
This course is designed for data engineering professionals who want to use Databricks for streamlined, efficient data workflows. By the end of this course, you’ll be well-versed in using Databricks Jobs and Workflows to automate, manage, and monitor complex data pipelines. The course includes hands-on labs and best practices to build a deep understanding and the practical ability to manage workflows in production environments.
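The course itself works primarily through the Jobs UI; purely as an illustration of the same concepts, here is a hedged sketch of defining a scheduled notebook job with the Databricks SDK for Python (the job name, notebook path, cluster ID, and cron expression are hypothetical):

```python
# Hedged sketch using the Databricks SDK for Python (pip install databricks-sdk);
# job name, notebook path, cluster ID, and schedule below are hypothetical.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()  # picks up workspace credentials from the environment

job = w.jobs.create(
    name="nightly-orders-pipeline",                              # hypothetical job name
    tasks=[
        jobs.Task(
            task_key="ingest_orders",
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/demo/ingest_orders"),
            existing_cluster_id="1234-567890-abcde123",          # hypothetical cluster
        )
    ],
    schedule=jobs.CronSchedule(
        quartz_cron_expression="0 0 2 * * ?",                    # run daily at 02:00
        timezone_id="UTC",
    ),
)
print(f"Created job {job.job_id}")
```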
Build Data Pipelines with Delta Live Tables
This comprehensive course is designed to help participants understand the Medallion Architecture using Delta Live Tables. Participants will learn how to create robust and efficient data pipelines for structured and unstructured data, understand the nuances of managing data quality, and unlock the potential of Delta Live Tables. By the end of this course, participants will have hands-on experience building pipelines, troubleshooting issues, and monitoring their data flows within the Delta Live Tables environment.
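As a small taste of what such a pipeline looks like (a minimal sketch only; the table names, source path, and quality rule are hypothetical), a bronze/silver pair of Delta Live Tables can be declared in Python:

```python
# Minimal Delta Live Tables sketch in Python; runs inside a DLT pipeline where
# `spark` is provided. Table names, source path, and the expectation are hypothetical.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw orders ingested as-is (bronze layer).")
def bronze_orders():
    return spark.read.format("json").load("/Volumes/demo/raw/orders/")  # hypothetical path

@dlt.table(comment="Cleaned orders (silver layer).")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # drop rows failing the rule
def silver_orders():
    return dlt.read("bronze_orders").filter(col("amount") > 0)
```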
Data Management and Governance with Unity Catalog
In this course, you’ll learn about data management and governance with Databricks Unity Catalog. It covers foundational data governance concepts, the complexities of managing data lakes, Unity Catalog’s architecture, security, and administration, and advanced topics such as fine-grained access control, data segregation, and privilege management.
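For a sense of the privilege-management syntax this material works with (a sketch only; the catalog, schema, table, and group names are hypothetical), Unity Catalog GRANT statements can be issued from a notebook:

```python
# Hedged sketch of Unity Catalog privilege management from a Databricks notebook;
# catalog, schema, table, and group names are hypothetical, and `spark` is provided.
spark.sql("CREATE CATALOG IF NOT EXISTS demo")
spark.sql("CREATE SCHEMA IF NOT EXISTS demo.sales")

# Grant read access on the schema's objects to an analysts group
spark.sql("GRANT USE CATALOG ON CATALOG demo TO `data_analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA demo.sales TO `data_analysts`")
spark.sql("GRANT SELECT ON TABLE demo.sales.orders TO `data_analysts`")

# Inspect current privileges on the table
spark.sql("SHOW GRANTS ON TABLE demo.sales.orders").show(truncate=False)
```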
* This course prepares students for the Data Engineer Associate certification exam and provides the requisite knowledge for the follow-on course, Advanced Data Engineering with Databricks.
Prerequisites
- Beginner familiarity with basic cloud concepts (virtual machines, object storage, identity management)
- Ability to perform basic code development tasks (create compute, run code in notebooks, use basic notebook operations, import repos from git, etc.)
- Intermediate familiarity with basic SQL concepts (CREATE, SELECT, INSERT, UPDATE, DELETE, WHERE, GROUP BY, JOIN, etc.), including aggregate functions, filtering and sorting, indexes, tables, and views
- Basic knowledge of Python programming, the Jupyter notebook interface, and PySpark fundamentals
Testimonials
The course was comprehensive and interactive, and the trainer really understood the content. I have been able to implement much of what I learned into my daily work activities and have saved a lot of time.
, Advocate Health Care
Without a doubt one of the best professional development courses I’ve taken. I would highly recommend.
, KPERS
Hands-down the best technical training I have taken! The methods of progressively building reports, adding complexity and features as time goes on, were fantastic.
, Cirrus Aircraft
Course Modules
Day 1
Data Ingestion with Delta Lake
Delta Lake and Data Objects
Set Up and Load Delta Tables
Basic Transformations
Load Data Lab
Cleaning Data
Complex Transformations
SQL UDFs
Advanced Delta Lake Features
Manipulate Delta Tables Lab
Deploy Workloads with Databricks Workflows
Introduction to Workflows
Jobs Compute
Scheduling Tasks with the Jobs UI
Workflows Lab
Jobs Features
Explore Scheduling Options
Conditional Tasks and Repairing Runs
Modular Orchestration
Databricks Workflows Best Practices