    DP-700: Implementing Data Engineering Solutions Using Fabric 2025

    Posted By: ELK1nG

    DP-700: Implementing Data Engineering Solutions Using Fabric
    Published 1/2025
    MP4 | Video: h264, 1920x1080 | Audio: AAC, 44.1 KHz
    Language: English | Size: 1.59 GB | Duration: 4h 28m

    Build on your existing DP-600 skills and learn how to manipulate PySpark dataframes in notebooks.

    What you'll learn

    Implement and manage an analytics solution

    Configure security and governance

    Ingest and transform data

    Monitor and optimize an analytics solution

    Requirements

    You will need to already be familiar with all of the requirements of Microsoft's DP-600 exam.

    This includes SQL and KQL.

    Description

    This course covers the additional content required for the DP-700 "Fabric Data Engineer Associate" certification exam, building on the knowledge you gained for the DP-600 exam.

    First of all, we will take a quick look around Fabric. Then we will look at using data pipelines: ingesting and copying data, and scheduling and monitoring data pipeline runs.

    Most of this Part 1 course is about manipulating data using PySpark and SQL. We'll have a look at loading and saving data using notebooks. We'll then manipulate dataframes by choosing which columns and rows to show, convert data types, and aggregate and sort dataframes. We will then transform data in a lakehouse, merging and joining data, together with identifying missing data or null values. We will then create objects, such as shortcuts and file partitioning, and optimize performance in dataflows and notebooks. Finally, we will look at other Fabric topics, including recommending settings in the Fabric admin portal.

    Prior knowledge of all of the topics in the DP-600 exam is assumed. That content is available in "DP-600: Implement Analytics Solutions using Microsoft Fabric", which is available on Udemy.

    Once you have completed the course, you will have a good knowledge of using notebooks to manipulate data using PySpark. With some practice and knowledge of some additional topics, you could even go for the official Microsoft certification DP-700 - wouldn't the "Microsoft Certified: Fabric Data Engineer Associate" certification look good on your CV or resume?

    I hope to see you in the course - why not have a look at what you could learn?
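
    To give a flavour of the notebook code the course works with, here is a minimal PySpark sketch of reading a lakehouse table, filtering it and saving the result back as a Delta table. The table and column names are placeholders for illustration, not objects from the course itself.

    # Minimal sketch of a Fabric notebook cell (PySpark). The names "sales",
    # "Amount" and "sales_clean" are placeholders, not course data.
    df = spark.read.table("sales")        # "spark" is already defined in a Fabric notebook

    # A simple transformation: keep only rows with a positive amount.
    df_clean = df.filter(df["Amount"] > 0)

    # Save the result back to the lakehouse as a Delta table.
    df_clean.write.mode("overwrite").format("delta").saveAsTable("sales_clean")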

    Overview

    Section 1: Introduction

    Lecture 1 Introduction

    Lecture 2 Welcome to Udemy

    Lecture 3 The Udemy Interface

    Lecture 4 Do you want auto-translated subtitles in more languages?

    Lecture 5 Curriculum

    Lecture 6 Resources

    Section 2: A look around Fabric

    Lecture 7 Creating a Fabric capacity and configuring Fabric-enabled workspace settings

    Lecture 8 Identify requirements for a Fabric solution and manage Fabric capacity

    Lecture 9 A quick tour of Fabric

    Section 3: Using data pipelines

    Lecture 10 Ingest data by using a data pipeline, and adding other activities

    Lecture 11 Copy data by using a data pipeline

    Lecture 12 Schedule data pipelines and monitor data pipeline runs

    Section 4: Loading and saving data using notebooks

    Lecture 13 Ingesting data into a lakehouse using a local upload

    Lecture 14 Choose an appropriate method for copying to a Lakehouse or Warehouse

    Lecture 15 Ingesting data using a notebook, and copying to a table

    Lecture 16 Saving data to a file or Lakehouse table

    Lecture 17 Loading data from a table in PySpark and SQL, and manipulating the results

    Lecture 18 Practice Activity Number 1

    Lecture 19 Practice Activity Number 1 - The Solution

    Section 5: Manipulating dataframes - choosing columns and rows

    Lecture 20 Reducing the number of columns shown

    Lecture 21 Filtering data with: where, limit and tail

    Lecture 22 Enriching data by adding new columns

    Lecture 23 Using Functions

    Lecture 24 More advanced filtering
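
    As a rough illustration of the operations in this section, the sketch below selects columns, filters and limits rows, and adds a derived column. The dataframe and column names are assumptions for the example only.

    from pyspark.sql import functions as F

    # "df" is assumed to be a dataframe loaded earlier; all column names are illustrative.
    subset = df.select("CustomerID", "Country", "Amount")                   # reduce the columns shown
    filtered = subset.where(F.col("Country") == "UK").limit(10)             # filter rows and cap the output
    enriched = filtered.withColumn("AmountWithVAT", F.col("Amount") * 1.2)  # enrich with a new column
    enriched.show()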

    Section 6: Converting data types, aggregating and sorting dataframes

    Lecture 25 Converting data types

    Lecture 26 Importing data using an explicit data structure

    Lecture 27 Formatting dates as strings

    Lecture 28 Aggregating and re-filtering data

    Lecture 29 Sorting the results

    Lecture 30 Using all 6 SQL Clauses
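
    For orientation, a sketch along these lines covers the section's themes: casting a data type, aggregating with a sort, and an equivalent query written with all six SQL clauses. The dataframe and column names are illustrative assumptions, not the course's datasets.

    from pyspark.sql import functions as F

    # Convert a string column to a date, then aggregate and sort.
    typed = df.withColumn("OrderDate", F.to_date("OrderDate", "yyyy-MM-dd"))

    summary = (typed.groupBy("Country")
                    .agg(F.sum("Amount").alias("TotalAmount"))
                    .orderBy(F.desc("TotalAmount")))

    # The same idea in SQL, using all six clauses:
    typed.createOrReplaceTempView("orders")
    spark.sql("""
        SELECT   Country, SUM(Amount) AS TotalAmount
        FROM     orders
        WHERE    Amount > 0
        GROUP BY Country
        HAVING   SUM(Amount) > 1000
        ORDER BY TotalAmount DESC
    """).show()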

    Section 7: Transform data in a lakehouse

    Lecture 31 Merging data

    Lecture 32 Identifying and resolving duplicate data

    Lecture 33 Joining data using an Inner join

    Lecture 34 Joining data using other joins

    Lecture 35 Identifying missing data or null values

    Lecture 36 Practice Activity Number 6 - Implementing bridge tables for a lakehouse

    Lecture 37 Practice Activity Number 6 - Solution

    Lecture 38 Schedule notebooks
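
    As a hedged sketch of this section's transformations, the example below removes duplicates, joins two dataframes, and then finds and fills null values. The "orders" and "customers" dataframes and their columns are assumptions for illustration only.

    from pyspark.sql import functions as F

    # "orders" and "customers" are assumed dataframes; column names are illustrative.
    orders_dedup = orders.dropDuplicates(["OrderID"])                       # resolve duplicate rows

    inner = orders_dedup.join(customers, on="CustomerID", how="inner")      # inner join
    left  = orders_dedup.join(customers, on="CustomerID", how="left")       # keep unmatched orders

    # Identify missing data / null values introduced by the left join, then fill them.
    missing = left.filter(F.col("CustomerName").isNull())
    filled  = left.fillna({"CustomerName": "Unknown"})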

    Section 8: Transform data in a data warehouse

    Lecture 39 Implement Type 1 and Type 2 slowly changing dimensions - Theory

    Lecture 40 Implement Type 0 slowly changing dimensions - Practice Example

    Lecture 41 Implement Type 1 and Type 2 slowly changing dimensions - Practical Example
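
    To make the Type 1 / Type 2 distinction concrete, here is a minimal sketch of a Type 1 update using a Delta merge. The "DimCustomer" table, the "updates" dataframe and the columns are placeholders for illustration, not the course's worked example.

    from delta.tables import DeltaTable

    # A Type 1 dimension simply overwrites changed attributes in place.
    # "DimCustomer", "updates" and the column names are placeholders.
    target = DeltaTable.forName(spark, "DimCustomer")

    (target.alias("t")
           .merge(updates.alias("s"), "t.CustomerID = s.CustomerID")
           .whenMatchedUpdate(set={"City": "s.City"})   # Type 1: overwrite the attribute
           .whenNotMatchedInsertAll()
           .execute())

    # A Type 2 dimension would instead expire the current row (for example by setting an
    # EndDate or IsCurrent flag) and insert a new row carrying the changed values.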

    Section 9: Create objects

    Lecture 42 Create and manage shortcuts

    Lecture 43 Implement file partitioning for analytics workloads using a pipeline

    Lecture 44 Implement file partitioning for analytics workloads - data is in a lakehouse
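
    As a small sketch of the notebook side of file partitioning, a partitioned write might look like the following; the dataframe, the output path and the partition columns are illustrative assumptions.

    # File partitioning when writing from a notebook; path and column names are illustrative.
    (df.write.mode("overwrite")
       .partitionBy("Year", "Month")
       .parquet("Files/sales_partitioned"))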

    Section 10: Optimize performance

    Lecture 45 Identify and resolve data loading performance bottlenecks in dataflows

    Lecture 46 Implement performance improvements in dataflows

    Lecture 47 Identify and resolve data loading performance bottlenecks in notebooks

    Lecture 48 Implement performance improvements in notebooks, inc. V-Order

    Lecture 49 Identify and resolve issues with Delta table file: optimized writes
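
    For reference, notebook-side tuning in this area usually revolves around a couple of Spark session settings plus table maintenance. The sketch below is an assumption-laden example: the configuration keys are the ones Microsoft documents for Fabric runtimes at the time of writing and may change, and the table name is a placeholder.

    # Session-level settings commonly discussed for Delta write performance in Fabric notebooks.
    # The property names below follow Microsoft's Fabric documentation at the time of writing,
    # but they may differ between runtimes - check the current docs before relying on them.
    spark.conf.set("spark.sql.parquet.vorder.enabled", "true")             # V-Order
    spark.conf.set("spark.microsoft.delta.optimizeWrite.enabled", "true")  # optimized writes

    # Compacting small files in an existing Delta table ("sales_clean" is a placeholder):
    spark.sql("OPTIMIZE sales_clean")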

    Section 11: Other Fabric topics

    Lecture 50 Recommend settings in the Fabric admin portal

    Lecture 51 Implement workspace and item-level access controls for Fabric items

    Lecture 52 Installing the Microsoft Fabric Capacity Metrics app

    Lecture 53 Using the Microsoft Fabric Capacity Metrics app - Manage Fabric capacity

    Section 12: Congratulations for completing the course

    Lecture 54 What's Next?

    Lecture 55 Congratulations for completing the course

    This course is for you if you want to implement data engineering solutions using Microsoft Fabric. You will be able to use PySpark to query streaming data. By the end of this course, after taking the official Practice Tests, you could enter (and hopefully pass) Microsoft's official DP-700 exam. Wouldn't the "Implementing Data Engineering Solutions Using Microsoft Fabric" certification look good on your CV or resume?