
Creating Data Pipelines Using AWS & Confluent Kafka Platform

Posted By: ELK1nG
Published 5/2025
MP4 | Video: h264, 1920x1080 | Audio: AAC, 44.1 KHz
Language: English | Size: 737.71 MB | Duration: 1h 12m

Hands-on course: build a real-time data streaming pipeline from Confluent to Lambda to DynamoDB, from scratch

What you'll learn

Get a high-level overview of building data pipelines using Confluent Kafka and AWS services (Lambda, DynamoDB)

Build an end-to-end, real-time streaming data pipeline using the AWS and Confluent platforms

Build an understanding of popular AWS services such as Lambda, DynamoDB, IAM, and CloudWatch

Get hands-on practice streaming data from Confluent Kafka all the way to a database on AWS

Requirements

No programming experience needed. You should be able to create free-tier AWS and Confluent accounts for hands-on practice.

Description

In this hands-on, project-based course, you'll learn how to build a cloud-native, real-time streaming data pipeline using the powerful combination of Confluent Kafka and AWS services. Designed for developers, data engineers, and technology leaders, the course takes you step by step through the creation of a fully functional data pipeline, from sourcing to storage, using some of the most in-demand technologies in the cloud ecosystem.

You'll start by setting up free-tier accounts on both Confluent Cloud and AWS. From there, you'll build and configure a Kafka cluster, create topics, and use connectors to manage data flow. On the AWS side, you'll create and configure AWS Lambda functions (Python 3.10+) to consume messages from Kafka topics, parse them, and insert them into a DynamoDB table.

By the end of the course, you will have completed an end-to-end, event-driven architecture with real-time streaming capabilities. We'll also walk through how to monitor and verify your pipeline using CloudWatch Logs, and how to responsibly clean up your resources to avoid unnecessary charges. The course will build your confidence to start using other AWS and Confluent services and to build real-time streaming applications in your current or future role. For technology leaders, it will jump-start your cloud thought process and deepen your understanding of what goes into building cloud-native data pipelines.

Whether you're a beginner, data engineer, solution architect, product owner, product manager, or technology leader looking to better understand streaming data architecture in action, this course provides both practical skills and architectural insights. It will help those looking to switch careers add a real-life project to their resume and boost their hands-on AWS cloud technical and architecture skills. Participants will upskill by understanding, up close, the nuances of setting up a cloud-native real-time streaming data pipeline: its issues, risks, challenges, benefits, and shortcomings.
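
To make the architecture concrete, here is a minimal sketch of the kind of Lambda handler the pipeline ends up with, assuming a hypothetical DynamoDB table named kafka_users and JSON-encoded messages from the datagen USERS quickstart. Lambda delivers self-managed Kafka records grouped by topic-partition, with message values base64-encoded:

```python
import base64
import json

import boto3

# Hypothetical table name; the course creates its own table later on.
table = boto3.resource("dynamodb").Table("kafka_users")

def lambda_handler(event, context):
    # Records arrive grouped by "topic-partition"; each value is a
    # base64-encoded message body.
    for records in event["records"].values():
        for record in records:
            payload = json.loads(base64.b64decode(record["value"]))
            table.put_item(Item=payload)
    return {"processed": sum(len(r) for r in event["records"].values())}
```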

Overview

Section 1: Introduction

Lecture 1 Introduction to Real Time Streaming Cloud Native Data Pipeline Architecture

Section 2: Confluent Cloud Account Setup

Lecture 2 Setup Confluent.io Cloud Account

Section 3: Review Cluster on Confluent & Create Topic

Lecture 3 Confluent - Create Topic & API Key
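
The course sets this up in the Confluent Cloud UI; as a rough programmatic equivalent, a topic can also be created with the confluent-kafka Python client, using placeholder bootstrap server and API-key credentials:

```python
from confluent_kafka.admin import AdminClient, NewTopic

# Placeholder endpoint and credentials; substitute the values from
# your own Confluent Cloud cluster and API key.
admin = AdminClient({
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

# create_topics is asynchronous; wait on the returned futures.
futures = admin.create_topics([NewTopic("users_topic", num_partitions=6)])
for topic, future in futures.items():
    future.result()
    print(f"Created topic {topic}")
```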

Section 4: Create Data Gen Source Connector and load the Confluent Topic

Lecture 4 Create Data Gen Source Connector, load Topic & validate
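
For orientation, the fully managed Datagen source connector is driven by a small configuration. The settings below are illustrative (expressed here as a Python dict) and assume the USERS quickstart writing JSON to the topic created earlier:

```python
# Illustrative configuration for Confluent Cloud's managed Datagen
# source connector; key names follow the managed-connector conventions.
datagen_config = {
    "name": "DatagenSourceUsers",
    "connector.class": "DatagenSource",
    "kafka.auth.mode": "KAFKA_API_KEY",
    "kafka.api.key": "<API_KEY>",
    "kafka.api.secret": "<API_SECRET>",
    "kafka.topic": "users_topic",
    "output.data.format": "JSON",
    "quickstart": "USERS",
    "tasks.max": "1",
}
```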

Section 5: Create & Setup AWS Account

Lecture 5 Create & Setup AWS Account

Section 6: Create Lambda Function & Connect to Confluent Topic

Lecture 6 Initial Lambda Setup, Test & CloudWatch Exploration
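
A quick way to test before the Kafka connection exists is to invoke the function with a hand-built event shaped like what the event source mapping will later deliver (the function name and field values below are hypothetical):

```python
import base64
import json

import boto3

# Encode a fake message body the way Lambda's Kafka integration does.
value = base64.b64encode(json.dumps(
    {"registertime": 1715000000000, "userid": "User_1",
     "regionid": "Region_5", "gender": "FEMALE"}
).encode()).decode()

sample_event = {
    "eventSource": "SelfManagedKafka",
    "records": {
        "users_topic-0": [{
            "topic": "users_topic", "partition": 0, "offset": 0,
            "timestamp": 1715000000000, "timestampType": "CREATE_TIME",
            "value": value, "headers": [],
        }]
    },
}

# Invoke the hypothetically named function; its output shows up in
# the function's CloudWatch log group.
boto3.client("lambda").invoke(
    FunctionName="kafka-consumer-fn",
    Payload=json.dumps(sample_event).encode(),
)
```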

Lecture 7 Setup Lambda-Confluent connection (ESM), use Secrets Manager & IAM role
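
The course wires this up in the AWS console; as a sketch of the same step in code, an event source mapping (ESM) for Confluent can be created with boto3, pointing BASIC_AUTH at a Secrets Manager secret that holds the API key and secret. All ARNs and endpoints below are placeholders, and the function's IAM role needs secretsmanager:GetSecretValue:

```python
import boto3

# BASIC_AUTH corresponds to SASL/PLAIN, which is what Confluent Cloud
# API keys use for authentication.
boto3.client("lambda").create_event_source_mapping(
    FunctionName="kafka-consumer-fn",
    Topics=["users_topic"],
    StartingPosition="LATEST",
    BatchSize=100,
    SelfManagedEventSource={
        "Endpoints": {
            "KAFKA_BOOTSTRAP_SERVERS": [
                "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092"
            ]
        }
    },
    SourceAccessConfigurations=[{
        "Type": "BASIC_AUTH",
        # Placeholder ARN; the secret stores {"username": ..., "password": ...}.
        "URI": "arn:aws:secretsmanager:us-east-1:123456789012:secret:confluent-api-key",
    }],
)
```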

Lecture 8 Test Data flow from Confluent source connector to Topic to AWS Lambda Function

Section 7: Parse event using Python Lambda code & write to CloudWatch Logs

Lecture 9 Parse event & write data elements to CloudWatch
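
A minimal sketch of this parsing step, assuming the datagen USERS field names: decode each record, pull out the data elements, and log them. Everything written through the logger lands in the function's CloudWatch log group:

```python
import base64
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    for records in event["records"].values():
        for record in records:
            user = json.loads(base64.b64decode(record["value"]))
            # Each log line is written to CloudWatch Logs for inspection.
            logger.info("userid=%s regionid=%s gender=%s",
                        user.get("userid"), user.get("regionid"),
                        user.get("gender"))
```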

Section 8: Create DynamoDB Table & Update Lambda function code to insert data into it

Lecture 10 Create DynamoDB table & Explore NoSQL Capabilities
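
As a sketch, the table can also be created with boto3 instead of the console; the table name and key are hypothetical choices, and on-demand billing avoids provisioning capacity on the free tier:

```python
import boto3

# Hypothetical table keyed on the record's "userid" attribute.
table = boto3.resource("dynamodb").create_table(
    TableName="kafka_users",
    KeySchema=[{"AttributeName": "userid", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "userid", "AttributeType": "S"}],
    BillingMode="PAY_PER_REQUEST",
)
# Block until the table is ready to accept writes.
table.wait_until_exists()
```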

Lecture 11 Update Lambda function to insert incoming data to DynamoDB table
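
One way to write this update, shown as a hedged sketch with explicit field mapping and DynamoDB's batch writer, which buffers individual puts into efficient BatchWriteItem calls; table and field names follow the earlier hypothetical examples:

```python
import base64
import json

import boto3

table = boto3.resource("dynamodb").Table("kafka_users")

def lambda_handler(event, context):
    # batch_writer groups individual puts into batched writes.
    with table.batch_writer() as batch:
        for records in event["records"].values():
            for record in records:
                user = json.loads(base64.b64decode(record["value"]))
                batch.put_item(Item={
                    "userid": user["userid"],
                    "regionid": user.get("regionid"),
                    "gender": user.get("gender"),
                    "registertime": user.get("registertime"),
                })
```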

Section 9: Run the Data Pipeline End to End & Verify

Lecture 12 Run the Data Pipeline End to End and Validate all steps!
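
Verification at this stage can be as simple as sampling the table to confirm records are flowing end to end (a sketch, assuming the hypothetical kafka_users table from earlier):

```python
import boto3

table = boto3.resource("dynamodb").Table("kafka_users")

# Pull a handful of items to confirm the pipeline is landing data.
resp = table.scan(Limit=5)
for item in resp["Items"]:
    print(item)
print("items sampled:", resp["Count"])
```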

Section 10: Summary & Wrap Up!

Lecture 13 Summary & Wrap up!

Who this course is for

Beginner cloud developers looking to build AWS + Confluent Kafka skills

Entry-level cloud and data engineers looking for a real-life project to add to their resumes

Mid-level and senior data engineers looking to enhance their skills in the Confluent + AWS platform areas

Product owners and product managers looking to understand the capabilities of the Confluent and AWS real-time streaming platforms

Technology leaders looking to understand the capabilities, shortcomings, and effort it takes to build a real-time data streaming pipeline