
Topic author: Creating Data Pipelines Using Aws & Confluent Kafka Platform  (Read 47 times)

0 Members and 1 Guest are viewing this topic.

Offline mitsumi

  • Sub-Administrator
  • ****
  • Posts: 121842
  • Karma: +0/-0

Creating Data Pipelines Using Aws & Confluent Kafka Platform
Published 5/2025
MP4 | Video: h264, 1920x1080 | Audio: AAC, 44.1 kHz
Language: English | Size: 737.71 MB | Duration: 1h 12m

Hands-on course to build a real-time data streaming pipeline from Confluent to Lambda to DynamoDB, from scratch

What you'll learn
Get a high-level overview of building data pipelines using Confluent Kafka and AWS services (Lambda, DynamoDB)
Build an end-to-end, real-time streaming data pipeline using the AWS + Confluent platforms
Build an understanding of popular AWS services such as Lambda, DynamoDB, IAM, and CloudWatch
Get hands-on practice streaming data from Confluent Kafka all the way to a database on AWS
Requirements
No programming experience needed; you should be able to create free-tier AWS and Confluent accounts for hands-on practice
Description
In this hands-on, project-based course, you'll learn how to build a cloud-native, real-time streaming data pipeline using the powerful combination of Confluent Kafka and AWS services. Designed for developers, data engineers, and technology leaders, this course takes you step by step through the creation of a fully functional data pipeline, from sourcing to storage, leveraging some of the most in-demand technologies in the cloud ecosystem.

You'll start by setting up free-tier accounts on both Confluent Cloud and AWS. From there, you'll build and configure a Kafka cluster, create topics, and use connectors to manage data flow. On the AWS side, you'll create and configure AWS Lambda functions (Python 3.10+) to consume messages from Kafka topics, parse them, and insert them into a DynamoDB table.

By the end of the course, you will have completed an end-to-end, event-driven architecture with real-time streaming capabilities. We'll also walk through how to monitor and verify your pipeline using CloudWatch Logs, and how to responsibly clean up your resources to avoid unnecessary charges. This course will build your confidence in using other AWS and Confluent services and in building real-time streaming applications in your current or future job role. For leaders, it will jump-start your cloud thinking and deepen your understanding of what goes into building cloud-native data pipelines.

Whether you're a beginner, data engineer, solution architect, product owner, product manager, or a technology leader looking to better understand streaming data architecture in action, this course provides both practical skills and architectural insights. It will help those looking to switch careers, add a real-life project to their resume, and boost their hands-on AWS cloud technical and architecture skills.

This course will help participants upskill by understanding, up close, the nuances of setting up a cloud-native, real-time streaming data pipeline: its issues, risks, challenges, benefits, and shortcomings.
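The flow the description outlines (Kafka event in, parse, DynamoDB write) can be sketched as a minimal Lambda handler. The event shape follows AWS's Kafka event source mapping format; the `orders` table name and the `table` injection parameter are illustrative assumptions, not from the course:

```python
import base64
import json

def handler(event, context, table=None):
    """Minimal sketch of the Lambda consumer described above.

    `event` follows the shape AWS uses for Kafka event source mappings:
    {"records": {"<topic>-<partition>": [{"value": "<base64 JSON>", ...}]}}.
    `table` is a DynamoDB Table resource; it is created lazily so the
    parsing logic can run (and be tested) without AWS credentials.
    """
    if table is None:
        import boto3  # only needed when actually writing to DynamoDB
        table = boto3.resource("dynamodb").Table("orders")  # hypothetical table name
    written = 0
    for records in event.get("records", {}).values():
        for record in records:
            # Message values arrive base64-encoded; decode, then parse the JSON.
            payload = json.loads(base64.b64decode(record["value"]))
            table.put_item(Item=payload)
            written += 1
    return {"written": written}
```

Passing the table in explicitly also makes the handler easy to unit-test with a stub before deploying.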
Overview
Section 1: Introduction
Lecture 1 Introduction to Real Time Streaming Cloud Native Data Pipeline Architecture
Section 2: Confluent Cloud Account Setup
Lecture 2 Setup Confluent.io cloud Account
Section 3: Review Cluster on Confluent & Create Topic
Lecture 3 Confluent - Create Topic & API Key
Section 4: Create Data Gen Source Connector and load the Confluent Topic
Lecture 4 Create Data Gen Source Connector, load Topic & validate
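Lecture 4's Datagen source connector is configured in Confluent Cloud with a JSON payload along these lines. The key names match Confluent Cloud's managed DatagenSource connector, but the connector name, topic, quickstart template, and placeholder credentials here are illustrative:

```json
{
  "name": "DatagenSourceConnector_orders",
  "connector.class": "DatagenSource",
  "kafka.api.key": "<CLUSTER_API_KEY>",
  "kafka.api.secret": "<CLUSTER_API_SECRET>",
  "kafka.topic": "orders",
  "output.data.format": "JSON",
  "quickstart": "ORDERS",
  "tasks.max": "1"
}
```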
Section 5: Create & Setup AWS Account
Lecture 5 Create & Setup AWS Account
Section 6: Create Lambda Function & Connect to Confluent Topic
Lecture 6 Initial Lambda Setup, Test & Cloudwatch Exploration
Lecture 7 Setup Lambda-Confluent connection (ESM), use Secrets Mgr & IAM role
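A configuration sketch of the event source mapping (ESM) step from Lecture 7, using the AWS CLI with a self-managed (Confluent) Kafka endpoint and credentials stored in Secrets Manager. The function name, bootstrap server, topic, and secret ARN are placeholders:

```
# Connect the Lambda function to the Confluent topic via an event source mapping.
# BASIC_AUTH points at a Secrets Manager secret holding the cluster API key/secret;
# the Lambda execution role needs permission to read that secret.
aws lambda create-event-source-mapping \
  --function-name kafka-consumer-fn \
  --topics orders \
  --batch-size 100 \
  --self-managed-event-source '{"Endpoints":{"KAFKA_BOOTSTRAP_SERVERS":["pkc-xxxxx.us-east-1.aws.confluent.cloud:9092"]}}' \
  --source-access-configurations '[{"Type":"BASIC_AUTH","URI":"arn:aws:secretsmanager:us-east-1:123456789012:secret:confluent-creds"}]'
```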
Lecture 8 Test Data flow from Confluent source connector to Topic to AWS Lambda Function
Section 7: Parse event using Python Lambda code & write to CloudWatch Logs
Lecture 9 Parse event & write Data elements to Cloudwatch
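Lecture 9's parsing step boils down to base64-decoding each record value and printing the fields you care about, since anything a Lambda function prints ends up in CloudWatch Logs. A minimal sketch (the event layout follows the Kafka ESM format; field names are illustrative):

```python
import base64
import json

def parse_record(record):
    """Decode one Kafka record: the ESM delivers the value as base64-encoded JSON."""
    return json.loads(base64.b64decode(record["value"]))

def log_event(event):
    """Parse every record in the Lambda event and echo it to CloudWatch Logs."""
    parsed = []
    for records in event.get("records", {}).values():
        for record in records:
            item = parse_record(record)
            print("received:", json.dumps(item))  # visible in CloudWatch Logs
            parsed.append(item)
    return parsed
```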
Section 8: Create DynamoDB Table & Update Lambda function code to insert data into it
Lecture 10 Create DynamoDB table & Explore NoSQL Capabilities
Lecture 11 Update Lambda function to insert incoming data to DynamoDB table
Section 9: Run the Data Pipeline End to End & Verify
Lecture 12 Run the Data Pipeline End to End and Validate all steps!
Section 10: Summary & Wrap Up!
Lecture 13 Summary & Wrap up!
Who this course is for:
Beginner cloud developers looking to build AWS + Confluent Kafka skills
Entry-level cloud and data engineers looking for a real-life project to add to their resumes
Mid-level and senior data engineers looking to enhance their skills in the Confluent + AWS platform areas
Product owners and product managers looking to understand the capabilities of Confluent's and AWS's real-time streaming platforms
Technology leaders looking to understand the capabilities, shortcomings, and effort it takes to build a real-time data streaming pipeline
Screenshots


Download link

rapidgator.net:
Quote
https://rapidgator.net/file/1b37c8b9ef9ef99400a507af892a6b23/spsee.Creating.Data.Pipelines.Using.Aws..Confluent.Kafka.Platform.rar.html

nitroflare.com:
Quote
https://nitroflare.com/view/D2DDBA4E1EB3406/spsee.Creating.Data.Pipelines.Using.Aws..Confluent.Kafka.Platform.rar