Topic: Exploring the Apache Beam SDK for Modeling Streaming Data for Processing  (Read 117 times)

Online mitsumi

Exploring the Apache Beam SDK for Modeling Streaming Data for Processing
.MP4, AVC, 1280x720, 30 fps | English, AAC, 2 Ch | 3h 28m | 613 MB
Instructor: Janani Ravi

Apache Beam is an open-source, unified model for processing batch and streaming data in parallel. Originally built for Google's Cloud Dataflow backend, Beam pipelines can now be executed on any supported distributed processing backend.

Apache Beam SDKs can represent and process both finite and infinite datasets using the same programming model. All data processing tasks are defined using a Beam pipeline and are represented as directed acyclic graphs. These pipelines can then be executed on multiple execution backends such as Google Cloud Dataflow, Apache Flink, and Apache Spark.
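As a rough sketch of this model (not taken from the course), a minimal pipeline in the Beam Python SDK could look like the following; the step names and element values are illustrative, and the DirectRunner stands in for a distributed backend:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# A minimal linear pipeline, i.e. a very simple directed acyclic graph.
# Swap DirectRunner for DataflowRunner, FlinkRunner, or SparkRunner to
# execute the same graph on a distributed backend.
options = PipelineOptions(runner='DirectRunner')

with beam.Pipeline(options=options) as pipeline:
    (pipeline
     | 'CreateNumbers' >> beam.Create([1, 2, 3, 4, 5])  # bounded source for the sketch
     | 'Square' >> beam.Map(lambda n: n * n)            # element-wise transform
     | 'Print' >> beam.Map(print))                      # replace with a real sink in practice
```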

In this course, Exploring the Apache Beam SDK for Modeling Streaming Data for Processing, we will explore Beam APIs for defining pipelines, executing transforms, and performing windowing and join operations.

First, you will understand and work with the basic components of a Beam pipeline: PCollections and PTransforms. You will work with PCollections holding different kinds of elements and see how you can specify a schema for PCollection elements. You will then configure these pipelines using custom options and execute them on backends such as Apache Flink and Apache Spark.
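Below is a sketch of custom pipeline options and a schema-aware PCollection in the Python SDK, assuming a NamedTuple row type is used as the schema; the Purchase type and the --environment flag are hypothetical:

```python
import typing
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical row type acting as the schema for PCollection elements.
class Purchase(typing.NamedTuple):
    item: str
    amount: float

beam.coders.registry.register_coder(Purchase, beam.coders.RowCoder)

# Hypothetical custom pipeline option; the flag name is illustrative.
class DemoOptions(PipelineOptions):
    @classmethod
    def _add_argparse_args(cls, parser):
        parser.add_argument('--environment', default='dev')

# Use FlinkRunner or SparkRunner here to target those backends instead.
options = DemoOptions(runner='DirectRunner')

with beam.Pipeline(options=options) as pipeline:
    (pipeline
     | beam.Create([Purchase('book', 12.50), Purchase('pen', 1.20)])
         .with_output_types(Purchase)
     | beam.Map(print))
```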

Next, you will explore the different kinds of core transforms that you can apply to streaming data. These include ParDo and DoFns, GroupByKey, CoGroupByKey for join operations, and the Flatten and Partition transforms.
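The snippet below is a rough sketch of those core transforms in the Python SDK; the input data and the partitioning rule are made up:

```python
import apache_beam as beam

# ParDo applies a DoFn to every element; here it emits (word, 1) pairs.
class SplitWordsFn(beam.DoFn):
    def process(self, element):
        for word in element.split():
            yield (word, 1)

with beam.Pipeline() as pipeline:
    lines = pipeline | 'Lines' >> beam.Create(['to be or not to be'])

    # ParDo + GroupByKey: group the pairs by word and sum the counts.
    counts = (lines
              | beam.ParDo(SplitWordsFn())
              | beam.GroupByKey()
              | beam.Map(lambda kv: (kv[0], sum(kv[1]))))

    # CoGroupByKey: relational-style join of two keyed PCollections.
    emails = pipeline | 'Emails' >> beam.Create([('alice', 'alice@example.com')])
    phones = pipeline | 'Phones' >> beam.Create([('alice', '555-0100')])
    joined = {'emails': emails, 'phones': phones} | beam.CoGroupByKey()

    # Flatten merges PCollections of the same type; Partition splits one apart.
    extra = pipeline | 'Extra' >> beam.Create([('beam', 1)])
    merged = (counts, extra) | beam.Flatten()
    short_words, long_words = counts | beam.Partition(
        lambda kv, num_partitions: 0 if len(kv[0]) <= 3 else 1, 2)
```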

You will then see how you can perform windowing operations on input streams and apply fixed windows, sliding windows, session windows, and global windows to your streaming data. You will use the join extension library to perform inner and outer joins on datasets.
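Here is an informal sketch of windowing in the Python SDK, with made-up event times in seconds; fixed windows are shown, and the sliding, session, and global variants are noted in the comments. The joins themselves are omitted, since the join extension library builds on the CoGroupByKey pattern sketched above.

```python
import apache_beam as beam
from apache_beam import window
from apache_beam.transforms.window import TimestampedValue

# Sketch: assign event-time timestamps, window the stream, then count per key.
with beam.Pipeline() as pipeline:
    (pipeline
     | 'Events' >> beam.Create([('user1', 1, 5), ('user1', 1, 50), ('user2', 1, 70)])
     | 'AddTimestamps' >> beam.Map(lambda e: TimestampedValue((e[0], e[1]), e[2]))
     | 'Window' >> beam.WindowInto(window.FixedWindows(60))  # 60-second fixed windows
     # Alternatives: window.SlidingWindows(60, 30), window.Sessions(600),
     # window.GlobalWindows()
     | 'CountPerKey' >> beam.CombinePerKey(sum)
     | 'Print' >> beam.Map(print))
```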

Finally, you will configure the metrics you want tracked during pipeline execution, including counter, distribution, and gauge metrics, and then round off the course by executing SQL queries on input data.
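A sketch of the three metric types in the Python SDK follows; the namespace, metric names, and input data are illustrative. Beam SQL is not shown here because, in the Python SDK, SqlTransform typically relies on a cross-language expansion service.

```python
import apache_beam as beam
from apache_beam.metrics import Metrics
from apache_beam.metrics.metric import MetricsFilter

# Declare one metric of each kind inside a DoFn; names are illustrative.
class TrackLengthsFn(beam.DoFn):
    def __init__(self):
        self.seen = Metrics.counter('demo', 'elements_seen')             # counter
        self.lengths = Metrics.distribution('demo', 'element_length')    # distribution
        self.last_length = Metrics.gauge('demo', 'last_element_length')  # gauge

    def process(self, element):
        self.seen.inc()
        self.lengths.update(len(element))
        self.last_length.set(len(element))
        yield element

pipeline = beam.Pipeline()
_ = pipeline | beam.Create(['alpha', 'beta', 'gamma']) | beam.ParDo(TrackLengthsFn())

# Query the tracked metrics from the pipeline result after execution.
result = pipeline.run()
result.wait_until_finish()
for counter in result.metrics().query(
        MetricsFilter().with_name('elements_seen'))['counters']:
    print(counter.key.metric.name, counter.committed)
```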

When you are finished with this course, you will have the skills and knowledge to perform a wide range of data processing tasks using core Beam transforms, and you will be able to track metrics and run SQL queries on input streams.

Download link:
Only visible to registered members who have replied to the topic.

Links are Interchangeable - No Password - Single Extraction