Master Big Data Ingestion and Analytics with Flume, Sqoop, Hive and Spark
MP4 | Video: AVC 1280x720 | Audio: AAC 44KHz 2ch | Duration: 5 Hours 40M | 1.29 GB
Genre: eLearning | Language: English
In this course, you will start by learning about the Hadoop Distributed File System (HDFS) and the most common Hadoop commands required to work with HDFS. Then, you'll be introduced to Sqoop Import, through which you will gain knowledge of the lifecycle of the Sqoop command and how to use the import command to migrate data from MySQL to HDFS, and from MySQL to Hive, and much more.
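As a taste of what the Sqoop Import section covers, the sketch below shows a typical import of a MySQL table into HDFS. The hostname, database, table name, credentials, and paths are all placeholder assumptions for illustration, not values from the course, and the command needs a running Hadoop cluster and MySQL server to execute.

```shell
# Hypothetical example: import the "customers" table from MySQL into HDFS.
# All connection details and paths below are placeholders.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/retail_db \
  --username sqoop_user \
  --password-file /user/hadoop/.sqoop.pwd \
  --table customers \
  --target-dir /user/hadoop/customers \
  --num-mappers 4
```

Sqoop splits the table across the mappers (here 4) and writes one HDFS file per mapper under the target directory.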
In addition, you will learn about Sqoop Export to migrate data effectively, and about Apache Flume to ingest data. The Apache Hive section introduces Hive, covering external and managed tables, working with different file formats such as Parquet and Avro, and more. In the final sections, you will learn about Spark DataFrames, Spark SQL, and much more.
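To illustrate how the Hive and Spark SQL topics connect, here is a hedged sketch: defining a Hive external table over data already sitting in HDFS, then querying it from the Spark SQL command line. The table name, schema, and HDFS location are assumptions for illustration, and both commands require a configured Hive/Spark installation.

```shell
# Hypothetical sketch: an external table keeps the data files in place in
# HDFS; dropping the table does not delete them (unlike a managed table).
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS customers (
  id INT,
  name STRING,
  city STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/hadoop/customers';
"

# The same table is visible to Spark SQL via the shared Hive metastore.
spark-sql -e "SELECT city, COUNT(*) FROM customers GROUP BY city"
```

The design point the course builds on is that Hive and Spark can share one metastore, so tables defined in Hive are immediately queryable from Spark SQL and from Spark DataFrames.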
All the code and supporting files are available at: