Flume and Sqoop for Ingesting Big Data

Import data to HDFS, HBase and Hive from a variety of sources, including Twitter and MySQL

Course Description
Flume and Sqoop play a special role in the Hadoop ecosystem: they transport data from sources that hold or produce it, such as local file systems, HTTP endpoints, MySQL and Twitter, to data stores like HDFS, HBase and Hive. Both tools come with built-in functionality that shields users from the complexity of moving data between these systems.
Flume: Flume agents transport data produced by streaming applications to data stores like HDFS and HBase (a minimal agent configuration is sketched below).
Sqoop: Use Sqoop to bulk-import data from a traditional RDBMS into Hadoop storage architectures like HDFS or Hive.
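
For a feel of what a Flume agent looks like, here is a minimal configuration sketch of the kind covered in the course. The agent and component names (agent1, src1, ch1, sink1), the tailed log file and the HDFS directory are illustrative placeholders, not values from the course material:

    # Illustrative component names for one agent
    agent1.sources  = src1
    agent1.channels = ch1
    agent1.sinks    = sink1

    # Source: stream lines appended to a local log file (path is hypothetical)
    agent1.sources.src1.type = exec
    agent1.sources.src1.command = tail -F /var/log/app.log
    agent1.sources.src1.channels = ch1

    # Channel: buffer events in memory between source and sink
    agent1.channels.ch1.type = memory
    agent1.channels.ch1.capacity = 1000

    # Sink: write the events into HDFS (directory is hypothetical)
    agent1.sinks.sink1.type = hdfs
    agent1.sinks.sink1.hdfs.path = /flume/events
    agent1.sinks.sink1.hdfs.fileType = DataStream
    agent1.sinks.sink1.channel = ch1

An agent like this would typically be started with: flume-ng agent --conf conf --conf-file agent1.conf --name agent1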

Learning Outcomes

Use Flume to ingest data to HDFS and HBase
Use Sqoop to import data from MySQL to HDFS and Hive (a sample import command is sketched after this list)
Ingest data from a variety of sources, including HTTP, Twitter and MySQL
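
As a rough illustration of the Sqoop side, a bulk import from MySQL into HDFS usually looks like the command below; the connection string, credentials, table name and target directory are placeholders rather than values used in the course:

    sqoop import \
      --connect jdbc:mysql://dbhost/salesdb \
      --username dbuser -P \
      --table customers \
      --target-dir /data/customers \
      --num-mappers 4

Adding --hive-import (optionally with --hive-table) tells Sqoop to load the data into a Hive table instead of a plain HDFS directory.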

Prerequisites

Knowledge of HDFS is a prerequisite for the course
The HBase and Hive examples assume a basic understanding of the HBase and Hive shells
Most of the examples run against HDFS, so you'll need a working HDFS installation

Who is this course intended for?

Engineers building an application with HDFS/HBase/Hive as the data store

Engineers who want to port data from legacy data stores to HDFS


Frequently Asked Questions


When does the course start and finish?
The course starts now and never ends! It is a completely self-paced online course - you decide when you start and when you finish.
How long do I have access to the course?
How does lifetime access sound? After enrolling, you have unlimited access to this course for as long as you like - across any and all devices you own.
What if I am unhappy with the course?
We would never want you to be unhappy! If you are unsatisfied with your purchase, contact us in the first 30 days and we will give you a full refund.

Get started now!