Apache Flume Tutorial
Description
Flume is a standard, simple, robust, flexible, and extensible tool for ingesting data from various data producers (such as webservers) into Hadoop. In this tutorial, we use simple and illustrative examples to explain the basics of Apache Flume and how to use it in practice.
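To give a flavor of what such ingestion looks like, the sketch below shows a minimal Flume agent configuration that listens on a network port and logs incoming events. The agent name `a1`, the component names `r1`/`c1`/`k1`, and the port number are illustrative choices, not fixed by Flume itself.

```properties
# Hypothetical agent 'a1': name its source, channel, and sink
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Source: accept lines of text over TCP (port is an arbitrary example)
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Channel: buffer events in memory between source and sink
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000

# Sink: write events to the agent's log (useful for testing)
a1.sinks.k1.type = logger

# Wire the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

In a real pipeline the `logger` sink would typically be replaced by an `hdfs` or HBase sink; the source-channel-sink wiring pattern stays the same.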
This tutorial is meant for professionals who would like to learn how to transfer log and streaming data from various webservers to HDFS or HBase using Apache Flume.
To make the most of this tutorial, you should have a good understanding of the basics of Hadoop and HDFS commands.