
Apache Kafka Certification Training

385+ Learners

Apache Kafka Certification Training is designed to build an understanding of real-time data streaming, Kafka architecture, Kafka clusters, Kafka producers, Kafka consumers, and Kafka monitoring. The training covers the core APIs and provides in-depth coverage of Kafka Connect, Kafka Streams, and Kafka integration with Hadoop, Storm, and Spark.

• Instructor Led Training
• Real Time Projects
• Guaranteed Job Interviews
• Flexible Schedule
• Lifetime Free Upgrade
• 24x7 Support

Apache Kafka Certification

Jul 18 (Sat, Sun) (15 Weeks) Weekend Batch Filling Fast 08:30 PM to 10:30 PM

Can't find a batch you were looking for?

Course Price at

$ 459

About Course


Apache Kafka is a distributed streaming platform capable of handling high-velocity, high-volume data. It is used to publish and subscribe to streams of records. Kafka models data as a fast, append-only log, which makes it a fit for many web-scale companies and enterprises.

StepLeaf’s Apache Kafka Training Course covers the core concepts of Kafka architecture, configuring a Kafka cluster, Kafka producers, Kafka consumers, and Kafka monitoring. Each topic is taught with real-time examples to keep students motivated.


Overview:

Apache Kafka is worth learning and investing one's time in. Kafka is widely used for operational monitoring data, so the course uses real-time examples for statistical analysis. In a nutshell, this course trains you to move data between systems in any infrastructure.

Course Objectives 

The goals of this course include:

1. Gain a deep understanding of Kafka Cluster Architecture and APIs.

2. Understand the workflow of Apache Kafka.

3. Set up Kafka and integrate it with Hadoop.

4. Integrate Kafka with real-time streaming systems like Spark and Storm.

5. Use Kafka to produce and consume messages.

6. Use each API in real time.

7. Finally, work on a capstone project implementing Kafka with Twitter, Flume, Hadoop, and Storm.

Why learn Apache Kafka?

Kafka is an open-source system, and this course will enable you to understand the following features:

1. Integration with Hadoop

2. Highly reliable messaging system

3. Amazing performance ability

4. Website activity tracking

5. Log aggregation

6. Stream Processing

7. Operational metrics


Who should go for this Course?

  • Apache Kafka is a groundbreaking technology, as it is a reliable way to move large amounts of data very quickly. It is used by many companies all around the world.
  • This course is mainly designed for IT professionals who work with Big Data and Hadoop technologies.

What are the Pre-requisites for this Course?

Basic knowledge of core Java is mandatory. StepLeaf provides a crash course to refresh Java topics.


Key Skills

Apache Kafka, Big Data, Flume, Cassandra, Cluster, ZooKeeper, Kafka Producer, Kafka Internals, Stream Processing, Talend

Free Career Counselling
+91

Course Contents

Download Syllabus

Apache Kafka Certification Training

Goal: In this module, you will understand where Kafka fits in the Big Data space, and Kafka Architecture. In addition, you will learn about Kafka Cluster, its Components, and how to Configure a Cluster

Skills: 

• Kafka Concepts

• Kafka Installation

• Configuring Kafka Cluster

Objectives: At the end of this module, you should be able to:  

• Explain what is Big Data

• Understand why Big Data Analytics is important

• Describe the need of Kafka

• Know the role of each Kafka Components

• Understand the role of ZooKeeper

• Install ZooKeeper and Kafka

• Classify different types of Kafka Clusters

• Work with Single Node-Single Broker Cluster

Topics: 

• Introduction to Big Data

• Big Data Analytics

• Need for Kafka

• What is Kafka?

• Kafka Features

• Kafka Concepts

• Kafka Architecture

• Kafka Components

• ZooKeeper

• Where is Kafka Used?

• Kafka Installation

• Kafka Cluster

• Types of Kafka Clusters

• Configuring Single Node Single Broker Cluster
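
As a companion to the cluster-configuration topics above, a minimal single-node broker needs only a handful of settings in `server.properties`. The values shown are illustrative defaults, not recommendations for production:

```properties
# server.properties — minimal single-node, single-broker setup (illustrative)
broker.id=0
listeners=PLAINTEXT://localhost:9092
log.dirs=/tmp/kafka-logs
zookeeper.connect=localhost:2181
```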


Goal: Kafka Producers send records to topics. The records are sometimes referred to as Messages. In this Module, you will work with different Kafka Producer APIs.
Skills:
• Configure Kafka Producer
• Constructing Kafka Producer
• Kafka Producer APIs
• Handling Partitions
Objectives:
At the end of this module, you should be able to:
• Construct a Kafka Producer
• Send messages to Kafka
• Send messages Synchronously & Asynchronously
• Configure Producers
• Serialize Using Apache Avro
• Create & handle Partitions
Topics:
• Configuring Single Node Multi Broker Cluster
• Constructing a Kafka Producer
• Sending a Message to Kafka
• Producing Keyed and Non-Keyed Messages
• Sending a Message Synchronously & Asynchronously
• Configuring Producers
• Serializers
• Serializing Using Apache Avro
• Partitions
Hands On:
• Working with Single Node Multi Broker Cluster
• Creating a Kafka Producer
• Configuring a Kafka Producer
• Sending a Message Synchronously & Asynchronously
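
The keyed vs. non-keyed behaviour covered in this module can be previewed without a cluster. Kafka's default partitioner sends records with the same key to the same partition (via a hash of the key), while keyless records are spread across partitions. Below is a simplified Python sketch of that idea; real Kafka hashes keys with murmur2, and `md5` here is only a stand-in for illustration:

```python
import hashlib

def choose_partition(key, num_partitions, round_robin_state=[0]):
    """Simplified stand-in for Kafka's default partitioner.
    Real Kafka uses murmur2 hashing; md5 is used here only for illustration."""
    if key is None:
        # Non-keyed messages: rotate across partitions (round-robin).
        partition = round_robin_state[0] % num_partitions
        round_robin_state[0] += 1
        return partition
    # Keyed messages: the same key always lands on the same partition,
    # which is what preserves per-key ordering in Kafka.
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Same key -> same partition, so ordering per key is preserved.
p1 = choose_partition("user-42", 6)
p2 = choose_partition("user-42", 6)
assert p1 == p2
```

This is why choosing a good key matters: all records for one key go through one partition, so a hot key can create an unbalanced load.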

Goal: Applications that need to read data from Kafka use a Kafka Consumer to subscribe to Kafka topics and receive messages from these topics. In this module, you will learn to construct Kafka Consumer, process messages from Kafka with Consumer, run Kafka Consumer and subscribe to Topics
Skills:
• Configure Kafka Consumer
• Kafka Consumer API
• Constructing Kafka Consumer
Objectives: At the end of this module, you should be able to:
• Perform Operations on Kafka
• Define Kafka Consumer and Consumer Groups
• Explain how Partition Rebalance occurs
• Describe how Partitions are assigned to Kafka Broker
• Configure Kafka Consumer
• Create a Kafka consumer and subscribe to Topics
• Describe & implement different Types of Commit
• Deserialize the received messages
Topics:
• Consumers and Consumer Groups
• Standalone Consumer
• Consumer Groups and Partition Rebalance
• Creating a Kafka Consumer
• Subscribing to Topics
• The Poll Loop
• Configuring Consumers
• Commits and Offsets
• Rebalance Listeners
• Consuming Records with Specific Offsets
• Deserializers
Hands-On:
• Creating a Kafka Consumer
• Configuring a Kafka Consumer
• Working with Offsets
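
The commit-and-offset mechanics above can be illustrated without a live cluster. The sketch below is a conceptual, in-memory stand-in (not the real Consumer API): a consumer polls records from a partition log, commits its offset, and a restarted consumer resumes from the last committed offset, re-reading anything that was polled but never committed:

```python
class InMemoryPartition:
    """Toy stand-in for one Kafka partition plus its committed offset."""
    def __init__(self, records):
        self.records = list(records)
        self.committed = 0  # offset of the next record to hand out after a restart

class ToyConsumer:
    def __init__(self, partition):
        self.partition = partition
        self.position = partition.committed  # resume from the last commit

    def poll(self, max_records=2):
        batch = self.partition.records[self.position:self.position + max_records]
        self.position += len(batch)
        return batch

    def commit_sync(self):
        # Like commitSync(): persist the position so a restart resumes here.
        self.partition.committed = self.position

partition = InMemoryPartition(["m0", "m1", "m2", "m3"])
c1 = ToyConsumer(partition)
c1.poll()          # reads m0, m1
c1.commit_sync()   # committed offset is now 2
c1.poll()          # reads m2, m3 but "crashes" before committing

c2 = ToyConsumer(partition)        # restart after the crash
assert c2.poll() == ["m2", "m3"]   # uncommitted records are re-delivered
```

The re-delivery at the end is exactly the at-least-once behaviour discussed under Commits and Offsets: anything consumed after the last commit is processed again after a rebalance or restart.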

Goal: Apache Kafka provides a unified, high-throughput, low-latency platform for handling real-time data feeds. Learn more about tuning Kafka to meet your high-performance needs.
Skills:
• Kafka APIs
• Kafka Storage
• Configure Broker
Objectives:
At the end of this module, you should be able to:
• Understand Kafka Internals
• Explain how Replication works in Kafka
• Differentiate between In-sync and Out-of-sync Replicas
• Understand the Partition Allocation
• Classify and Describe Requests in Kafka
• Configure Broker, Producer, and Consumer for a Reliable System
• Validate System Reliabilities
• Configure Kafka for Performance Tuning
Topics:
• Cluster Membership
• The Controller
• Replication
• Request Processing
• Physical Storage
• Reliability
• Broker Configuration
• Using Producers in a Reliable System
• Using Consumers in a Reliable System
• Validating System Reliability
• Performance Tuning in Kafka
Hands On:
• Create topic with partition & replication factor 3 and execute it on multi-broker cluster
• Show fault tolerance by shutting down 1 Broker and serving its partition from another broker
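
The reliability settings this module covers can be summarised as a configuration sketch. The keys below are real Kafka settings; the values are common choices for a reliable system, not universal recommendations:

```python
# Producer settings for a reliable system (common choices, not mandates).
reliable_producer = {
    "acks": "all",              # wait for all in-sync replicas to acknowledge
    "retries": 2147483647,      # retry transient failures indefinitely
    "enable.idempotence": True, # avoid duplicate writes on retry
}

# Topic/broker settings that pair with acks=all.
reliable_topic = {
    "replication.factor": 3,                  # survive the loss of a broker
    "min.insync.replicas": 2,                 # writes need >= 2 live replicas
    "unclean.leader.election.enable": False,  # never elect an out-of-sync leader
}

# With replication.factor=3 and min.insync.replicas=2, one broker can be
# shut down (as in the hands-on above) and the partition stays writable.
assert reliable_topic["replication.factor"] > reliable_topic["min.insync.replicas"]
```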

Goal: A Kafka Cluster typically consists of multiple brokers to maintain load balance. ZooKeeper is used for managing and coordinating Kafka brokers. Learn about Kafka Multi-Cluster Architectures, Kafka Brokers, Topic, Partitions, Consumer Group, Mirroring, and ZooKeeper Coordination in this module.
Skills:
• Administer Kafka
Objectives:
At the end of this module, you should be able to
• Understand Use Cases of Cross-Cluster Mirroring
• Learn Multi-cluster Architectures
• Explain Apache Kafka’s MirrorMaker
• Perform Topic Operations
• Understand Consumer Groups
• Describe Dynamic Configuration Changes
• Learn Partition Management
• Understand Consuming and Producing
• Explain Unsafe Operations
Topics:
• Use Cases - Cross-Cluster Mirroring
• Multi-Cluster Architectures
• Apache Kafka’s MirrorMaker
• Other Cross-Cluster Mirroring Solutions
• Topic Operations
• Consumer Groups
• Dynamic Configuration Changes
• Partition Management
• Consuming and Producing
• Unsafe Operations
Hands on:
• Topic Operations
• Consumer Group Operations
• Partition Operations
• Consumer and Producer Operations

Goal: Learn about the Kafka Connect API and Kafka Monitoring. Kafka Connect is a scalable tool for reliably streaming data between Apache Kafka and other systems.
Skills:
• Kafka Connect
• Metrics Concepts
• Monitoring Kafka
Objectives: At the end of this module, you should be able to:
• Explain the Metrics of Kafka Monitoring
• Understand Kafka Connect
• Build Data pipelines using Kafka Connect
• Understand when to use Kafka Connect vs Producer/Consumer API
• Perform File source and sink using Kafka Connect
Topics:
• Considerations When Building Data Pipelines
• Metric Basics
• Kafka Broker Metrics
• Client Monitoring
• Lag Monitoring
• End-to-End Monitoring
• Kafka Connect
• When to Use Kafka Connect?
• Kafka Connect Properties
Hands on:
• Kafka Connect
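
As an example of the Connect properties discussed above, the stock FileStreamSource connector that ships with Kafka can tail a file into a topic with a few lines of standalone-mode configuration (the file path and topic name below are placeholders):

```properties
# connect-file-source.properties — stream lines of a file into a topic
name=local-file-source
connector.class=FileStreamSource
tasks.max=1
file=/tmp/input.txt
topic=connect-test
```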

Goal: Learn about the Kafka Streams API in this module. Kafka Streams is a client library for building mission-critical real-time applications and microservices, where the input and/or output data is stored in Kafka Clusters.
Skills:
• Stream Processing using Kafka
Objectives:
At the end of this module, you should be able to:
• Describe What is Stream Processing
• Learn Different types of Programming Paradigm
• Describe Stream Processing Design Patterns
• Explain Kafka Streams & Kafka Streams API
Topics:
• Stream Processing
• Stream-Processing Concepts
• Stream-Processing Design Patterns
• Kafka Streams by Example
• Kafka Streams: Architecture Overview
Hands on:
• Kafka Streams
• Word Count Stream Processing
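
The word-count hands-on is the canonical Kafka Streams example. Its core logic, splitting each record into words and keeping a running count per word, can be previewed in plain Python before writing the actual Streams topology:

```python
from collections import Counter

def word_count(stream_of_lines):
    """Running word counts — the heart of the Kafka Streams word-count demo.
    In Kafka Streams this maps to flatMapValues(split) + groupBy + count()."""
    counts = Counter()
    for line in stream_of_lines:          # each line plays the role of a record
        for word in line.lower().split():
            counts[word] += 1
    return counts

counts = word_count(["hello kafka", "hello streams"])
assert counts["hello"] == 2
assert counts["kafka"] == 1
```

In the real Streams application the counts live in a state store and are emitted to an output topic as a changelog, but the per-word aggregation is the same.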

Goal: In this module, you will learn about Apache Hadoop, Hadoop Architecture, Apache Storm, Storm Configuration, and Spark Ecosystem. In addition, you will configure Spark Cluster, Integrate Kafka with Hadoop, Storm, and Spark.
Skills:
• Kafka Integration with Hadoop
• Kafka Integration with Storm
• Kafka Integration with Spark
Objectives:
At the end of this module, you will be able to:
• Understand What is Hadoop
• Explain Hadoop 2.x Core Components
• Integrate Kafka with Hadoop
• Understand What is Apache Storm
• Explain Storm Components
• Integrate Kafka with Storm
• Understand What is Spark
• Describe RDDs
• Explain Spark Components
• Integrate Kafka with Spark
 Topics:
• Apache Hadoop Basics
• Hadoop Configuration
• Kafka Integration with Hadoop
• Apache Storm Basics
• Configuration of Storm
• Integration of Kafka with Storm
• Apache Spark Basics
• Spark Configuration
• Kafka Integration with Spark
Hands On:
• Kafka integration with Hadoop
• Kafka integration with Storm
• Kafka integration with Spark

Goal: Learn how to integrate Kafka with Flume, Cassandra and Talend.
Skills:
• Kafka Integration with Flume
• Kafka Integration with Cassandra
• Kafka Integration with Talend
 Objectives:
At the end of this module, you should be able to,
• Understand Flume
• Explain Flume Architecture and its Components
• Setup a Flume Agent
• Integrate Kafka with Flume
• Understand Cassandra
• Learn Cassandra Database Elements
• Create a Keyspace in Cassandra
• Integrate Kafka with Cassandra
• Understand Talend
• Create Talend Jobs
• Integrate Kafka with Talend
Topics:
• Flume Basics
• Integration of Kafka with Flume
• Cassandra Basics such as KeySpace and Table Creation
• Integration of Kafka with Cassandra
• Talend Basics
• Integration of Kafka with Talend
Hands On:
• Kafka demo with Flume
• Kafka demo with Cassandra
• Kafka demo with Talend

This Project enables you to gain Hands-On experience on the concepts that you have learned as part of this Course.
You can email the solution to our Support team within 2 weeks from the Course Completion Date. StepLeaf will evaluate the solution and award a Certificate with a Performance-based Grading.
Problem Statement:
You are working for a website techreview.com that provides reviews for different technologies. The company has decided to include a new feature that will allow users to compare the popularity or trend of multiple technologies based on Twitter feeds, and they want this comparison to happen in real time. So, as a big data developer of the company, you have been tasked to implement the following:
• Near real-time streaming of data from Twitter, displaying the last minute's count of people tweeting about a particular technology.
• Store the Twitter count data in Cassandra.
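
A sketch of the "last minute's count" requirement: bucket tweet timestamps into one-minute tumbling windows and count mentions per technology. This is a plain-Python outline of the windowing logic only, not the full Kafka/Cassandra pipeline:

```python
from collections import defaultdict

def minute_counts(tweets):
    """tweets: iterable of (epoch_seconds, technology) pairs.
    Returns {(minute_start, technology): count} — a tumbling 60s window."""
    counts = defaultdict(int)
    for ts, tech in tweets:
        window = ts - (ts % 60)  # start of the minute this tweet falls into
        counts[(window, tech)] += 1
    return dict(counts)

tweets = [(120, "kafka"), (130, "kafka"), (185, "spark")]
assert minute_counts(tweets)[(120, "kafka")] == 2
assert minute_counts(tweets)[(180, "spark")] == 1
```

In the project itself, the same aggregation would run continuously over the Twitter stream, with each window's counts written to a Cassandra table keyed by (window, technology).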

Like the curriculum? Enroll Now

Structure your learning and get a certificate to prove it.

+91

Projects

What are the system requirements for this course?

  • Minimum RAM required: 4GB (Suggested: 8GB)
  • Minimum Free Disk Space: 25GB
  • Minimum Processor: i3 or above
  • 64-bit Operating System
  • Participant’s machines must support a 64-bit VirtualBox guest image.

How will I execute the practicals?

  • We will help you set up StepLeaf's Virtual Machine on your system with local access. Detailed installation guides for setting up the environment are provided in the LMS. For any doubts, the 24x7 support team will promptly assist you. The StepLeaf Virtual Machine can be installed on Mac or Windows machines.

Which case studies will be a part of the course?

Project: #1

Domain: Finance 

Problem Statement: 

Your credit card is swiped and you receive a message, but you are not the one who used the card. Detect fraudulent transactions.

Project: #2 

Domain: Retail

Problem Statement: 

Collect real-time transaction details from an e-commerce website, combine them with social media profiles, and enhance product recommendations to customers.

Project: #3 

Domain: Healthcare 

Problem Statement: 

Analyse patient records from a healthcare center and, using genomic sequencing, find the patients who are likely to face health issues in the near future.

Project: #4 

Domain: Gaming 

Problem Statement: 

Using Apache Kafka, identify patterns in real-time game events and respond by auto-adjusting game levels.


Certification

StepLeaf’s Apache Kafka Professional Certificate holders work at thousands of MNCs all over the world.

FAQ

Online learning is a mixture of live tutoring and recorded videos. It lets you complete the course on your own time and gives students much flexibility. In short, it fits your needs.


StepLeaf uses a blended learning technique which combines auditory, visual, hands-on, and other techniques at the same time. We assess both students and instructors to make sure that no one falls short of the course goal.


The fee of each training course varies according to the curriculum and the duration preferred by the student. For further information please look into the link of the preferred course.  

Yes, we offer crash courses. You get an overview of the whole course and can complete it within a short period of time.

Currently we don't offer a demo class, as the number of students who attend the live sessions is limited. You can watch the recorded class video on each course description page to get an insight into the class and the quality of our instructors.

StepLeaf has a study repository where you can find the recorded video of each class and all other essential resources for the course. 

Each student who joins StepLeaf is allocated a learning manager, whom you can contact anytime to clarify your queries.

Yes, we have a centralized study repository, where students can jump in and explore the latest materials on the latest technologies.

Assessment is a continuous process at StepLeaf, where each student's goals and learning outcomes are clearly defined. We conduct weekly mock tests so that students can find their shortfalls and improve before the final certification exam.

StepLeaf offers a discussion board where students can react to content, share challenges, teach each other, and experiment with their new skills.

You can pay your course fee online quickly through the secure Razorpay gateway. You will be able to track the payment details along the way.
