[su_tabs]
[su_tab title="Description"]
Hadoop is an open-source software project that enables distributed processing of large datasets across clusters of commodity servers. It is designed to scale from a single server to thousands of machines, with a very high degree of fault tolerance.
[/su_tab]
[su_tab title="Program Structure"]
Through this course, learners will gain the knowledge needed to architect a project using Hadoop and its ecosystem components. You will learn how to develop MapReduce programs that handle enormous amounts of data. The course also covers the core concepts of Hadoop, with a detailed treatment of topics such as MapReduce, Hive and Pig.
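To give a flavour of the hands-on MapReduce work, below is a minimal sketch of the classic word-count job written against Hadoop's standard Java MapReduce API; the class name and the input/output paths passed on the command line are illustrative and not part of the course material.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every word in each input line.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sums the counts emitted for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory (must not already exist)
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```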
Key Takeaways:
- Learn how to store, manage, retrieve and analyze Big Data on clusters of servers using the Hadoop ecosystem
- Learn to analyze large amounts of data to bring out insights
- Relevant examples and cases for better understanding
- Hands-on training and work on a project at the end of the course
Duration:
- 1 month
- 60 Hours of live online classes
Important Date:
Course Starts: 31st August, 2014
Class Timings:
11:00 AM to 2:00 PM
Classes on:
Monday to Friday
Fees: INR 12,000
Hardware and Software Requirements:
- 64-bit or 64-bit-ready PC/laptop (Intel Core 2 Duo or above)
- 8 GB RAM (4 GB minimum)
- 80 GB HDD (minimum)
[/su_tab]
[su_tab title="Eligibility"]
- Knowledge of programming in C++, Java, or another object-oriented programming language is preferred.
[/su_tab]
[su_tab title="Tools"]
Hadoop and Hadoop ecosystem technologies such as:
- MapReduce
- Hive
- Pig
- SaaS
- PaaS
- IaaS
- HBase
- Zookeeper
- Sqoop
- Oozie
- Flume
[/su_tab]
[su_tab title="Faculty"]
- Charan Hassi, Jaypee Business Academy, New Delhi
[/su_tab]
[su_tab title="Contact"]
[/su_tab]
[/su_tabs]