Hadoop is an open-source software project that enables distributed processing of large datasets across clusters of commodity servers. It is designed to scale from a single server to thousands of machines, with a very high degree of fault tolerance.
Through this course, learners will gain the knowledge to architect a project using Hadoop and its ecosystem components. You will learn how to develop MapReduce programs to handle enormous amounts of data. The course also covers the core concepts of Hadoop, with a detailed analysis of topics such as MapReduce, Hive, and Pig.
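To give a flavour of the MapReduce model covered in the course, here is a minimal sketch of the classic "word count" job, simulated in plain Python with no Hadoop cluster required. The function names `map_phase`, `shuffle`, and `reduce_phase` are illustrative stand-ins for the framework's stages, not part of the Hadoop API.

```python
from collections import defaultdict

def map_phase(lines):
    """Mapper: emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Shuffle/sort: group values by key, as the framework would between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reducer: sum the counts emitted for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

# Tiny illustrative input; on a real cluster this would be HDFS blocks.
lines = ["big data is big", "hadoop processes big data"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"])   # 3
print(counts["data"])  # 2
```

In a real Hadoop job the mapper and reducer run in parallel across many machines, and the framework handles the shuffle, sorting, and fault recovery automatically.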
- Learn how to store, manage, retrieve and analyze Big Data on clusters of servers using the Hadoop ecosystem
- Learn to analyze large amounts of data to bring out insights
- Relevant examples and cases for better understanding
- Hands-on training and a project at the end of the course
- 1 month
- 60 Hours of live online classes
Course Starts: 31st August, 2014
11:00 AM to 2:00 PM
Monday to Friday
Fees: INR 12,000
Hardware and Software Requirements:
- 64-bit or 64-bit-ready PC/laptop (Intel Core 2 Duo or above)
- 8 GB RAM (4 GB Minimum)
- 80 GB HDD (Minimum)
- Knowledge of programming in C++, Java, or another object-oriented programming language is preferred.
Hadoop & Hadoop technologies like:
- Charan Hassi, Jaypee Business Academy, New Delhi