Hadoop Architect – DataMetica – Pune (10+ years of experience)

Designation – Hadoop Architect

Location – Pune

About employer: DataMetica

Job description:

We are looking for a key addition to our engineering team to build scalable distributed data solutions using Hadoop. The Hadoop Architect will drive solutions and product development that are relevant to the industry today.

Responsibilities

  • Hands-on technical role; contribute to all phases of the software development lifecycle, including analysis, architecture, design, implementation, and QA
  • Collaborate on requirements; work with the Engineering, Product Management, and Client Success teams to define features that improve results for our customers and ourselves alike
  • Partner with our Data Mining and Analytics team to do analysis, build predictive models & optimization algorithms, and run experiments
  • Work closely with Operations/IT to assist with requirements and design of Hadoop clusters to handle very large scale; help with troubleshooting operations issues

Qualification and Skills Required

The Hadoop Architect should have a solid background in the fundamentals of computer science, distributed computing, and large-scale data processing, as well as mastery of database design and data warehousing. The person should have a high degree of self-motivation, an unwavering commitment to excellence, an excellent work ethic, a positive attitude, and be fun to work with.

  • Expertise in building massively scalable distributed data processing solutions with Hadoop, Hive, and Pig
  • Proficiency with Big Data processing technologies (Hadoop, HBase, Flume, Oozie)
  • Deep experience with distributed systems, large-scale non-relational data stores, MapReduce systems, data modeling, database performance, and multi-terabyte data warehouses
  • Experience in data analytics, data mining, and predictive modeling
  • Experience building data pipelines and analysis tools using Java and Python
  • 10+ years of hands-on Java experience building scalable solutions
  • Experience building large-scale server-side systems with distributed processing algorithms
  • Aptitude to independently learn new technologies
  • Strong problem-solving skills
  • Experience designing or implementing systems that work with external vendors' interfaces
  • Ability to communicate with internal teams

Interested candidates can apply for this job by mailing their CV to jobs@analyticsvidhya.com with the subject "Hadoop Architect – DataMetica – Pune".

If you want to stay updated on the latest analytics jobs, follow our job postings on Twitter or like our Careers in Analytics page on Facebook.
