Job Description and Responsibilities:
• At least 3 years of experience architecting Big Data solutions at enterprise scale, with at least one end-to-end implementation
• Strong understanding of and hands-on experience with the Hadoop ecosystem, including HDFS, MapReduce, YARN, Spark, Scala, Hive, HBase, Phoenix, ZooKeeper, Pig, Hadoop Streaming, and Sqoop
• Knowledge of Hadoop Security, Data Management and Governance
• Ability to articulate the pros and cons of "to-be" design/architecture decisions across a wide spectrum of factors
• Work closely with the Operations team to size, scale, and tune existing and new architectures
• Experience working on core Big Data development projects; should be able to perform hands-on development, particularly in Spark, HBase/Cassandra, Hive, and shell scripting
• Responsible for designing, developing, testing, tuning, and building large-scale data processing systems
• Troubleshoot and develop on Hadoop technologies including HDFS, Hive, HBase, Phoenix, Spark, Scala, MapReduce, and Hadoop ETL development via tools such as Talend
• Strong knowledge of Hadoop security components such as Kerberos, SSL/TLS, and encryption using TDE
• Ability to engage in senior-level technology discussions.
• The ideal candidate is proactive, sees the big picture, and prioritizes the right work items to optimize overall team output
• Should have worked in Agile environments; exposure to DevOps is a plus
• Excellent oral and written communication skills
• Hadoop Developer/Architect certification is an added advantage
5 Lakhs to 9 Lakhs
3 Lakhs to 7 Lakhs