Designation – Technical Lead – Engineering
Location – Hyderabad
About employer – Cafyne
Job description:
Responsibilities
- This role will be responsible for defining, designing, and developing Big Data applications using Java and Hadoop or similar Big Data technologies, especially Hadoop MapReduce, leveraging frameworks such as Cascading and cloud services from Rackspace, Amazon EC2, or similar environments
- Design and develop new features and enhancements on our Big Data Analytics platform
- Collaborate with business and technical stakeholders to clearly understand business objectives and product requirements, and translate requirements into technical solutions
- Drive technical due diligence of technologies, third-party applications, and vendors, and make recommendations
- Participate with business and engineering team members in the development of new software projects
- Develop application specifications and designs which are scalable, extensible, maintainable and testable
- Build systems, libraries, and frameworks within, around, and on top of Hadoop or similar
- Utilize frameworks and extensions to Hadoop such as Cascading
- Design and implement Map/Reduce jobs to support distributed data processing
- Process large data sets utilizing Hadoop clusters or similar
- Implement and test by authoring automated unit and black-box tests
- Work in small teams where each team member has a lot of ownership and each individual can make a big impact
- Provide responsive user training and support as required
- Provide technical guidance and mentoring to other staff members if needed
Qualification and Skills Required
- High proficiency in Java and J2EE technologies. Spring 2 / JPA experience is highly preferred.
- Experience with Hadoop and MapReduce technologies (e.g. Avro, Oozie, Pig, Hive, HBase, HDFS)
- Strong database, SQL, ETL, NoSQL, and data analysis skills
- Preferably, experience in Hadoop development and other Big Data technologies such as Storm, Kafka, Lucene/Solr, Membase, Dremel, BigQuery, R, and MongoDB, and experience with automated testing methods, including unit testing
- Strong XML processing experience (XSD, XPath, XSL/XSLT, etc.) and experience with Web Services / SOAP / REST / XML-RPC
- Solid understanding of Big Data concepts, especially social media data, and firsthand experience with data warehousing concepts/methodologies
- Must have practical experience with design patterns, especially MVC-type frameworks
- Strong interest in algorithms and excellent analytical capabilities
- Ability to communicate design rationale and build consensus
- Comfortable working in a Linux development environment
- Experience in one or more of the following is a plus: SOA, BI, mobile platforms, NLP, data analytics algorithms, Scrum development process
- Work effectively individually as well as collaboratively
- Self-starter, fast learner, and hard worker
- Readiness to work and collaborate with remote teams
- Great verbal and written communication skills
- Passion for Development
- B.Tech/M.Tech/MS in Computer Science or equivalent.
- 7+ years of overall software development experience.
- Certification in Big Data technologies is a strong plus.
- Mandatory skills: very good experience in the design and development of web applications using Java, J2EE, SOAP, REST, NoSQL, and Big Data technologies.
- Preferred skills: Hadoop and MapReduce technologies, HDFS, HBase, Pig, Hive, MongoDB, Nutch, Flume, Storm, Kafka, Lucene/Solr, Membase, Dremel, BigQuery, R.
- Other advantageous skills: social media, Facebook API, Twitter API, an analytics and visualization (charts, graphs) background, product development background.
Interested candidates can apply for this job at this page.