Location: Pune, India

About the role

You will have the following responsibilities:

  • Size, provision, and operate big data clusters in traditional bare-metal and cloud hosting environments
  • Develop automation scripts to deploy and secure the cluster
  • Perform baseline performance benchmarking and continuously tune the cluster to meet performance KPIs
  • Ensure cluster security at all levels (Kerberos, Ranger, HDFS ACLs, network security)
  • Deploy Hadoop ecosystem projects on the cluster following industry-standard best practices
  • Provide Level 3 support for any issues related to the big data cluster and its performance
  • Upgrade cluster configuration (additional nodes, storage, memory, software distributions)

Requirements

You should have:

  • Strong background in Linux/Unix Administration
  • Experience with automation/configuration management
  • Ability to use a wide variety of open source technologies and cloud services (experience with AWS is a plus)
  • Some experience with big data stacks, including Hadoop and Spark
  • Working experience in a multicultural, cross-border business environment
  • The urge to solve problems.
  • The desire to learn.
  • Strong computer science fundamentals.
  • Very good problem-solving skills and the ability to look at a problem from different perspectives

Apply

If this excites you, please fill in your information and upload your resume via the link below. We will get in touch with you for further discussion.

Apply for this position