Hadoop Administrator

115k – 145k AngelList Est.

Position Summary:
- We are looking for an experienced Hadoop Administrator with more than 4 years of strong skills in the Cloudera or Hortonworks Hadoop ecosystem.
- The person needs a strong passion and the competence to solve technical challenges and to deploy and administer Hadoop clusters around the world for Rakuten's various services.
- Our Big Data platform captures and generates terabytes of data every day. This data needs to be streamed, processed, curated, and accessed by hundreds of services and business users.

Responsibilities:
- Administration
  - Monitoring, tuning, and improving large Hadoop clusters to keep them healthy.
  - Verifying and enabling new Hadoop-related components.
  - Regular operations such as adding servers and replacing failed disks.
  - Troubleshooting all problems on the Hadoop cluster side.
- User support
  - Working together with Hadoop users to solve their problems.
  - Helping and training Hadoop users to use our Hadoop ecosystem.
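Day-to-day monitoring of the kind listed above is typically scripted. A minimal sketch, assuming the default text layout of `hdfs dfsadmin -report` (a saved sample is parsed here so the logic can be shown without a live cluster; the 85% threshold is an illustrative choice, not a Rakuten policy):

```shell
# Parse a saved `hdfs dfsadmin -report` sample and flag basic problems.
report=$(mktemp)
cat > "$report" <<'EOF'
Configured Capacity: 1099511627776 (1 TB)
DFS Used%: 72.50%
Live datanodes (3):
Dead datanodes (1):
EOF

# Extract the dead-datanode count and the DFS usage percentage.
dead=$(sed -n 's/^Dead datanodes (\([0-9]*\)):.*/\1/p' "$report")
used=$(sed -n 's/^DFS Used%: \([0-9.]*\)%.*/\1/p' "$report")

# Flag the cluster for attention if any datanode is dead or usage is high.
if [ "$dead" -gt 0 ] || [ "${used%.*}" -gt 85 ]; then
  echo "ATTENTION: dead=$dead used=${used}%"
else
  echo "OK: dead=$dead used=${used}%"
fi
```

In practice such a check would run against live `hdfs dfsadmin -report` output on a cron schedule or from a monitoring agent.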

Minimum Requirements:
- At least 4 years of operating experience as a Hadoop Administrator in the Cloudera or Hortonworks Hadoop ecosystem.
- Operating experience managing HBase as an administrator, including fixing HBase-specific problems.
- Deep understanding and strong conceptual knowledge of Hadoop architecture components.
- Strong hands-on experience with and knowledge of Hadoop core components such as HDFS, YARN, Hive, etc.
- Hands-on experience and knowledge of Linux and hardware.
- Strong hands-on programming experience in shell scripting.
- Hands-on programming experience in Java.
- Basic experience and knowledge of an automation tool such as Chef, Puppet, or Ansible.
- A strong analytical mind to help solve complicated problems.
- The desire to resolve issues and dive into potential problems.
- A self-starter who works with minimal supervision, with the ability to work in a team of diverse skill sets.
- The ability to comprehend customer requests and provide the correct solution.
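The shell-scripting requirement above usually translates into small automation helpers around Hadoop's command-line tools. A hedged sketch: the helper below counts under-replicated blocks in saved `hdfs fsck /` text output (the helper name and the exact fsck message wording are illustrative assumptions, not Rakuten tooling):

```shell
# Hypothetical helper: count under-replicated blocks in fsck text output.
count_under_replicated() {
  grep -c 'Under replicated' "$1"
}

# Saved sample standing in for real `hdfs fsck /` output.
fsck_log=$(mktemp)
cat > "$fsck_log" <<'EOF'
/data/a.txt:  Under replicated BP-1:blk_1001. Target Replicas is 3 but found 2 live replica(s).
/data/b.txt:  Under replicated BP-1:blk_1002. Target Replicas is 3 but found 1 live replica(s).
Status: HEALTHY
EOF

count_under_replicated "$fsck_log"   # prints 2
```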

Preferred Qualifications:
- Contributions to the Hadoop open-source community (experience as a Contributor or Committer in the Hadoop community).
- Experience with tools such as Hive, Spark, Sqoop, Impala, Kafka, Flume, Oozie, MapReduce, etc.
- Experience in the end-to-end design and build of near-real-time and batch data pipelines.
- Strong hands-on programming experience in Java, Scala, or Python.
- Experience and knowledge of server-provisioning automation tools such as Chef (Ruby), Puppet, or Ansible.
- Strong hands-on experience with and knowledge of Linux and hardware.
