DevOps: Data Production Engineer (4+ years experience)
Published: 1 week ago
Niyo Solutions
Making Banking Simpler, Safer, and Smarter for all
Job Location
Job Type: Full Time
Visa Sponsorship: Not Available
Relocation: Allowed
Skills
Linux
Kubernetes
Big Data Hadoop
Automation CI/CD
Hiring contact
Vibhav Parameswara Chary
The Role
Role & Responsibilities:
- Work with Data Engineers/Data Architects on proofs of concept (POCs) for new tools and technologies in the Big Data stack.
- Implement Kafka as a service.
- Build enterprise monitoring solutions and implement configuration management.
- Manage and tune distributed systems and clusters.
- Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
- Install and configure software.
- Automate manual tasks.
- Manage disk space, software patches, and upgrades.
What is needed:
- Broad knowledge of infrastructure, including compute, network, and storage.
- Bachelor's degree in computer science or a related field.
- Minimum of 2-3 years' experience with Big Data technologies, with expertise in S3, YARN, Spark, Hive/Impala, Kafka, Presto, Oozie, and notebook tools such as Jupyter and Zeppelin.
- Minimum of 4 years of DevOps experience, or combined development and operations experience.
- Hands-on experience managing and tuning distributed systems and clusters.
- Expertise in scripting languages such as Python.
- Understanding of Java and microservices architecture.
- Experience analyzing and resolving performance, scalability, and reliability issues.
- Experience working in AWS/Azure.
More about Niyo Solutions
Founders
Vinay Bagri, Co-founder and CEO
Virender Bisht, Co-founder and CTO