Data Engineer (Mid-Level/Senior)
5+ years of experience
$30k – $80k
Published: 1 month ago

Clairvoyant

Global Technology Consulting and Services Company

Job Location:
Job Type: Full Time
Visa Sponsorship: Not Available
Hires remotely in:
Relocation: Allowed

Skills

Java
SQL
Bash
Scala
RDBMS
Python
Docker
English language
AWS
GCP/Azure

The Role

Role: Data Engineer
Experience Levels:
Mid-level Engineer: 5+ years of experience
Senior Engineer: 8+ years of experience

Benefits:
Vacations (PTO): 15 days/year (6 mandatory + 9 complementary)
Vacation Premium: 25% on top of 100% of salary for PTO days
Home Office Allowance: $50 USD/month (internet & power) plus office equipment (chair, monitor, screen, mouse)
T&E Business Expenses: T&E policy applicable for business expenses (e.g., mobile)
Health Insurance: Covers USD 1M (10% copay, $750 USD deductible) for employee, partner, and child
Life + AD&D Insurance: Covers USD 50K, employee only
Employer Match for Retirement Plan: Employee contributes 6%/month; EXL matches 3% in March of the following year
Meal and Grocery: $50 USD/month for meals and groceries
Guaranteed Bonus: Inclusive of profit sharing; 3 months of base salary, distributed through the year and paid every 2 months
Christmas Bonus: 1/2 month of salary, paid in December
Performance Bonus: 5/15% as per band, paid in March; the Christmas bonus is deducted from the target
IMSS: Government social security benefits (e.g., short-term disability)
Infonavit: Affordable housing fund
Afore: Government public pension funds

Qualifications:
Bachelor’s degree (or working towards one) in Computer Science or a related field
5+ years of working experience as a Software Engineer
Willingness to work with distributed teams in different time zones and with offshore teams
Ability to adapt quickly to new projects and technologies
Experience programming in languages such as Java, Python, and Scala, and with scripting tools such as Bash and KSH
Strong SQL skills and a good understanding of RDBMS
Experience with Hadoop ecosystem
HDFS
Hive
Yarn
File formats like Avro/Parquet
Kafka
Spark (PySpark, Scala and Java)
Spark Streaming (PySpark, Scala and Java)
Experience working on Modern Data Platforms on AWS
Glue
EMR
Redshift
Lambda
Kinesis / MSK
Experience with containerization technologies, e.g., Docker and Kubernetes
Good aptitude, strong problem-solving abilities and analytical skills, and the ability to take ownership as appropriate
Good understanding of coding, debugging, performance tuning, and deploying applications to a production environment
Understanding of Agile Development methodologies
Experience with version control tools such as Git
Excellent verbal communication skills
Ability to work with clients to understand requirements and business needs
Ability to independently find solutions and fixes to problems
Drive and passion to learn new things

Nice to Have:
Experience working on other cloud platforms such as GCP or Azure
Experience working with Snowflake