ETL Data Scientist (3+ years exp)
Commos

Job Type: Contract
Visa Sponsorship: Not Available
Hires remotely in:
Relocation: Allowed
Skills:
Hiring contact: Nicolas Benjamin

The Role
We are building Commos out of our love for knowledge and our optimism for the future of our planet. Our future is under threat. We must solve these problems and mend these divisions together. Commos aims to co-create a collaborative knowledge graph where shared wisdom might be harnessed in ways never before seen to fight for a liveable earth and an inclusive society. We trust our technology will help us to learn, grow, share and thrive in ways that make us more human.
To make our vision a reality, we need someone like you: a Data Pipeline/ETL Engineer who will design and implement an effective, far-sighted API ETL strategy for interfacing with client data sources.
RESPONSIBILITIES:
- Developing an end-to-end ETL strategy spanning ontology mapping, data pipeline architecture, data extraction, processing, integration, and management for all of Commos’ stakeholders
- Developing and improving the existing data acquisition strategy
- Participating in the integration of machine learning and NLP algorithms to tackle unstructured data
- Testing solutions on working data
- Writing clear, organised, high-performing code
- Defining data quality standards and ensuring that gathered data meets them
- Collaborating closely with key partners and the development team.
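As a rough illustration of the extraction and data-quality responsibilities above, here is a minimal sketch of a quality-gated load step. The `Record`, `extract`, and `load` names are hypothetical, chosen for this example; they are not part of Commos' stack.

```python
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Record:
    """A single extracted row, tagged with the client source it came from."""
    source: str
    payload: dict

def extract(raw_rows: Iterable[dict], source: str) -> list[Record]:
    """Extract step: wrap raw rows from a client data source as Records."""
    return [Record(source=source, payload=row) for row in raw_rows]

def meets_quality_standard(record: Record, required_fields: set[str]) -> bool:
    """A simple data-quality gate: every required field present and non-empty."""
    return all(record.payload.get(f) not in (None, "") for f in required_fields)

def load(records: list[Record], required_fields: set[str]) -> tuple[list[Record], list[Record]]:
    """Load step: split records into accepted and rejected per the quality gate."""
    accepted = [r for r in records if meets_quality_standard(r, required_fields)]
    rejected = [r for r in records if not meets_quality_standard(r, required_fields)]
    return accepted, rejected
```

In practice the rejected list would feed a quarantine or review queue rather than being silently dropped, so quality standards stay observable.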
SKILLS AND EXPERIENCE:
- Degree in Mathematics, Statistics, Engineering, Computer Science or other quantitative disciplines
- Deep understanding of scientific and business processes
- Strong knowledge of REST APIs
- Strong knowledge of Python
- Experience implementing big data analytics best practices
- Knowledge of machine learning and NLP is a strong asset
- Experience with NoSQL stores, ArangoDB, or other graph databases is an asset
- Experience with AWS and Amazon Redshift is an asset
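As context for the REST API and Python requirements above, a minimal sketch of paginated API extraction. The `fetch_page` callable is a hypothetical stand-in for a real API client (e.g. one wrapping an HTTP library); nothing here reflects an actual Commos interface.

```python
from typing import Callable

def fetch_pages(fetch_page: Callable[[int], list[dict]], max_pages: int = 100) -> list[dict]:
    """Pull successive pages from a paginated REST endpoint until one comes back empty.

    fetch_page(page_number) is expected to return a list of row dicts;
    an empty list signals the end of the result set.
    """
    items: list[dict] = []
    for page in range(1, max_pages + 1):
        batch = fetch_page(page)
        if not batch:
            break
        items.extend(batch)
    return items

# Usage with a stub standing in for a real client:
pages = {1: [{"id": 1}], 2: [{"id": 2}, {"id": 3}]}
rows = fetch_pages(lambda p: pages.get(p, []))
```

The `max_pages` cap is a defensive bound so a misbehaving endpoint cannot loop forever; a production pipeline would also add retries and rate limiting.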
COMPENSATION:
- Equity to be discussed