
About the role:
We are looking for a results-driven Senior Data Engineer to join our core data engineering team. The successful candidate should have in-depth knowledge of data engineering and the ability to share that experience with the team. The company has recently launched a Data Transformation Program involving broad-scale modernization of Data Architecture, Data Governance and Data Exposition. The ultimate goal of this role is to apply engineering best practices in building end-to-end data pipelines and to support the transformation of the core data platform toward a Data Lakehouse architecture.
Responsibilities:
- Write high-quality code aligned with software engineering best practices
- Share experience with the team; help establish guidelines and policies
- Create data collection / ingestion / validation pipelines for various sources
- Sync with DataOps team to leverage the required CI/CD pipelines and integrations
- Work together with Data Architect to validate the target data architecture design
- Lead POCs (proofs of concept)
- Help the team to refactor the existing orchestration design in Airflow
Required skills & experience:
- Hands-on experience implementing a Data Lakehouse architecture
- Advanced hands-on Python skills are a must
- Airflow is a must
- Spark - nice to have
- Experience with Apache Iceberg / Delta Lake - nice to have
- Hands-on experience with AWS services
- Advanced SQL
- Deep understanding of data modelling; knowledge of the medallion architecture is a plus
- Terraform - nice to have
- Good understanding of Data Mesh principles
- Snowflake experience is nice to have
- Good understanding of Data Governance & Data Quality principles
