We are looking for a Senior Data Engineer to play a key role in designing, optimizing, and improving data pipelines for the ingestion, enrichment, and exposure of classified and transactional data on AWS. You will build and operate data platforms and pipelines on AWS that power analytics and products.
Tasks
Responsibilities:
- Analyze and improve existing data pipelines to optimize performance, cost efficiency, and scalability.
- Transition batch/snapshot pipelines to delta-based data processing pipelines.
- Develop and maintain Terraform modules for efficient infrastructure management on AWS.
- Migrate data pipelines from Google Cloud Platform to AWS while minimizing downtime and ensuring high reliability.
- Design and implement dashboards and alerting systems to enable proactive monitoring of data pipeline performance.
- Mentor, support, and guide team members; share expertise and best practices; and contribute to the overall growth of the team.
Requirements
The Ideal Candidate
You bring a balance of technical strength, curiosity, and adaptability, including:
- Strong Python programming skills for data manipulation, scripting, and pipeline development.
- Experience building scalable and distributed data pipelines using PySpark.
- Strong command of SQL for querying and transforming large datasets.
- Experience with AWS services such as S3, ECS, etc., with particular emphasis on Lambda and Glue.
- Experience designing and managing infrastructure using Terraform. Knowledge of Terragrunt is a plus.
- Experience configuring and maintaining CI/CD pipelines to support automation and deployment workflows.
- Knowledge of Datadog for monitoring, alerting, and dashboard creation.
Benefits
Hybrid/flexible working: onsite 3 days a week
Office in Romania, located near the București Basarab train station
Contract type: CIM (Individual Employment Contract) preferred; open to B2B
Annual Bonus
You have the legal right to work in Romania and will NOT, now or in the future, require sponsorship to work there.