We are looking for a Middle+/Senior Data Engineer to develop a cloud DWH on AWS: building and maintaining reliable ETL/ELT pipelines (S3 + Redshift) with Airflow and dbt, improving data quality, and collaborating with analytics and product teams.
Requirements
• Excellent SQL (window functions, optimization for analytical workloads).
• AWS experience: at least S3 and Redshift (or any other cloud data warehouse).
• Hands-on experience building ETL/ELT pipelines and orchestrating them in Airflow.
• dbt experience: models, tests, documentation, Git workflow.
• Python for data pipelines.
• Understanding of data modeling approaches and DWH architecture (we use Data Vault 2.0).
• Git workflow skills (code review, pull/merge requests).
• English at B2 level or higher.
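For a sense of the SQL work involved, here is a minimal window-function sketch of the kind the role calls for: a running total per customer. The `orders` table and all column names are hypothetical, and Python's built-in sqlite3 stands in for Redshift so the example is self-contained.

```python
import sqlite3

# Hypothetical orders table; all names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INT, order_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        (1, "2024-01-05", 100.0),
        (1, "2024-02-10", 50.0),
        (2, "2024-01-20", 75.0),
        (2, "2024-03-01", 25.0),
    ],
)

# Running total per customer -- a typical analytical window function.
rows = conn.execute(
    """
    SELECT customer_id,
           order_date,
           SUM(amount) OVER (
               PARTITION BY customer_id
               ORDER BY order_date
           ) AS running_total
    FROM orders
    ORDER BY customer_id, order_date
    """
).fetchall()

for row in rows:
    print(row)  # e.g. (1, '2024-02-10', 150.0)
```

The same `PARTITION BY ... ORDER BY` pattern carries over to Redshift unchanged; on analytical workloads the interesting part is choosing partitions and frames so the engine can avoid full re-sorts.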
Nice-to-have:
• Experience setting up CI/CD for data projects (dbt, tests, pipeline deployment).
• Experience with Kafka/Kinesis or other streaming data sources.
• Experience with BI tools (Looker, Power BI, Tableau, etc.).
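To illustrate the "Python for data pipelines" requirement, here is a minimal extract-load-transform sketch. It is not the production stack (that is S3 + Redshift orchestrated by Airflow, with dbt doing the transforms): an in-memory CSV stands in for S3, sqlite3 stands in for Redshift, and every table and column name is hypothetical.

```python
import csv
import io
import sqlite3

# Raw source data; in production this would be a file landed in S3.
RAW_CSV = """event_id,user_id,amount
1,alice,10.5
2,bob,3.0
3,alice,7.5
"""

def extract(source: str) -> list[dict]:
    """Extract: parse raw CSV rows, casting types at the boundary."""
    return [
        {"event_id": int(r["event_id"]), "user_id": r["user_id"], "amount": float(r["amount"])}
        for r in csv.DictReader(io.StringIO(source))
    ]

def load(conn: sqlite3.Connection, rows: list[dict]) -> None:
    """Load: land raw rows in a staging table (stand-in for Redshift COPY)."""
    conn.execute("CREATE TABLE stg_events (event_id INT, user_id TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO stg_events VALUES (:event_id, :user_id, :amount)", rows
    )

def transform(conn: sqlite3.Connection) -> list[tuple]:
    """Transform: aggregate in SQL, as a dbt model would."""
    return conn.execute(
        "SELECT user_id, SUM(amount) FROM stg_events GROUP BY user_id ORDER BY user_id"
    ).fetchall()

conn = sqlite3.connect(":memory:")
load(conn, extract(RAW_CSV))
totals = transform(conn)
print(totals)  # [('alice', 18.0), ('bob', 3.0)]
```

In the real pipeline each step would be an Airflow task and the transform would live in a versioned, tested dbt model rather than an inline query.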
Benefits
• Remote work from anywhere in the world.
• Performance bonus, paid vacations and days off, and a personal development plan.
• Transparent processes: short dailies, regular retros, and 1-on-1s. We prioritize work over endless calls.
• Minimal documentation: only critical docs; analysts handle the rest.
• Fast deploys and minimal approvals, so you see your work's impact quickly.
• All development is done locally; no RDP.
• Influence on DWH architecture: propose and discuss your ideas, and we listen and act.
• A diverse, inclusive team where all voices are valued and professional growth is a priority.