We are seeking a Data Engineer for our partner, an international product technology company, to join their analytics team and enhance their data infrastructure.
This role involves building and maintaining reliable data pipelines, supporting analytics workflows, and ensuring seamless data operations. You’ll work with AWS-based infrastructure, ClickHouse, and modern ETL tools to help the team deliver accurate insights and scalable solutions.
Responsibilities
* Write ad-hoc SQL reports and support analytical requests.
* Add and manage users in the analytics system and database.
* Resolve tasks tracked in Jira and document solutions in Confluence.
* Monitor and review ETL job logs (CloudWatch, Grafana) to detect and prevent issues.
* Develop AWS Lambda pipelines and integrate them with REST APIs.
* Build ETL pipelines using Dagster and AWS Glue.
Requirements
* Strong understanding of DWH architecture.
* Solid experience in writing ETL pipelines.
* 2.5+ years of commercial experience in Data Engineering.
* Excellent Python and SQL skills.
* Familiarity with Linux environments.
* Proactive and result-oriented mindset, with strong attention to detail.
* Excellent communication and collaboration skills.
* English: Intermediate level or higher.
Nice to have
* Experience with ClickHouse DB.
* Experience with Dagster.
* Familiarity with gambling or betting systems.
What you’ll get
* A hands-on role with real responsibility and impact on product growth.
* A remote-first setup (preferably within the Kyiv/EU time zone).
* A dynamic and supportive team, open to knowledge-sharing and growth.
* Career development opportunities: learn, experiment, and scale your expertise.