Maksym

Data engineer

Age: 26 years
City of residence: Kyiv
Ready to work: Remote


Maksym Muchak
Senior / Lead Data Engineer
Europe (Remote) | LinkedIn | GitHub
SUMMARY
• Data engineering lead who built Argus Media's analytics platform from the ground up — a Redshift + S3 lakehouse processing ~50 GB/day across
200+ tables, now used daily by 8 analysts and data scientists to support commodity pricing and market intelligence. I lead a cross-functional
team of 8 (3 DE, 2 QA, 2 DevOps, 1 BA) and own everything from CDC ingestion to governance and cost control.
• 5+ years shipping production data systems on AWS and Snowflake across energy, telecom, banking, and entertainment. I care most about the
boring-but-critical stuff: pipeline reliability, honest cost accounting, schema contracts that actually hold, and making analysts self-sufficient so
they stop filing tickets.
EXPERIENCE
Technical Lead (Data Engineering), Argus Media — Remote (UK / EU) Dec 2024 – Present
• Own the end-to-end data platform — CDC ingestion, dbt transformation layers, Redshift + S3 lakehouse, and Lake Formation governance.
Responsible for platform reliability, data freshness SLAs, and cost optimization across all production pipelines.
• Designed blue-green CDC routing with automated cutover validation to enable zero-downtime database migrations — a hard requirement after
an earlier full-resync incident cost 3 days of analyst downtime.
• Introduced schema validation contracts at ingestion time and DLQ-based error handling; reduced mean time to detect breaking upstream
changes from ~hours (analyst complaints) to ~minutes (automated alerts).
• Made the call to keep Redshift Spectrum over a full Iceberg migration after benchmarking showed our query patterns didn't justify the complexity
— saved the team ~2 months of migration work and avoided a premature architecture bet.
Senior Big Data Engineer, Argus Media — Remote (UK / EU) Feb 2024 – Dec 2024
• Built the initial dbt layer (staging → intermediate → marts) from scratch across 5 S3 data lake buckets; standardized the naming, testing, and
documentation conventions that the team still uses.
• Saved ~$100K/year by implementing S3 lifecycle policies, Intelligent Tiering, and automated cleanup of orphaned data — work that required
auditing years of accumulated storage nobody had touched.
• Shipped monitoring and alerting (CloudWatch + SNS) for critical pipelines; SLA breaches dropped, but honestly the real win was that on-call
engineers stopped guessing which pipeline was broken.
Data Engineer, IdeaSoft.io — Kyiv, Ukraine (Remote) Nov 2023 – Feb 2024
• Joined a high-traffic entertainment platform mid-flight with zero handover (the previous team was gone). Got up to speed on the existing
MSSQL-to-Snowflake migration within days, stabilized it, and co-owned ongoing operations with one other engineer while on call for hourly
batch loads.
• Chose batch ingestion (Glue → S3 → Snowflake) over Kinesis/EMR after assessing latency requirements — real-time wasn't justified, saving
significant infrastructure complexity and cost.
• Guided 10 downstream data engineers in building a medallion architecture (bronze/silver/gold) in dbt on top of the migrated data, establishing
modeling standards for the wider team.
Data Engineer, AM-BITS LLC — Kyiv, Ukraine (Remote) Apr 2021 – Oct 2023
• Delivered data pipelines across telecom, banking, and government clients as part of a 5-person team (3 DE, 1 BA, 1 QA) in an outsourcing
consultancy.
• Telecom: Built real-time ingestion into Hive for a major telco. When the war started (Feb 2022), connector stability collapsed and data volumes
spiked unpredictably — ran on-call to scale nodes and threads, rewrote late-arrival handling on the fly. This data fed the systems that helped
engineers restore mobile coverage from damaged stations.
• Banking: Built Spark + Kafka pipelines that powered cashback assignment logic — calculating spend thresholds and dynamically adjusting reward
percentages in near real-time, directly supporting customer retention and card activation revenue.
• Migrated 17+ TB from Oracle to PostgreSQL, HBase, and Hive using Apache NiFi; reduced storage costs ~20% through optimized partitioning and
compression across Cloudera/HDP.
Data Engineer, MDM-Group — Kyiv, Ukraine Nov 2019 – Jan 2021
• Built Airflow-based ETL pipelines and partitioned Hive datasets on HDFS; reduced end-to-end data latency ~20% through query tuning and
pipeline parallelization.
SKILLS
Cloud: AWS (Redshift, S3, Glue, Athena, Lambda, Step Functions, Lake Formation, CloudWatch, CloudFormation), Snowflake
Data: dbt, Apache Spark, Apache Kafka, Apache NiFi, Apache Airflow / MWAA, Fivetran
Infrastructure: Terraform, Docker, CI/CD (GitHub Actions)
Languages: Python, SQL
EDUCATION
National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute", Master in Insurance and Financial Mathematics — Kyiv, Ukraine Sep 2020 – May 2022
National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute", Bachelor in Mathematics — Kyiv, Ukraine Sep 2016 – Jun 2020
CERTIFICATIONS | LANGUAGES | VOLUNTEERING
Certifications: Cloudera Technical Professional (2022–2025), The London School of English — Advanced Level
Languages: English — Full Professional | Ukrainian — Native | Russian — Full Professional
Volunteering: IT Army of Ukraine — cybersecurity initiatives; Center for Pediatric Cardiology — blood donation programs
