
Андрій

Database Programmer

Considering positions:
Database Programmer, Data Engineer, Big Data Engineer
Age:
18 years
City of residence:
Kyiv
Ready to work:
Remote

Contact information

The job seeker has entered a phone number.



Андрій Шарагін
SUMMARY
Motivated Junior Data Engineer with a solid grasp of Python, SQL, and the Modern Data Stack. Proven ability to build scalable end-to-end data pipelines, recently completing a Real-time Weather Analytics Platform and a Crypto Analytics solution.
Expertise in implementing Medallion Architecture using Airflow, Spark Streaming, and Kafka to handle real-time data ingestion and processing. Proficient in managing hybrid storage solutions, including S3-compatible Data Lakes (MinIO) with Parquet partitioning and PostgreSQL Data Marts.
Experienced in containerizing complex multi-service environments with Docker, ensuring fully automated data flows from external APIs to Grafana dashboards. Passionate about data integrity, idempotency, and building robust ETL/ELT solutions that deliver actionable business insights.

EXPERIENCE

Dec 2025 — Jan 2026
Junior Data Engineer (Project Based)
Self-employed / Portfolio Project, Kyiv

Developed a full-fledged data platform that mimics a real e-commerce infrastructure. The project demonstrates building an automated ELT pipeline with a focus on data quality and containerization.

Key Achievements:

Layered Architecture: Implemented a multi-layered data model in PostgreSQL for structured transformation of raw logs into business metrics.
Orchestration: Configured Apache Airflow to manage dependencies between CSV file uploads and transformations.
Data Quality: Implemented automated testing via dbt (schema & singular tests) to ensure data integrity (unique, not_null, business logic).
Infrastructure as Code: Fully deployed the stack (PostgreSQL, Airflow, dbt, Metabase) using Docker Compose for rapid environment replication.
Business Intelligence: Created dashboards in Metabase to analyze user behavior and top customers.
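The dbt checks listed above (unique, not_null) amount to simple row-level validations. A minimal Python sketch of that logic follows; the orders table and its column names are illustrative assumptions, not taken from the project:

```python
from collections import Counter

def not_null(rows, column):
    """Return rows that violate a NOT NULL constraint on `column`."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Return values that appear more than once in `column`."""
    counts = Counter(r[column] for r in rows if r.get(column) is not None)
    return [value for value, n in counts.items() if n > 1]

# Hypothetical raw-orders sample with both kinds of violations
orders = [
    {"order_id": 1, "user_id": 10},
    {"order_id": 2, "user_id": None},  # fails not_null on user_id
    {"order_id": 2, "user_id": 11},    # duplicate order_id
]

assert not_null(orders, "user_id") == [{"order_id": 2, "user_id": None}]
assert unique(orders, "order_id") == [2]
```

In the project itself these checks would be declared in dbt schema YAML and executed against PostgreSQL; the sketch only mirrors their semantics.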

Feb 2026 — Feb 2026
Junior Data Engineer (Personal Project / Self-employed)
Crypto Data Pipeline Project, Kyiv

Developed and implemented an automated system for real-time crypto market data collection and analysis. The project demonstrates the full data lifecycle, from API collection to business-facing visualization.

Key technical solutions:

ETL & Orchestration: Automated data collection from the Binance API using Python and Apache Airflow.
Data Warehouse: Configured storage on PostgreSQL and MinIO; implemented transformations via dbt.
Infrastructure: Fully containerized the project via Docker Compose for rapid deployment.
Analytics & Alerting: Created dashboards in Streamlit and integrated a Telegram bot for volatility alerts.
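The volatility-alert step can be illustrated with a small self-contained sketch. The 5% threshold, symbol names, and price snapshot below are assumptions for illustration, not values from the project:

```python
def pct_change(prev, curr):
    """Percentage change from a previous price to the current one."""
    return (curr - prev) / prev * 100

def volatility_alerts(prices, threshold=5.0):
    """Return (symbol, change%) pairs whose absolute move exceeds the threshold.

    `prices` maps a symbol to a (previous, current) price pair.
    """
    alerts = []
    for symbol, (prev, curr) in prices.items():
        change = pct_change(prev, curr)
        if abs(change) >= threshold:
            alerts.append((symbol, round(change, 2)))
    return alerts

# Hypothetical snapshot: BTC moved ~5.83%, ETH only ~1%
snapshot = {"BTCUSDT": (60000.0, 63500.0), "ETHUSDT": (3000.0, 3030.0)}
assert volatility_alerts(snapshot, threshold=5.0) == [("BTCUSDT", 5.83)]
```

In the described pipeline, each alert tuple would then be formatted and pushed through the Telegram bot rather than returned.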

Feb 2026 — Mar 2026
Junior Data Engineer (Personal Project / Self-employed)
Real-time Weather Analytics Stack, Kyiv

Developed a high-load, real-time data streaming platform to process global weather data. The project focuses on handling high-velocity data using the modern Big Data stack and a hybrid storage architecture.

Key Technical Solutions:

Streaming & Message Broker: Orchestrated real-time data ingestion from OpenWeather API using Apache Kafka, ensuring decoupled and reliable message delivery.
Big Data Processing: Implemented Spark Streaming (PySpark) for real-time data transformation, aggregation, and analytical reporting.
Hybrid Storage Architecture: Designed a dual-storage system using MinIO (Data Lake) for historical Parquet-formatted data and PostgreSQL (Data Mart) for high-performance serving.
Medallion Architecture: Applied Bronze, Silver, and Gold layers to maintain data lineage and high quality of weather metrics.
Visualization: Configured Grafana dashboards connected to the PostgreSQL Data Mart to visualize real-time temperature trends and city analytics.
Infrastructure: Fully containerized a complex 8-service environment (Spark, Kafka, Airflow, MinIO, etc.) using Docker Compose.
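The Bronze/Silver/Gold layering described above can be sketched in plain Python. The record shape and the Kelvin-to-Celsius cleaning step are assumptions modeled on an OpenWeather-like payload; the real pipeline performs these transformations with Spark Streaming:

```python
from collections import defaultdict

# Bronze: raw payloads as received from the API (hypothetical shape)
bronze = [
    {"city": "Kyiv", "main": {"temp": 281.15}, "dt": 1700000000},
    {"city": "Kyiv", "main": {"temp": None}, "dt": 1700003600},  # bad record
    {"city": "Lviv", "main": {"temp": 279.15}, "dt": 1700000000},
]

# Silver: cleaned and flattened — drop null readings, convert Kelvin to Celsius
silver = [
    {"city": r["city"], "temp_c": round(r["main"]["temp"] - 273.15, 2), "dt": r["dt"]}
    for r in bronze
    if r["main"]["temp"] is not None
]

# Gold: per-city averages, the shape served from the PostgreSQL Data Mart
by_city = defaultdict(list)
for r in silver:
    by_city[r["city"]].append(r["temp_c"])
gold = {city: sum(temps) / len(temps) for city, temps in by_city.items()}

assert gold == {"Kyiv": 8.0, "Lviv": 6.0}
```

Keeping each layer materialized separately is what preserves lineage: a bad Silver rule can be fixed and replayed from Bronze without re-ingesting from the API.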

EDUCATION
2023 — 2027
Associate Degree in Data Structures, Algorithms, and Databases
Professional College of Information Systems and Technologies of KNEU, Kyiv
