Requirements
- At least 4 years of hands-on data engineering experience, including 3 years with AWS and Snowflake.
- Strong proficiency in SQL and Python.
- Experience implementing data pipelines and ETL workflows using AWS services such as S3, Glue, and Lambda.
- Solid understanding of data warehousing concepts and technologies.
- Experience with dbt for data transformation and modelling.
- Knowledge of data quality checks and validation processes.
- Experience using Jenkins, Docker, and Airflow for deployment, automation, and workflow management.
- Experience with other cloud providers (Azure, GCP) is nice to have.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills to work with cross-functional teams.
- English level: minimum B2.
- Bachelor's degree in Computer Science, Information Systems, or a related field.