Data Engineer

Deel · Enterprise · Germany · S&M

Deel is seeking a Data Engineer to design, build, and maintain efficient data pipelines (ETL processes) that integrate data from various source systems into the data warehouse. The role involves developing and optimizing data warehouse schemas, writing complex SQL queries, implementing data quality measures, and collaborating with data analysts and scientists. Proficiency in Python and familiarity with workflow orchestration frameworks such as Apache Airflow are required.

What you'd actually do

  1. Design, build, and maintain efficient data pipelines (ETL processes) to integrate data from various source systems into the data warehouse.
  2. Develop and optimize data warehouse schemas and tables to support analytics and reporting needs.
  3. Write and refine complex SQL queries and use scripting (e.g., Python) to transform and aggregate large datasets.
  4. Implement data quality measures (such as validation checks and cleansing routines) to ensure data integrity and reliability.
  5. Collaborate with data analysts, data scientists, and other engineers to understand data requirements and deliver appropriate solutions.
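To make step 4 concrete, the kind of validation and cleansing routine a data engineer in this role might write could look like the sketch below. This is a minimal illustration in plain Python; the field names (`id`, `amount`, `currency`) and quality rules are hypothetical examples, not part of Deel's actual stack.

```python
def validate_rows(rows, required_fields):
    """Split incoming rows into valid and rejected sets using simple
    data quality rules: completeness (required fields present) and a
    range check (no negative amounts). Rules here are illustrative."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejected.append({"row": row, "reason": f"missing fields: {missing}"})
        elif row.get("amount", 0) < 0:
            rejected.append({"row": row, "reason": "negative amount"})
        else:
            valid.append(row)
    return valid, rejected

# Hypothetical payment records flowing through a pipeline
payments = [
    {"id": 1, "amount": 100.0, "currency": "EUR"},
    {"id": 2, "amount": -5.0, "currency": "EUR"},   # fails range check
    {"id": 3, "amount": 42.0, "currency": None},    # fails completeness check
]

valid, rejected = validate_rows(payments, required_fields=["id", "amount", "currency"])
```

In production this logic would typically run as a task inside an orchestrated pipeline (e.g. an Airflow DAG), with rejected rows routed to a quarantine table for review rather than silently dropped.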

Skills

Required

  • 3 years of experience in a data engineering or similar backend data development role
  • Strong SQL skills
  • Experience with data modeling
  • Experience building data warehouse solutions
  • Proficiency in Python for data processing and pipeline automation
  • Familiarity with ETL tools
  • Familiarity with workflow orchestration frameworks (e.g., Apache Airflow or similar)
  • Experience implementing data quality checks
  • Experience working with large-scale datasets
  • Good problem-solving abilities
  • Strong communication and teamwork skills