About the Role:
We are seeking a skilled Data Engineer fluent in German to design and optimize data pipelines using Azure, Databricks, Spark, and Airflow. You'll work with large datasets, develop ETL processes, and ensure data integrity and performance.
My customer is focused on the logistics and transportation industry and is looking to add to their BI / Data team, currently 10 people.
Key Responsibilities:
* Build and maintain scalable data pipelines on Azure.
* Develop ETL processes using Databricks and Spark.
* Work hands-on with Azure Data Factory, Azure Data Lake Storage, and Azure Synapse Analytics.
* Automate workflows using Apache Airflow.
* Collaborate with cross-functional teams to understand data needs.
Requirements:
* Fluent in German.
* Experience with Azure cloud services, Databricks, and Apache frameworks (Spark or Airflow).
* Strong knowledge of data modelling and ETL processes.
* Problem-solving skills and ability to work independently.