Job Description
1. Azure and Fabric Administration
o Configure and manage Azure subscriptions, resources, and security (Azure Admin experience).
o Provision and maintain Microsoft Fabric capacities, workspaces, and artifacts (data flows, pipelines, notebooks, lakehouses).
o Implement best practices for cost optimization, security, and performance in an Azure/Microsoft Fabric environment.
2. Data Pipelines and ETL
o Develop, maintain, and optimize ETL/ELT pipelines using Azure Data Factory (or equivalent Fabric pipelines).
o Integrate data from multiple source systems (including multiple ERPs) and ensure reliability and integrity of data consolidation efforts.
o Leverage Azure Storage (Data Lake) for staging and data transformation workflows.
3. Data Modeling and Development
o Design efficient data models and structures (relational, dimensional, and/or lakehouse concepts).
o Write clean, efficient SQL for data processing, analytics, and data wrangling tasks.
o Utilize Python to develop data ingestion and transformation scripts, including working with notebooks in Microsoft Fabric or Azure environments.
4. Collaboration and Stakeholder Management
o Work closely with business stakeholders, BI teams, and other technical leads to align on data needs and priorities.
o Provide technical expertise and knowledge transfer to team members on Azure best practices, data engineering principles, and Fabric capabilities.
o Troubleshoot and resolve data quality or pipeline performance issues quickly and effectively.
Qualifications
* Education & Experience
o Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
o 3-5+ years of experience in data engineering, including ETL/ELT processes and large-scale data systems.
* Technical Skills
o Azure Administration: Proficiency in provisioning and managing Azure resources (VMs, storage accounts, Azure Active Directory, etc.).
o Microsoft Fabric: Hands-on experience with data flows, pipelines, notebooks, lakehouses, or willingness to learn quickly.
o Azure Data Factory / Synapse: Designing, building, and orchestrating ETL/ELT pipelines.
o SQL: Strong ability to write complex queries, perform data analysis, and optimize queries for performance.
o Python: Experience in scripting, data manipulation, and building data workflows.
o Version Control: Familiarity with Git or similar tools for version control and collaboration.
* Soft Skills
o Strong communication skills for cross-functional collaboration.
o Ability to adapt to changing requirements and workloads.
o Analytical mindset with attention to detail and problem-solving skills.
Additional Information
In addition to a secure job in an internationally successful, fast-growing, and family-friendly group of companies, you can expect:
* Attractive remuneration package including Christmas and vacation pay
* 30 days vacation
* Hybrid working model with flexible working hours and a flexitime account within a 40-hour week
* Company pension scheme
* Canteen with subsidized lunch
* Team member appreciation
* Individual training and development opportunities
* Casual dress code
* Regular employee events
* Employee referral bonus ("employees recruit employees")
* Corporate benefits
* Job ticket (subsidized public transport pass)
* Free employee parking spaces
* All the traditional benefits, such as health insurance and paid time away to re-energize: don't worry, we've got you covered
REPA welcomes diversity and is an equal opportunities employer. All qualified applicants are considered regardless of race, religion, skin color, national origin, gender, age, sexual orientation, gender identity or disability.
Do you have questions? We will be happy to answer them! Please send an e-mail to Nadine: bewerbung.de@repagroup.com.
Have we piqued your interest? Then apply today! We look forward to receiving your online application.