Permanent employment - Flexibility - Team culture, Salary: €65,000 - €85,000
Area:
Berlin
Our client is based centrally in Berlin and works with manufacturing companies and factories. Their revolutionary software is changing the way factories work, offering Machine Learning, AI, IIoT and cloud-based solutions to increase efficiency.
Their solutions not only speed up processes by making them smart and digital and by eliminating errors, but also make life easier for workers, owners and customers. They combine the human aspect with AI technology.
Job description:
Your tasks:
* You work with the development and data teams to process data and transfer it internally.
* You design, build and maintain data pipelines for multidimensional datasets on various data platforms such as S3, PostgreSQL and Kafka.
* You develop and improve the data architecture, taking care of data security and data quality.
* You use Big Data technologies and run pilot projects to develop low-latency data architectures at large scale.
* You support the automation and monitoring of existing pipelines and the onboarding of new clients.
Your profile:
* A Bachelor's degree in a technical field such as Management Information Systems or Software Engineering
* Several years of experience with ETL, data modelling and data warehousing
* Several years of experience with processing multidimensional data sets and automating the end-to-end ETL pipeline
* Several years of experience with Python and JVM languages
* Experience with CI/CD and cloud technologies
* Experience with streaming-based systems (Kinesis/Kafka) and event-driven design
Your benefits: