Responsibilities
We’re looking for someone to join our team of data engineers, developing new data pipelines and scaling the existing ones for a growing number of customers and use cases.
* We’re using technologies like Python, Terraform, Kubernetes and Pub/Sub to run event-based data pipelines, processing millions of new data points every day on the Google Cloud Platform (a minimal consumer sketch follows this list).
* We’re rolling out new features multiple times a day through fully automated Infrastructure as Code and CI/CD pipelines, including code reviews and automated tests.
* We take responsibility for the full DevOps cycle, but thanks to managed cloud services and automation, we spend most of our time actually working on new features and architecture optimisations, rather than responding to ops issues.
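To give a flavour of the event-based processing mentioned above, here is a minimal sketch of a Pub/Sub consumer in Python. The project ID, subscription name and payload fields are purely illustrative and not part of this posting; it simply shows the standard streaming-pull pattern of the google-cloud-pubsub client.

```python
# Minimal sketch of an event-based pipeline consumer on GCP Pub/Sub.
# Project, subscription and payload fields below are illustrative only.
import json
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

PROJECT_ID = "example-project"           # hypothetical project
SUBSCRIPTION_ID = "events-subscription"  # hypothetical subscription

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)


def handle_event(message: pubsub_v1.subscriber.message.Message) -> None:
    """Decode one event, process it, and acknowledge it."""
    event = json.loads(message.data.decode("utf-8"))
    # ... transform/enrich the event and write it to the sink of your choice ...
    print(f"processed event: {event.get('id', '<no id>')}")
    message.ack()


# Open a streaming pull; the client dispatches incoming messages
# to the callback on a background thread pool.
streaming_pull_future = subscriber.subscribe(subscription_path, callback=handle_event)

with subscriber:
    try:
        streaming_pull_future.result(timeout=60)  # run for a minute, then stop
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()
```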
Requirements
This position allows for remote work, with occasional presence at our main location in Hamburg.
* completed degree in computer science or a related subject
* at least two years of experience in data engineering (setting up and operating data pipelines in big data/analytics environments)
* at least one year of experience with one or more of the major cloud providers (GCP/Azure/AWS)
* experience in software development with Python and knowledge of tools such as Jupyter, VS Code, PyCharm, git, virtual environments and the command line
* knowledge of automated testing, CI/CD, build pipelines, monitoring
* confident application of DevOps/SRE principles
* experience with technologies such as Kubernetes/OpenShift, Docker, SQL/NoSQL databases, messaging (Pub/Sub, Kafka, Azure Event Hubs), Airflow, Apache Beam
* knowledge of event-driven architectures is beneficial