Responsibilities
1. As Senior Consultant for Apache Kafka & Distributed Data Systems, you will be responsible for the design, architecture, administration, and deployment of customized, advanced event streaming platforms based on Apache Kafka, following current industry standards and using the latest tools and methods.
2. You will work in close contact with your customers and be responsible for the preparation, planning, migration, control, monitoring, and implementation of highly scalable Apache Kafka event streaming platforms or distributed data systems projects, as well as for comprehensive customer consulting on the current state of these technologies.
3. As a Senior Consultant for Big Data Management and Stream Processing, your goal is to lead the design and implementation of architectures for streaming platforms and stream processing use cases using open-source and cloud tools (a minimal stream processing sketch follows this list for illustration).
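To give a concrete flavor of the stream processing use cases mentioned in item 3, here is a minimal Kafka Streams word-count topology in Java. It is a sketch under stated assumptions, not project code: the application id, broker address (localhost:9092), and topic names (text-input, word-counts) are illustrative placeholders.

import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class WordCountSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Assumed values: application id and a local broker, for demonstration only.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read lines from an assumed input topic, split them into words, count per word.
        KStream<String, String> lines = builder.stream("text-input");
        KTable<String, Long> counts = lines
            .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+")))
            .groupBy((key, word) -> word)
            .count();
        // Publish the running counts to an assumed output topic.
        counts.toStream().to("word-counts", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        streams.start();
    }
}

Kafka Streams is used here because it is the first framework named in the qualifications below and runs as a plain Java application, with no separate processing cluster to operate.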
Qualifications
4. Completed degree or comparable training with a technical background
5. Sound experience with and knowledge of Java
6. Solid experience with Apache Kafka or similar large-scale, enterprise-grade distributed data systems
7. Experience in software development and automation for operating big data systems
8. Experience with developing and implementing complex solutions for Big Data and Data Analytics applications
9. Experience in system deployment and container technology: building, managing, deploying, and release-managing containers and container images with Docker, OpenShift, and/or Kubernetes
10. Experience in developing resilient, scalable distributed systems and microservices architectures (a producer sketch illustrating typical resilience settings follows this list)
11. Experience with various distributed technologies (e.g. Kafka, Spark, CockroachDB, HDFS, Hive)
12. Experience with stream processing frameworks (e.g. Kafka Streams, Spark Streaming, Flink, Storm)
13. Experience with Continuous Integration / Continuous Delivery (CI/CD) using Jenkins, Maven, Automake, Make, Grunt, Rake, Ant, Git, Subversion, Artifactory, and Nexus
14. Understanding of SDLC processes (Agile, DevOps) and of cloud operations and support (ITIL) service delivery
15. Knowledge of authentication mechanisms such as OAuth; knowledge of Vert.x and Spring Boot
16. Experience in Azure SQL and AWS development
17. Experience with DevOps transformation and cloud migration to AWS, Azure, Google Cloud Platform, and/or hybrid/private cloud, as well as with cloud-native end-to-end solutions, especially their key building blocks, workload types, migration patterns, and tools
18. Experience with monitoring tools and logging systems such as New Relic, ELK, Splunk, Prometheus, and Graylog
19. Ability to communicate technical ideas in business-friendly language
20. Interest in modern organizational structures and an agile working environment (Scrum)
21. A customer-oriented mindset and enjoyment of working in an international environment, in German and English
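As an illustration of the resilience concerns in item 10, below is a minimal sketch of a Kafka producer in Java configured for safe delivery: duplicate-free retries via idempotence, acknowledgement from all in-sync replicas, and a bounded delivery timeout. The broker address, topic name (orders), key, and payload are illustrative assumptions, not part of any actual project.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ResilientProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Assumed local broker, for demonstration only.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Resilience settings: idempotence deduplicates broker-side retries,
        // acks=all waits for all in-sync replicas, and the delivery timeout
        // bounds how long the client keeps retrying before failing the send.
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, "120000");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(
                new ProducerRecord<>("orders", "order-42", "{\"status\":\"created\"}"),
                (metadata, exception) -> {
                    if (exception != null) {
                        // In production this would feed alerting and metrics, not stderr.
                        exception.printStackTrace();
                    } else {
                        System.out.printf("wrote to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                    }
                });
        } // try-with-resources closes the producer, flushing any pending sends
    }
}

The trade-off shown here, favoring durability (acks=all, idempotence) over raw throughput, is typical of the design decisions this role involves advising customers on.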