As a Senior Kafka Data Engineer, you create chemistry by...

- Designing and implementing data ingestion strategies using SAP SLT to stream data into our Azure Enterprise Data Lake.
- Maintaining and monitoring data ingestion across various platforms and systems.
- Designing and constructing scalable ETL/ELT pipelines to handle substantial volumes of data from multiple sources.
- Implementing quality control measures to ensure data accuracy and consistency.
- Collaborating with the Data Integration team and Business Data Owners to align infrastructure components and troubleshoot data-related issues for optimal performance.
- Developing and maintaining technical documentation for data ingestion processes, testing new features, and recommending beneficial implementations.

If you have...

- A bachelor's degree in Computer Science, Information Technology, Engineering, Business, or a related field.
- A minimum of 3-4 years of experience with event streaming, message brokers, and other event-driven architectures (Kafka, etc.), plus familiarity with Big Data concepts.
- Experience working in an Agile way with a DevOps mindset.
- Hands-on experience with Apache Kafka and/or Spark, and experience using Databricks for big data analytics and processing.
- Proficiency in managing various database technologies, including Databricks, MS SQL, Oracle, MongoDB, distributed databases, time-series databases, and data warehouses/data lakes.
- Ideally, vendor certifications to back this up (e.g. Microsoft, Linux).
- Excellent team-player qualities with strong interpersonal, written, and verbal communication skills.