Your area of work:
Our development team at the Eschborn site is looking for reinforcement to meet the high demands placed on our cloud-based market data dissemination platforms and our analytics systems.
Deutsche Börse’s A7 analytics platform offers direct, cloud-based access to order-by-order historical and intraday market data from Eurex and Xetra. Clients work with flexible analytical tools that allow them to derive advanced market insights and adjust their trading strategies accordingly.
In a highly diverse environment (from on-premises to cloud), you will support and advance all stages of market data distribution, from conception and implementation through product testing and deployment to day-to-day operations. You will be joining a team that is highly motivated to provide the best service to the market and stays on top of advancing technology to do so.
Your responsibilities:
* Designing and integrating data interfaces for processing data streams, and developing data models for new products
* Connecting various market data sources and calculating key figures (real-time analytics) based on real-time, near-time, and batch processing
* Designing and developing scalable backend solutions for big-data applications
* Continuously enhancing the data platform and its interfaces (APIs)
* Automating test cases for existing and new functionality
* Integrating services into the Google Cloud Platform (GCP) landscape
Your profile:
* University degree in information technology, mathematics, or physics
* In-depth knowledge of computer systems, software architectures, data structures, automation, and programming
* At least 5 years of experience in software development, preferably in the financial sector
* Good knowledge of the programming languages C++ and Python (incl. libraries such as pandas)
* Fundamental knowledge of cloud technology and experience with GCP (incl. services such as BigQuery, Dataflow, or Pub/Sub)
* Knowledge of container-based solutions such as Docker and Kubernetes
* Experience with data processing tools such as Spark, Kafka, Airflow, Cassandra, Elasticsearch, Beam, Flink, Presto, and SQL/NoSQL databases
* Experience with batch and stream processing and with data formats such as Avro, Parquet, ORC, or Protocol Buffers (GPB)
* Knowledge of DevOps methodology and tools (Jira, GitHub, Artifactory, Pipelines)
* Knowledge of data visualization and analysis tools such as Tableau, Jupyter, or Zeppelin
* Good English language skills; German is an advantage