Senior Consultant Apache Kafka & Distributed Data Systems (m/f/d)

  • Vienna
  • Beon Consult
Responsibilities

As Senior Consultant Apache Kafka & Distributed Data Systems, you will be responsible for the design, architecture, administration, and deployment of customized, advanced event streaming platforms based on Apache Kafka, current industry standards, and the latest tools and methods.

You will be in close contact with your customers and responsible for the preparation, planning, migration, control, monitoring, and implementation of highly scalable Apache Kafka event streaming platforms or Distributed Data Systems projects, as well as for comprehensive customer consulting on the current state of these technologies.

As a Senior Consultant for Big Data Management and Stream Processing, your goal is to lead the design and implementation of architectures for streaming platforms and stream processing use cases using open source and cloud tools.

Qualifications

  • Completed studies or comparable training with a technical background
  • Sound experience and knowledge in Java
  • Solid experience with Apache Kafka or similar large-scale enterprise distributed data systems built on various distributed technologies (e.g. Apache Kafka, Spark, CockroachDB, HDFS, Hive)
  • Experience in software development and automation to run big data systems
  • Experience with developing and implementing complex solutions for Big Data and Data Analytics applications
  • Experience in system deployment and container technology: building, managing, deploying, and release-managing containers and container images based on Docker, OpenShift, and/or Kubernetes
  • Experience in developing resilient, scalable distributed systems and microservices architectures
  • Experience with stream processing frameworks (e.g. Kafka Streams, Spark Streaming, Flink, Storm)
  • Experience with Continuous Integration / Continuous Delivery (CI/CD) using Jenkins, Maven, Automake, Make, Grunt, Rake, Ant, Git, Subversion, Artifactory, and Nexus
  • Understanding of SDLC processes (Agile, DevOps) and of Cloud Operations and Support (ITIL) Service Delivery
  • Knowledge of authentication mechanisms with OAuth; knowledge of Vert.x and Spring Boot
  • Experience in Azure SQL and AWS development
  • Experience with DevOps transformation and cloud migration to AWS, Azure, Google Cloud Platform, and/or Hybrid/Private Cloud, as well as with cloud-native end-to-end solutions, especially their key building blocks, workload types, migration patterns, and tools
  • Experience with monitoring tools and logging systems such as New Relic, ELK, Splunk, Prometheus, and Graylog
  • Ability to communicate technical ideas in business-friendly language
  • Interest in modern organizational structures and an agile working environment (Scrum)
  • Customer orientation and enjoyment of working in an international environment in German and English