As a Principal Consultant for Kafka, you will be responsible for the design, architecture, administration, and deployment of customized, advanced event streaming platforms based on Apache Kafka, built to current industry standards using the latest tools and methods.
You and your team work in close contact with the customer and are responsible for the preparation, planning, migration, control, monitoring, and implementation of highly scalable event streaming platforms and Kafka projects, as well as for comprehensive customer consulting on the current state of these technologies.
As a Principal Consultant for Big Data Management and Stream Processing with a team of Kafka and DevOps engineers and developers, your goal is to lead the design and implementation of architectures for streaming platforms and stream processing use cases using open source and cloud tools.
You will lead a team of highly skilled Kafka engineers. In addition to technical leadership and resource management, you will be responsible for the targeted personal, methodological, and technical development of the team.
Your team of developers and operators will build, install, configure, test, deploy, and monitor the latest enterprise streaming platforms and their complex DevOps solutions in a fast, high-quality, and controlled manner.
Completed studies or comparable training with a technical background
Sound experience with and knowledge of Java
Solid experience with Kafka or similar large-scale distributed data systems
Experience in software development and automation to run big data systems
Experience with developing and implementing complex solutions for Big Data and Data Analytics applications
Experience with system deployment and container technology: building, managing, deploying, and releasing Docker containers and container images on Docker, OpenShift, and/or Kubernetes
Experience developing resilient, scalable distributed systems and microservices architectures
Experience with various distributed technologies (e.g. Kafka, Spark, CockroachDB, HDFS, Hive)
Experience with stream processing frameworks (e.g. Kafka Streams, Spark Streaming, Flink, Storm)
Experience with Continuous Integration / Continuous Delivery (CI/CD) using Jenkins, Maven, Automake, Make, Grunt, Rake, Ant, Git, Subversion, Artifactory, and Nexus
Understanding of SDLC processes (Agile, DevOps) and of cloud operations and support (ITIL) service delivery
Good knowledge of and experience with authentication mechanisms such as OAuth; knowledge of Vert.x and Spring Boot
Experience with Azure SQL and AWS development
Experience with DevOps transformation and cloud migration to AWS, Azure, Google Cloud Platform, and/or hybrid/private cloud, as well as with cloud-native end-to-end solutions, especially their key building blocks, workload types, migration patterns, and tools
Experience with monitoring tools and logging systems such as New Relic, ELK, Splunk, Prometheus, and Graylog
Consulting experience: able to propose competitive, innovative technical solutions that win client approval, and to reliably implement what has been agreed
Ability to communicate technical ideas in a business-friendly language
Interest in modern organizational structures and an agile working environment (Scrum)
Experience managing experts and projects, including Statement of Work (SOW) negotiations, project initiation, client management, project execution, and implementation
Customer orientation and enjoyment of working in an international environment in German and English