Data Engineer for Data Integration and Big Data

Your main responsibilities

  • Designs, develops, maintains and tests Data Ingestion solutions bottom-up from source (e.g. legacy / mainframe / near real-time) to target (files / interfaces / data marts) using Talend Data Fabric and NiFi
  • Implements Change Data Capture pipelines, enabling UNIQA to use real-time data feeds (e.g. Kafka)
  • Supports the team in building up large-scale data processing systems and complex big data projects
  • Builds data processing systems in Hadoop, Spark and Hive (Cloudera Data Platform)
  • Understands how to apply technologies to solve big data problems and contributes to the design of Enterprise Data Use Cases
  • Focuses on collecting, parsing, managing, analyzing and visualizing large sets of data coming from heterogeneous domains
  • Contributes to Data Governance in terms of enabling Data Lineage, Data Cataloging and Data Modeling
  • Works in a highly motivated, interdisciplinary team with business stakeholders, architects, data engineers, and data scientists from both Business and IT

Your qualifications

  • A minimum of 3 years of Java development in enterprise-grade environments as a must
  • High proficiency in cutting-edge Data Integration & ETL tools (e.g. Talend Data Fabric, NiFi), as well as Data Replication and message broker tools (e.g. Kafka), as a must
  • High proficiency with Big Data ecosystems (Hadoop / Spark / Hive / HBase / Impala / Kudu) such as Cloudera / Hortonworks as a must
  • Experience with the CI/CD, IaC and DevOps toolstack
  • Experience with setting up applications using Cloud Infrastructure Providers
  • FinTech / Insurance know-how as a plus
  • Knowledge of various programming or scripting languages such as SQL (ANSI and dialects), Python, R, Perl or Ruby as a plus
  • Excellent oral and written communication skills, in German and English (Slovak, Bulgarian, Hungarian as a plus)
  • Excellent team player, able to work in a problem-solving agile environment
  • Experience with BI & Analytics and Reporting platforms (e.g. SAS Analytics Platform / Viya, Information Builders WebFOCUS, Power BI) preferred
  • Knowledge of tools for Data Governance and Data Cataloging (e.g. Apache Atlas / Cloudera Navigator) preferred

Your core competencies

  • Perseverance: single-mindedly and persistently pursuing your assignments
  • Drive for Results: achieving your goals dependably and on time
  • Action Oriented: being full of energy and seizing opportunities
  • Organizing: using resources effectively and efficiently
  • Learning on the Fly: learning quickly and focusing on solutions

Annual minimum wage according to the collective agreement: EUR 43,946 gross. We are prepared to exceed this depending on qualifications and experience.
