Data Engineer (m/f/d) Customer 360 & CRM

Berlin, BE, DE, Germany

Job Description

As a Data Engineer for Customer 360 & CRM, you will build and maintain scalable data pipelines - real-time, near-time, and batch - connecting internal systems with our SaaS Customer Data Platform (CDP) and Customer Engagement Platform (CEP).



You’ll be responsible for ensuring accurate, reliable, and compliant data flows between systems that power personalized customer engagement and marketing automation. Your work will directly enable targeted campaigns, segmentation, analytics, and insights that drive Eventim’s data-driven growth strategy.



Working closely with Marketing, Product, Data Science, and Engineering teams, you will be the technical backbone behind our Customer 360 ecosystem - turning complex, multi-source data into structured, privacy-compliant, and actionable information.




Key Responsibilities:

  • Design, implement, and maintain real-time, near-time, and batch data pipelines connecting internal and external data sources to CDP & CEP.
  • Implement and optimize event streaming, API integrations, and data transformations across cloud environments (e.g., GCP, BigQuery, Airflow, dbt).
  • Ensure pipeline stability, observability, and performance across development, staging, and production environments.
  • Guarantee data quality, security, and compliance, including handling of consent and GDPR requirements.
  • Define, maintain, and document data contracts, interface standards, and API requirements with development teams.
  • Develop data models that support campaign activation, customer segmentation, and analytics.
  • Collaborate with CRM, Marketing, and Data Science to align technical data pipelines with business requirements.
  • Continuously optimize data architecture for scalability, reliability, and cost efficiency.

Key Requirements:

  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field (or equivalent experience).
  • 5+ years of experience as a Data Engineer or in a similar role.
  • Proven experience in building real-time and batch data pipelines (e.g., Kafka, Pub/Sub, Airflow).
  • Strong proficiency in SQL, Python, and cloud data tools (preferably GCP / BigQuery).
  • Hands-on experience with ETL/ELT frameworks (Airflow, dbt, Dataflow, etc.).
  • Deep understanding of data modeling, data governance, and API integration patterns.
  • Experience with CDP / CEP platforms (e.g., Zeotap, Tealium, Segment, Twilio, Bloomreach, Salesforce Marketing Cloud, Braze, Insider, etc.).
  • Familiarity with CI/CD pipelines, Terraform, and Git-based workflows.
  • Strong focus on data quality, observability, and monitoring (e.g., Datadog, Cloud Monitoring).
  • Understanding of GDPR, consent frameworks, and secure data management.


Job Detail

  • Job Id
    JD3747743
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Berlin, BE, DE, Germany
  • Education
    Not mentioned