Schwarz IT takes care of the entire digital infrastructure and all software solutions of the companies of Schwarz Group. As a result, it is responsible for the selection, provision, operation and continuing development of IT infrastructure, IT platforms and business applications. In order to provide IT solutions that optimally support the departments’ business processes, Schwarz IT addresses the departments’ requirements in consultations and works with them to develop professional and effective IT solutions.
What you'll do
Work in a cross-functional product team to design and implement data-centric features for Europe's largest Ad Network
Help scale our data stores, data pipelines and ETLs, which handle terabytes of data from one of the largest retail companies
Design and implement efficient data processing workflows
Extend our reporting platform for external customers and internal stakeholders to measure advertising performance
Continue to develop our custom data processing pipeline and continuously look for ways to improve our technology stack as we scale
Work with machine learning engineers and software engineers to build and integrate fully automated, scalable reporting, targeting and ML solutions
You will work in a fully remote setup, but you will meet your colleagues in person at company- and engineering-specific onsite events
What you'll bring along
3+ years of professional experience working on data-intensive applications
Fluency with Python and good knowledge of SQL
Experience with developing scalable data pipelines with Apache Spark
Good understanding of efficient algorithms and the know-how to analyze them
Curiosity about how databases and other data processing tools work internally
Familiarity with git
Ability to write testable and maintainable code that scales
Excellent communication skills and a team-player attitude
Great if you also have
Experience with Kubernetes
Experience with Google Cloud Platform
Experience with Snowflake, BigQuery, Databricks and Dataproc
Knowledge of columnar databases and file formats like Apache Parquet
Knowledge of "Big Data" technologies like Delta Lake
Experience with workflow management solutions like Apache Airflow
Affinity for Data Science tasks to prototype Reporting and ML solutions
Knowledge of Dataflow / Apache Beam
Our benefits
Your contact
Lara Schlimgen
Recruiter
E-mail:
recruiting@mail.schwarz