Location: Berlin, Germany
My client is a tech company building data products for over 40 countries across the world. Through outside-the-box thinking on real-world problems, their products help drive digital transformation for their customers. Data engineering plays a central role in driving that innovation, working with real-world data and problems drawn from a variety of data sets such as maps, search, sales and many public data sets.
- Implement and maintain production-level, robust data analytics scripts in Python, in particular PySpark scripts
- Build out robust data pipelines and data infrastructure solutions using Python, SQL and the AWS suite (e.g. PySpark, Athena, Glue, Lambda)
- Integrate publicly available data sources, external APIs, databases and reporting
- Work with hundreds of TBs of historical and real-time data (using Confluence)
- Minimum of a BSc in Computer Science or a quantitative discipline
- At least 2 years of industry experience using Python to build data pipelines
- Experienced with data engineering tools and DBMSs such as MongoDB, Athena/Presto, Elasticsearch, Confluence or Snowflake
- Experienced with AWS and its suite of tools such as Athena, Lambda and Glue
If you're interested in hearing more, feel free to reach out to me at firstname.lastname@example.org
Complete the form below to apply for the Data Engineer role: