You will be responsible for maintaining and evolving our data lake and data pipelines to deliver reliable, clean data. You will collaborate with various teams to deliver data and provide them with the framework and tools to gain insights from it efficiently.
Your day-to-day work includes:
- Developing and extending our data lake and pipelines
- Being a valued member of an agile team of data engineers
- Supporting multiple teams with your data engineering expertise
- Sharing your knowledge and learning from others, mainly by reviewing each other's code
The client is looking for engineers who are enthusiastic about data and about building and developing infrastructure. You are a proactive team player who collaborates with others and is passionate about writing clean code and applying software engineering best practices (testing, CI/CD).
Ideally you will have the following skills:
- Proficient in Python
- Hands-on experience with the AWS platform and exposure to AWS data services
- Experience with SQL
- Understanding of ETL pipelines, ideally with Apache Spark
- Familiarity with container technologies
Please make sure that you have a valid work permit for either Switzerland or the EU.
What we offer:
- Flexible working hours and home office options
- Personal development plan and a yearly budget for educational courses, conferences, etc.
- On-site canteen with a selection of fresh food and healthy options
- Discounts in many surrounding shops and partner vendors like HP, Allianz, Sunrise, Starbucks, Shell and more
- Lots of team activities and regular hackathons