Remote Data Engineer (Contract)
zero44 GmbH
Time zones: SBT (UTC +11), GMT (UTC +0), CET (UTC +1), EET (UTC +2), MSK (UTC +3)
About the job
We are a Berlin-based B2B SaaS start-up building digital tools that help the shipping industry reduce its carbon footprint and reach carbon zero by 2050.
As we enter our first strong growth phase, we are looking for a Data Engineer to build the zero44 data infrastructure and support business and product growth. You are someone who can see projects through from beginning to end, coach others, and self-manage. We’re looking for an eager individual who can shape our AWS-based data stack with technical knowledge, communication skills, and real-world experience.
What you’ll do
This is an initial 3-month contract that could extend into a full-time role. Role responsibilities:
- Iterate, build, and implement our data model, data warehousing, and data integration architecture using AWS services
- Build solutions that ingest data from source and partner systems into the zero44 data infrastructure, where the data is transformed, intelligently curated and made available for consumption by downstream operational and analytical processes
- Integrate data from source systems using common ETL tools or programming languages (e.g. Ruby, Python, Scala, AWS Data Pipeline)
- Develop tailor-made strategies, concepts, and solutions for the efficient handling of our growing data volumes
- Work iteratively with our data scientist to build fact tables (e.g. container ship movements), dimension tables (e.g. weather data), ETL processes, and the data catalog
- Create high quality code that can effectively process large volumes of data at scale.
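To give a flavour of the fact-table work described above, here is a minimal Python sketch of transforming raw ship-movement records into typed fact-table rows. The schema, field names, and sample records are hypothetical, chosen only for illustration.

```python
from datetime import datetime, timezone

# Hypothetical raw records as they might arrive from a source system.
raw_movements = [
    {"imo": "9321483", "ts": "2024-03-01T06:00:00Z",
     "lat": 53.54, "lon": 9.98, "speed_kn": "12.4"},
    {"imo": "9321483", "ts": "2024-03-01T07:00:00Z",
     "lat": 53.88, "lon": 8.71, "speed_kn": "13.1"},
]

def to_fact_row(rec):
    """Transform one raw record into a typed row for a fact table."""
    return {
        "ship_imo": rec["imo"],
        "observed_at": datetime.strptime(rec["ts"], "%Y-%m-%dT%H:%M:%SZ")
                               .replace(tzinfo=timezone.utc),
        "latitude": float(rec["lat"]),
        "longitude": float(rec["lon"]),
        "speed_knots": float(rec["speed_kn"]),
    }

# A downstream warehouse load would consume these cleaned rows.
fact_ship_movements = [to_fact_row(r) for r in raw_movements]
```

In practice the same transform would run inside whatever ETL tooling the team settles on (e.g. an AWS Glue job), with the output landing in the warehouse rather than an in-memory list.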
Technical Skills & Experience
- Experience designing, building and maintaining data architecture and warehousing using AWS services
- Experience with data integration, e.g., AWS Glue
- Experience managing AWS resources using Terraform
- Experience in Data engineering and infrastructure work for analytical and machine learning processes
- Experience with ETL tooling; experience migrating ETL code from one technology to another is a plus
- Experience with data visualisation/dashboarding tools for QA/QC of data processes
- Experience with the analytics programming languages R and Python
- Nice to have: experience with Ruby, as the current data-import infrastructure is written in Ruby
- You can handle ambiguity and push projects forward even when there is no clear best path
- You see things through from start to finish
- You can manage yourself, but you also work well with others
- You have a good sense of humour and are a great team player with a strong team ethos
- You are fluent in English and an excellent communicator, able to communicate with technical and non-technical colleagues alike
How we work
- Our weekly sprint planning call is our only regular meeting
- For complex features, write a design doc before coding
- All code goes through code review; reviews are split evenly across the team
- We use Tuple for pair programming