
Data Engineer

Engineering - Paris - Full-time

We believe in a world where all cars are shared. Carsharing empowers people to get going in a smarter, easier way, while also having a positive impact on the environment and making cities more liveable. It’s this vision that propels us forward and inspires us to think even bigger.

Since April 2019, Drivy has been part of Getaround. Together, we’re the world's leading carsharing platform, with a community of more than 5 million users sharing over 11,000 connected cars across 7 countries.

Our team is collaborative, positive, curious, and engaged. We think fast, work smart, laugh often, and are looking for like-minded people to join us in our mission to disrupt car ownership and make cities better.

What you’ll work on

You will join the engineering team (27 brilliant people!) as a Data Engineer in our growing Data Engineering squad (2 people). You’ll report to Michael and take on a wide range of challenges.

Our team has great impact and high leverage: it helps the entire company be more productive and make better decisions. We love to be open about the work we do. Some examples are:

Why we've chosen Snowflake ❄️ as our Data Warehouse
Embulk

We also give back to the community by contributing to different projects like Redash, Apache Airflow, Telegraf, Embulk, and many different Ruby projects.

Our main challenge this year is to make our data stack and pipelines more reliable, so we can move from a model where data is used mainly for reporting and ad-hoc analysis to one where we use it directly in our product.

What you'll be doing

Ensure high SLAs on our core tables, making sure they have excellent freshness, reliability, and documentation.
Work closely with data analysts, data scientists, and other teams to support them with various tasks and implement new ETL pipelines.
Maintain and enhance our growing core data infrastructure and ETL framework.
Build tools to improve company productivity with data.
Develop processes to monitor and ensure data integrity (see the sketch just after this list).
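
To give a concrete (and deliberately simplified) picture of that last point, here is a sketch of the kind of freshness check we mean. It assumes the snowflake-connector-python client; the table name "core.rentals" and the "loaded_at" column are placeholders for this example, not our actual schema.

# Illustrative only — a simple freshness check against a hypothetical table.
import datetime as dt

import snowflake.connector  # pip install snowflake-connector-python


def is_fresh(conn, table: str, timestamp_column: str, max_lag: dt.timedelta) -> bool:
    """Return True if the table's newest row is more recent than max_lag ago."""
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT MAX({timestamp_column}) FROM {table}")
        (last_loaded,) = cur.fetchone()
    finally:
        cur.close()
    if last_loaded is None:  # an empty table counts as stale
        return False
    return dt.datetime.utcnow() - last_loaded <= max_lag


if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="...", user="...", password="...",  # credentials omitted on purpose
        warehouse="...", database="...",
    )
    # "core.rentals" and "loaded_at" are placeholder names for illustration.
    if not is_fresh(conn, "core.rentals", "loaded_at", dt.timedelta(hours=6)):
        print("core.rentals looks stale — time to investigate")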

We’re building a marketplace that will scale to millions of users in many different countries in the coming years. You can imagine that there will also be many challenging problems that we haven’t even thought of yet!

Who you are

• You are able to write complex SQL in your sleep.
• You care about agile software processes, reliability, data quality, and responsible experimentation.
• You are pragmatically lazy. If it can be automated, it will be automated.
• You’re a team player with excellent communication skills who understands the value of collaboration.
• You are pragmatic, with good organizational skills.
• You take satisfaction in clearing roadblocks for the team.


Our Tech Stack

Apache Airflow for ETL workflows (a minimal example DAG follows this list)
Embulk with various Python, Ruby, and Shell scripts for ETL
Spark and EMR for large-scale data processing
Snowflake as our Data Warehouse
Telegraf/InfluxDB/Grafana for instrumentation and monitoring
Redash and Tableau for visualizations
An internal tracking tool (Ruby, JavaScript, Python, Redis, and AWS Mobile Analytics)
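
To give a flavor of the Airflow side of the stack, here is a minimal, illustrative DAG. It uses Airflow 2-style imports, and the DAG and task names are made up for this example — it is a sketch of the pattern, not one of our production pipelines.

# Illustrative only — a tiny two-step ETL DAG.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw data from a source system (placeholder).
    print("extracting source data")


def load(**context):
    # Load the prepared data into the warehouse (placeholder).
    print("loading into the warehouse")


with DAG(
    dag_id="example_daily_etl",  # hypothetical name, for illustration only
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # extract must finish before load starts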

Skills & Experience We Are Looking For

2+ years of experience
Experience in ETL design, implementation, and maintenance
Fluency in Python
Experience with an MPP or columnar database
Experience with cloud-based infrastructure such as AWS or GCP
Great communication skills and the ability to work well within a team
Able to communicate in English

We think you'll also appreciate this:

• Getting to learn from your peers and to share your knowledge on the blog and in our internal presentations
• A ticket to one technical conference of your choice each year
• Free non-fiction books and access to our growing library
• Headquarters in the center of Paris
• Offices in Berlin, London and Barcelona
• We often organize meetups in our office
"Hack days" a few times in the year to experiment with new technologies and ideas
• A front row seat to witness the disruption of car ownership