
Senior data engineer

  • Hybrid
    • Rotterdam, Zuid-Holland, Netherlands
    • Gent, Vlaams Gewest, Belgium
  • Engineering

Job description

Are you a senior data engineer who knows that great pipelines are built once, never twice, and run for years without anyone getting paged on a Sunday?

Crunch builds the data-driven companies of tomorrow. Data engineering is central to that vision: it is one of the prime ways information from algorithms reaches the business and turns into decisions. We are looking for the person who takes this to the next level.

Too many data platforms collapse under their own weight: three orchestrators duct-taped together, four cloud accounts no one fully understands, a Slack channel full of 3am alerts, and a single engineer who is the only person who can untangle it. We think that's a waste. At Crunch, our data engineering team builds foundations that scale, stay calm under pressure, and outlive the people who built them. That means making serious architectural calls upfront, and earning the trust of clients who depend on the data every single day.

As a Senior Data Engineer, you are the person who makes that happen. You don't just build pipelines; you set the blueprint. You know which battles are worth fighting in a design review, when to reach for Iceberg instead of a classic warehouse, and when a 30-line dbt model beats a 3000-line Spark job. Most importantly, you lift the people around you: coaching mediors, challenging architects, and walking clients from "we need a data platform" to "our data platform actually works."

In your role, you'll combine an architect's foresight, an engineer's hands-on craft, and a consultant's nerve to push back when it matters.

Job requirements

🚀 What will be part of your responsibilities?

  • Lead the design of scalable data architectures that fit a customer's existing stack and their ambitions for the next five years;

  • Design and implement modern lakehouse and warehouse setups on Databricks, Microsoft Fabric, BigQuery, or Snowflake;

  • Set the standard for ETL/ELT pipelines using dbt, Airflow, Dagster, or dlt;

  • Make the call on data modeling approaches (Kimball, Data Vault 2.0, One Big Table, …) based on what actually fits the use case, not what's fashionable;

  • Embed CI/CD, infrastructure-as-code (Terraform, Bicep), and observability into every project from day one;

  • Build streaming and real-time pipelines where the use case justifies it (Kafka, Spark Structured Streaming, CDC tools like Debezium or Fivetran);

  • Set up monitoring, alerting, lineage, and data quality frameworks that catch issues before your users do;

  • Partner with data scientists, ML engineers, and BI developers to make sure the platform serves the full data chain;

  • Coach and mentor medior data engineers, raising the bar on code quality, testing, and architectural craftsmanship;

  • Act as a trusted advisor to clients, translating big ambitions into pragmatic, scalable solutions.
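To give a flavor of what "catching issues before your users do" means in practice, here is a minimal, hypothetical sketch in Python of two data-quality guardrails (a freshness check and a null-rate check). The function and column names are invented for illustration; on a real project these checks would typically live in a framework like dbt tests or Soda rather than hand-rolled code.

```python
from datetime import datetime, timedelta, timezone


def check_freshness(last_loaded_at: datetime, max_lag: timedelta) -> bool:
    """Return True if the latest load is within the allowed lag window."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_lag


def null_rate(rows: list[dict], column: str) -> float:
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)


# Fail the pipeline run loudly instead of letting stale or broken
# data reach a dashboard (hypothetical "orders" payload).
rows = [{"order_id": 1, "amount": 9.99}, {"order_id": 2, "amount": None}]
if null_rate(rows, "amount") > 0.1:
    print("alert: amount null rate exceeds threshold")
```

The point is not the ten lines of Python but the habit: every pipeline ships with explicit, automated expectations about freshness and completeness from day one.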

🚀 Required Skills

  • A proven track record designing and shipping production-grade data platforms end to end;

  • Expert command of Python and SQL, with strong opinions on what good code looks like;

  • Deep experience with at least one major cloud (Google Cloud, Azure, or AWS) and modern data platforms like Databricks, Snowflake, BigQuery, or Microsoft Fabric;

  • Hands-on expertise with dbt and modern transformation frameworks;

  • Solid grasp of orchestration tools (Airflow, Dagster, or similar);

  • Strong understanding of data modeling techniques (Kimball, Data Vault 2.0) and the judgment to pick the right one for the job;

  • Experience with CI/CD, testing, and infrastructure-as-code (Terraform or Bicep);

  • Comfortable in Unix environments and grounded in software engineering fundamentals;

  • Experience coaching, mentoring, or leading other engineers;

  • Fluency in English.

🚀 Preferred Skills

  • Familiarity with open table formats (Apache Iceberg, Delta Lake, Hudi);

  • Experience with data observability tools (Monte Carlo, Soda, or similar);

  • Knowledge of Unity Catalog, data contracts, or data mesh principles;

  • Experience integrating LLMs and GenAI into data pipelines and platforms;

  • Familiarity with low-code solutions (Power Apps, Streamlit) for lightweight tooling;

  • Pre-sales or scoping experience in a consultancy setting;

  • Speaking Dutch is a plus.

Above all, you should have the experience to make the right architectural calls, the patience to bring clients and colleagues along with you, and the standards to refuse shipping anything you wouldn't want to maintain yourself.

🎁 Our offer

First off, you will become part of a steadily growing team of people keen on applying the latest technologies and techniques in the fields of data science, machine learning & AI. You will work in a vibrant, innovation-loving environment with the proper dash of geekiness and love for board games.

While you can work both at the office and from home, your presence may also be required on-site with clients. Our offices in Ghent and Rotterdam are easy to reach, located in a colorful neighborhood of each city center, close to a vast array of after-work shenanigans.

We provide an attractive salary package stuffed with additional benefits such as an (optional) company car & fuel card, smartphone, and other gear. We help you get around by car, train, and/or e-bike; the choice is yours.

We put a lot of time and effort into your personal growth and development, both as a technologist and as a business professional. We provide dedicated feedback and coaching sessions to help you walk the career path you envisage. Expect your own talent development roadmap and opportunities to showcase your knowledge to the broader engineering community.

On top of that, we organize team trips, planned events, and ad-hoc gatherings: anything, really, to ensure you shine within a thriving team.

🎳 Up for the challenge?

Apply now: we'll review your application and may invite you to an initial phone interview. A job-specific challenge and presentation later, you could be the next senior data engineer to represent our team!
