Gemma Analytics

(Senior) Data & Analytics Engineer

Berlin, Germany (hybrid)
Employee
Engineering

We are Gemma Analytics: a Berlin-based company specializing in generating insights and building high-performance data infrastructure. Gemma was founded in early 2020 by two data enthusiasts. Since then, we have helped over 70 companies become more data-driven and successful. We have a fun, honest, and inclusive work environment, and we are always looking for data-minded people we can learn from.

Tasks

Gemma Analytics is data-driven itself and helps its clients become more data-driven.

As our (Senior) Data & Analytics Engineer, you play a critical role in helping our clients unlock business value from their data. You’re not just technically strong - you’re a Data Magician who uncovers structure in chaos and turns raw data into meaningful, actionable insights. You dig into complex datasets, spot what others overlook, and guide clients toward pragmatic, high-impact solutions.

But your impact doesn’t stop at client work. As a senior team member, you act as a sparring partner and coach to your colleagues. You’re someone others turn to for advice on technical challenges, project structure, and best practices — and you’re excited to help them grow.

You have the opportunity to work on difficult problems while helping startups and SMEs make well-informed decisions based on data.

Challenges:

  • As we are tooling-agnostic, you will work with multiple technologies and learn the ins and outs of what is currently possible in the data landscape
  • Collaborate with domain experts and client stakeholders to solve data challenges across a variety of industries
  • Support and mentor other team members through code reviews, pair programming, and knowledge sharing
  • Lead internal sparring sessions and contribute to developing team-wide best practices and scalable project structures

Technologies you’ll use

Working with multiple clients, we are in touch with many technologies, which is truly exciting. We use state-of-the-art technologies while being fully pragmatic (we do not crack a walnut with a sledgehammer). We follow an ELT philosophy and divide the tasks between Data Engineering and Analytics Engineering accordingly.

The following technologies constitute our preferred data tech stack:

Data Loading

  • For our clients, we use a scheduler (e.g. Apache Airflow or Prefect) to run Python DAGs - we also like to work with dlt as a loading framework
  • For standard connectors, we work with Fivetran or Airbyte Cloud preferably
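
To give a flavour of what this EL step looks like, here is a minimal, illustrative sketch - not our production code. A made-up API payload is landed untransformed into a raw table; in real projects, tools like dlt, Fivetran, or Airbyte handle this step, and sqlite3 merely stands in for the warehouse here:

```python
# Illustrative extract-load sketch: land raw records as-is,
# leaving all modeling to a later in-warehouse transform step (ELT).
# sqlite3 stands in for the warehouse; the payload is invented.
import json
import sqlite3

def extract() -> list[dict]:
    # In practice this would call a real API; here we fake the response.
    return [
        {"id": 1, "name": "Ada", "signup": "2024-01-05"},
        {"id": 2, "name": "Grace", "signup": "2024-02-11"},
    ]

def load(rows: list[dict], conn: sqlite3.Connection) -> int:
    # The "L" in ELT: store records untransformed, one JSON blob per row.
    conn.execute("CREATE TABLE IF NOT EXISTS raw_users (payload TEXT)")
    conn.executemany(
        "INSERT INTO raw_users (payload) VALUES (?)",
        [(json.dumps(r),) for r in rows],
    )
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM raw_users").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(extract(), conn)
print(loaded)  # → 2 raw rows landed
```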

Data Warehousing

  • For smaller data loads, we mostly use PostgreSQL databases
  • For larger datasets, we mostly work with Snowflake or BigQuery

Data Transformation

  • We have loved using dbt (data build tool) since 2018 - we can also work without it, yet, to be honest, we are fans
  • It is important to us to work version-controlled and peer-reviewed, with data testing and other engineering best practices
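
As a rough illustration of this transform-and-test workflow (sqlite3 again standing in for the warehouse, and plain Python standing in for dbt), a staging "model" is built with SQL inside the database and then checked with simple data tests, in the spirit of dbt's unique and not_null schema tests:

```python
# Illustrative in-warehouse transform plus data tests - a sketch of
# the dbt-style workflow, not dbt itself. Table and column names are
# invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (id INTEGER, amount_cents TEXT);
    INSERT INTO raw_orders VALUES (1, '1299'), (2, '450'), (2, '450');
""")

# The "model": deduplicate and cast, entirely in SQL (the T in ELT).
conn.execute("""
    CREATE TABLE stg_orders AS
    SELECT DISTINCT id, CAST(amount_cents AS INTEGER) AS amount_cents
    FROM raw_orders
""")

# Data tests, analogous to dbt's unique and not_null tests on a key.
dupes = conn.execute(
    "SELECT COUNT(*) FROM "
    "(SELECT id FROM stg_orders GROUP BY id HAVING COUNT(*) > 1)"
).fetchone()[0]
nulls = conn.execute(
    "SELECT COUNT(*) FROM stg_orders WHERE id IS NULL"
).fetchone()[0]
print(dupes, nulls)  # → 0 0 when the model passes its tests
```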

Data Visualization

  • For smaller businesses with < 100 FTE, we mostly recommend Metabase or Superset - powerful open-source reporting tools
  • For specified needs and a centralized BI, we recommend PowerBI or Tableau
  • For a decentralized, self-service BI with more than 50 users, we recommend Looker, Holistics, or ThoughtSpot
  • We are always on the lookout for new tools; at the moment we are excited about Lightdash, Omni, dlt, and others

Requirements

We believe in a good mixture of experience and upside in our team and look for both types of people equally - for this role, however, we require more expertise and a proven trajectory.

Besides that, we are looking for the following:

  • 3–4 years of hands-on experience in data engineering or analytics engineering, with a strong focus on building and maintaining robust data pipelines and analytics-ready data models
  • Proficient in SQL and experienced with relational databases, capable of translating complex business logic into clear, maintainable queries
  • Hands-on experience using dbt (preferably dbt Cloud) in production environments, following best practices for modular, testable, and documented code
  • Solid understanding of data modeling techniques (e.g., Kimball dimensional modeling, Data Vault, star/snowflake schema) and data warehousing principles
  • Experience working with modern data stack tools, such as Snowflake, BigQuery, Airflow, Airbyte/Fivetran, Git, and CI/CD workflows
  • Proficient in Python (or a similar scripting language) for use cases such as API integration, data loading, and automation
  • Strong communication skills in English (written and spoken), with the ability to explain technical decisions and collaborate with both technical and non-technical stakeholders
  • Comfortable working in client-facing projects, navigating ambiguity, and delivering high-quality results with minimal oversight
  • Experience coaching or mentoring junior team members through code reviews, sparring, and knowledge sharing
  • Bonus: Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker) to support end-to-end workflows or assist analysts
  • Bonus: Fluency in German

Benefits

We are located in Berlin, close to Nordbahnhof. We are currently 20 colleagues and will grow to 22 colleagues this year. Other perks include:

  • We are a hybrid company that meets in the office twice a week - one common office day and one flexible day
  • We allow intra-EU workcations for up to 3 months a year (workcations outside the EU are also possible where permitted)
  • We have an honest, inclusive work environment and want to nurture this environment
  • We don’t compromise on equipment - a powerful laptop, extra screens, and all the tools you need to be effective
  • We will surround you with great people who love to solve (mostly data) riddles
  • We believe in efficient working hours rather than long working hours - we focus on the output rather than the input
  • We learn and share during meetups, lunch & learn sessions and are open to further initiatives
  • We pay a competitive salary and additionally distribute at least 20% of profits to our employees
  • We are fast-growing and have technology at our core, yet we do not rely on a VC and operate profitably
  • We have a great yearly offsite event that brings us all together for a full week, enjoying good food, and having a good time (2021: Austria, 2022: Czech Republic, 2023+2024: Germany, 2025: Spain)

How you’ll get here

  1. CV Screening
  2. Initial conversation (phone, coffee, or tea)
  3. Take-home hiring test
  4. Interviews with 2-3 future colleagues
  5. Reference calls
  6. Offer + Hired

Looking forward to your application :)

Job ID: 14001467

Gemma Analytics

11-50 employees
Technology, Information and Internet

We activate data for our clients by using state-of-the-art technology. We help our clients make better decisions. We like solving complex challenges.
