Onchain Analytics Engineer

Build and own the dbt models that turn raw onchain data into standardized financial and usage metrics.

Remote (EU timezone) · Full-time

Token Terminal's mission is to become the ground truth for onchain data. Founded in 2020, we transform raw blockchain data into standardized, comparable financial and usage metrics across 100+ blockchains, 1,200+ protocols, and 3,000+ tokenized assets. Our data pipeline manages 2+ PB of onchain data, processes approximately 200 TB, and runs ~30,000 dbt models every day.

We are a distributed team of 20 working in EU timezones.

What you'll do

  • Analyze DeFi protocols end-to-end: read smart contracts, map fee structures and token flows, and write the SQL that turns decoded onchain events into standardized financial metrics
  • Build and maintain dbt models across our four-layer data pipeline, from decoded contract data through to production metrics
  • Develop chain-level data infrastructure: normalized transaction tables, price feeds, and token transfer abstractions across 100+ blockchains
  • Investigate and resolve data issues by tracing metrics back through the pipeline to the individual block and transaction that caused them
  • Write and maintain methodology documentation. Our methodology is public, auditable, and part of the product.
  • Support commercial and product teams with data analysis, custom datasets, and integration QA
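To make the first bullet concrete: a dbt-style model that rolls decoded swap events up into a standardized daily fee metric might look like the sketch below. The table and column names (`decoded_swap_events`, `fee_amount_usd`, `tx_from`) are illustrative assumptions, not Token Terminal's actual schema.

```sql
-- Hypothetical dbt model: daily fees and active users for a swap protocol.
-- Source table and columns are illustrative, not the real pipeline schema.
select
    date(block_timestamp)   as date,
    protocol,
    sum(fee_amount_usd)     as fees_usd,            -- total fees paid by traders
    count(distinct tx_from) as daily_active_users   -- unique addresses initiating swaps
from {{ ref('decoded_swap_events') }}
group by 1, 2
```

In practice, getting `fee_amount_usd` right is the hard part: it requires reading the protocol's contracts to map fee structures and joining against reliable price feeds.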

What we're looking for

  • 2+ years working with SQL in a production analytics or data engineering environment
  • Understanding of how DeFi protocols work and generate revenue. You know the difference between a swap fee, a borrow rate, and a validator reward.
  • Ability to read smart contracts well enough to understand what events mean and where value flows
  • Clear written communication. You will explain methodology to users who manage institutional portfolios.
  • Curiosity about protocols. You read docs and contracts before being told what to look at.
  • Comfortable with ambiguity in data. Onchain data is messy. There is no clean schema handed to you.
  • Track record of owning data quality problems end-to-end, not just writing queries
  • Self-directed in a small, remote team without heavy process overhead

Data stack

  • Node.js and Go for data ingestion
  • dbt and BigQuery for transformations, orchestrated with Argo and in-house tooling
  • ClickHouse for serving
  • Hosted on Google Cloud with k8s
  • GitHub, Slack, Notion

You don't need experience with everything listed. You should be excited to work directly with onchain data at scale and care about getting the numbers right.

How we hire

  1. Screening call. We tell you more about Token Terminal. You tell us more about your background.
  2. Take-home assignment. Research a real protocol's onchain mechanics, analyze the data, put together a short presentation.
  3. Walkthrough call. You walk us through your findings.
  4. Live SQL test. Coding exercise to see how you work with data in practice.

We want to understand how you think about onchain data, how you research, how you communicate, and how you work with data in practice.

What we offer

  • Competitive salary and equity
  • Employee-friendly equity terms with extended exercise windows
  • 20 days paid vacation + local public holidays
  • Remote-first, no required commute

How to apply

Send your CV and a short note to jobs@tokenterminal.xyz.

In the note, describe the hardest data problem you've worked on: what the problem was, how you solved it, and what the impact was. Keep it as lean as possible.
