Senior Data Engineer

We are seeking a Senior Data Engineer to join our Data team. In this role, you will build and maintain our blockchain data extraction pipelines, transforming raw chain data into normalized tables that power our analytics platform.

EU - Remote · Full-time

About Token Terminal

Token Terminal is an institutional-grade data infrastructure and analytics provider for the blockchain economy. We extract, standardize, and deliver financial and alternative data on blockchains, decentralized applications, and tokenized assets to institutional and retail investors.

Our mission is to be the ground truth for blockchain data. We have indexed over 120 blockchains, manage over 4PB of data, and process over 200TB daily. We've implemented rigorous QA throughout the system to ensure we deliver the highest-quality data. We've grown to serve over 50,000 monthly active users on our platform, while our API and data shares power the most widely used TradFi and crypto platforms, including Bloomberg, Google, Binance, CoinGecko, CF Benchmarks, and Nansen. Through these partnerships, our data reaches tens of millions of users globally.

We are a remote-first company with employees distributed across the EU and US (East) time zones. We are headquartered in Finland, and our Nordic values are built into our culture: flat hierarchy, real ownership, and sustainable performance. As a small team, we're generalists who obsess over a high-quality product. We operate in a high-trust environment and default to action. We're looking for experienced engineers who can pick up new challenges and work together to solve them. We use TypeScript and SQL to build scalable systems and have a strong foundation that allows us to ship and prototype features quickly. We focus on industry-standard tools and prefer boring tech to keep our systems robust and scalable.

The Token Terminal Architecture

Our engineering team is split into three focus areas: data ingestion, data processing, and analysis and serving. The Data team manages data ingestion from RPC nodes, loads and processes the data in BigQuery, and calculates the project metrics for all listed projects. The Platform team focuses on our public API, our main website (tokenterminal.com), and various integrations, including our Google Sheets, Microsoft Excel, and Bloomberg apps.

In all our processing, we strongly prefer ELT over ETL: almost all transformations on raw blockchain data are performed within our BigQuery data warehouse. This approach, coupled with the extensive use of SQL in our codebase, offers significant operational advantages. You'll encounter some elegant and innovative solutions, especially within our SQL codebase.

While all of the data processing is done in BigQuery, we synchronize the resulting datasets into ClickHouse, which serves as the main data source for our API. This lets us make the API faster and support more custom queries and aggregations.

Over time, we've iteratively developed our proprietary blockchain data extraction framework. Our current, third-generation infrastructure is built in TypeScript and runs extraction jobs within Kubernetes.
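To give a feel for the kind of building block an extraction framework like this relies on, here is a minimal sketch of a retry wrapper for flaky RPC calls, written in TypeScript. This is an illustrative assumption, not Token Terminal's actual framework; the `withRetry` name, its parameters, and the backoff schedule are hypothetical.

```typescript
// Hypothetical sketch: retry an unreliable RPC call with exponential backoff.
// Not the actual Token Terminal framework; names and defaults are illustrative.
async function withRetry<T>(
  call: () => Promise<T>, // the RPC call to attempt
  attempts = 3,           // total attempts before giving up
  baseDelayMs = 100       // backoff doubles on each failure: 100ms, 200ms, 400ms, ...
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await call();
    } catch (err) {
      lastError = err;
      // Wait before the next attempt, doubling the delay each time.
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  // All attempts failed; surface the last error to the caller.
  throw lastError;
}
```

A production framework would layer rate limiting, endpoint rotation, and metrics on top of a primitive like this.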

Day-to-day responsibilities

As a senior member of our Data team, your responsibilities will include:

Data Extraction & Ingestion

  • Research and identify the specific blockchain data we need to extract, such as blocks, transactions, traces, and receipts
  • Integrate new blockchains into our data extraction pipelines, handling the intricacies of different node types, RPC endpoints, and data formats
  • Manage and optimize our extraction infrastructure to handle high-volume blockchain data reliably

Data Normalization & Modeling

  • Transform raw blockchain data into normalized tables (blocks, transactions, traces, receipts) within our data warehouse
  • Design and implement SQL models that standardize data across different chain architectures, enabling downstream analytics
  • Build chain-specific pipelines that account for the unique characteristics of each blockchain ecosystem
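To make the normalization step concrete, here is a hedged sketch of mapping a raw EVM-style block (as returned by `eth_getBlockByNumber`, with hex-encoded quantities) into a flat, typed row ready for warehouse loading. The `RawEvmBlock` and `NormalizedBlock` shapes are illustrative assumptions, not Token Terminal's actual schema.

```typescript
// Illustrative only: a raw JSON-RPC block and a hypothetical normalized row.
interface RawEvmBlock {
  number: string;         // hex quantity, e.g. "0x112a880"
  hash: string;           // block hash
  timestamp: string;      // hex-encoded Unix seconds
  gasUsed: string;        // hex quantity
  transactions: string[]; // transaction hashes
}

interface NormalizedBlock {
  blockNumber: number;
  blockHash: string;
  blockTime: Date;
  gasUsed: bigint;
  txCount: number;
}

// Decode hex quantities into typed columns; the same idea generalizes to
// transactions, traces, and receipts.
function normalizeBlock(raw: RawEvmBlock): NormalizedBlock {
  return {
    blockNumber: Number(BigInt(raw.number)),
    blockHash: raw.hash,
    blockTime: new Date(Number(BigInt(raw.timestamp)) * 1000),
    gasUsed: BigInt(raw.gasUsed),
    txCount: raw.transactions.length,
  };
}
```

Non-EVM chains (Solana, Cosmos, etc.) encode these fields very differently, which is why each chain gets its own normalization pipeline feeding a shared downstream schema.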

Quality & Maintenance

  • Implement rigorous QA tests to ensure the highest data quality across all integrated chains
  • Track and respond to new developments in integrated blockchains, including breaking changes, hard forks, and schema updates
  • Communicate effectively with blockchain teams to resolve data issues and ensure customer satisfaction

Expectations

Within your first two months, you will work closely with our team on blockchain data extraction, gaining a deep understanding of how data is encoded across different chain architectures (EVM, Solana, Cosmos, etc.). You will learn our extraction framework and begin contributing to pipeline improvements.

Within six months, you will have led at least one full chain integration—from node setup and data extraction through to normalized tables in our data warehouse. You will be the go-to person for that chain's data infrastructure.

Within your first year, you will be an expert in multiple chain ecosystems and play a key role in architecting our extraction pipelines. You will mentor other team members and help define best practices for new chain integrations.

Details of the role

  • Full-time role
  • Location: Remote (EU timezones)
  • Engineering level: Senior Engineer or Staff

Qualifications

Must-haves:

  • Strong experience with data warehousing and large-scale data pipelines
  • Proficiency in SQL and experience with BigQuery, Snowflake, or similar analytical databases
  • Experience with Kubernetes for managing containerized workloads
  • Comfortable with TypeScript for tooling and pipeline development
  • Strong understanding of blockchain data structures (blocks, transactions, traces, logs)

Nice-to-haves:

  • Experience running blockchain nodes or working with RPC endpoints
  • Familiarity with different chain architectures (EVM, Solana, Cosmos, etc.)
  • Experience with dbt or other SQL orchestration tools
  • Background in distributed systems or data engineering at scale

The Token Terminal Stack

We focus on industry-standard tools and prefer boring tech to keep our systems robust and scalable.

Data Pipeline:

  • Ingestion: Node.js, Go. Hosted in Google Cloud on k8s
  • Data processing: BigQuery SQL managed with dbt, run via in-house tooling and Argo Workflows for daily processing

API:

  • Node.js, Postgres, ClickHouse, Redis
  • Hosted in Google Cloud on k8s

Frontend:

  • Next.js hosted on Vercel

Tools:

  • GitHub, Slack, Notion, Google Meet

Monitoring:

  • Google Cloud

Benefits

  • Competitive salary and stock options
  • Mentorship from industry leaders, opportunities to expand your skills and network, and collaboration with some of the brightest minds in the field
  • Flexible vacation policy
  • A global team working across multiple time zones

Interested in the Senior Data Engineer role?

Email our people team and we will reach out with next steps.

© 2026 Token Terminal