Blockchain Data Analyst

We are seeking a Blockchain Data Analyst to join our Data team. In this role, you will analyze DeFi projects, build financial metrics, and create chain-level datasets that transform normalized blockchain data into actionable insights.

EU - Remote · Full-time

About Token Terminal

Token Terminal is an institutional-grade data infrastructure and analytics provider for the blockchain economy. We extract, standardize, and deliver financial and alternative data on blockchains, decentralized applications, and tokenized assets to institutional and retail investors.

Our mission is to be the ground truth for blockchain data. We have over 120 blockchains indexed, manage over 4PB of data, and process over 200TB daily. We've implemented rigorous QA throughout the system to ensure we deliver the highest quality data. We've grown to serve over 50,000 monthly active users on our platform, while our API and data shares power the most widely used TradFi and crypto platforms, including Bloomberg, Google, Binance, CoinGecko, CF Benchmarks, and Nansen. Through these partnerships, our data reaches tens of millions of users globally.

We are a remote-first company with employees distributed across the EU and US (East) time zones. We are headquartered in Finland, and our Nordic values are built into our culture: flat hierarchy, real ownership, and sustainable performance. As a small team, we're generalists who obsess over a high-quality product. We operate in a high-trust environment and default to action. We're looking for experienced engineers who can pick up new challenges and work together to solve them. We use TypeScript and SQL to build scalable systems and have a strong foundation that allows us to ship and prototype features quickly. We focus on industry-standard tools and prefer boring tech to keep our systems robust and scalable.

The Token Terminal Architecture

Our engineering team is split into three focus areas: data ingestion and processing, analysis, and serving. The Data team manages data ingestion from RPC nodes, loads and processes the data in BigQuery, and analyzes and calculates project metrics for all listed projects. The Platform team focuses on our public API, our main website (tokenterminal.com), and various integrations, including our Google Sheets, Microsoft Excel, and Bloomberg apps.

In all our processing, we strongly prefer ELT over ETL. Almost all transformations on raw blockchain data are performed within our BigQuery data warehouse. This approach, coupled with the extensive use of SQL in our codebase, offers significant operational advantages. You'll encounter some elegant and innovative solutions, especially within our SQL codebase.
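As an illustration of the ELT pattern, a dbt-managed model over raw data might look like the following sketch (the source, table, and column names here are hypothetical, not our actual schema):

```sql
-- Hypothetical dbt model: daily chain activity from raw transactions.
-- Raw data is loaded as-is; all transformation happens in the warehouse.
SELECT
  DATE(block_timestamp)            AS day,
  COUNT(*)                         AS transaction_count,
  COUNT(DISTINCT from_address)     AS active_addresses,
  SUM(gas_used * gas_price) / 1e18 AS total_fees_native
FROM {{ source('raw_ethereum', 'transactions') }}
GROUP BY day
```

Because the raw tables are loaded unchanged, a model like this can be rebuilt or backfilled at any time purely in SQL, with no re-ingestion.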

While all of the data processing is done in BigQuery, we synchronize the resulting datasets into ClickHouse, which serves as the main data source for our API. This keeps the API fast and enables more custom queries and aggregations.

Over time, we've iteratively developed our proprietary blockchain data extraction framework. Our current, third-generation infrastructure is built in TypeScript and runs extraction jobs within Kubernetes.

Day-to-day responsibilities

As a member of our Data team, your responsibilities will include:

DeFi Project Analysis

  • Research and analyze decentralized applications to understand their business models, revenue streams, and key performance indicators
  • Map how DeFi protocols codify their business logic in smart contracts and how relevant data is stored onchain
  • Integrate new DeFi projects into Token Terminal using our standardized metric methodology
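To make this concrete: a Uniswap-style DEX codifies its fee logic in the pool contract and emits a Swap event for every trade, so revenue can be reconstructed from decoded event logs. A minimal sketch, assuming hypothetical table names and an illustrative 0.3% fee tier:

```sql
-- Hypothetical: daily fee revenue for a DEX, reconstructed from
-- decoded Swap events; the fee rate is read from the pool contract.
SELECT
  DATE(block_timestamp)      AS day,
  SUM(amount_in_usd) * 0.003 AS fees_usd
FROM decoded_events.dex_swaps
GROUP BY day
ORDER BY day
```

The analysis step is deciding which events encode the protocol's revenue and how they map onto our standardized metric definitions.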

Chain-Level Data Development

  • Build normalized blockchain tables (e.g., for EVM chains) after raw data has been ingested into our data warehouse
  • Create price feeds, token abstractions, and other chain-level datasets that enable downstream analytics
  • Design SQL models that transform onchain data into financial and alternative metrics for projects
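A typical chain-level building block joins normalized transfer events with token metadata and a price feed, so downstream models can work in USD terms. A minimal sketch (all table names are hypothetical):

```sql
-- Hypothetical: enrich ERC-20 transfers with daily USD prices.
SELECT
  t.block_timestamp,
  t.token_address,
  t.amount / POW(10, m.decimals)               AS amount_tokens,
  t.amount / POW(10, m.decimals) * p.price_usd AS amount_usd
FROM normalized.erc20_transfers AS t
JOIN chain.token_metadata AS m
  ON m.token_address = t.token_address
JOIN chain.daily_prices AS p
  ON p.token_address = t.token_address
 AND p.day = DATE(t.block_timestamp)
```

Abstractions like this are built once per chain and then reused by every project-level metric model on top of them.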

Cross-Functional Support

  • Support our customers—including blockchains and DeFi applications—by handling metric update requests and resolving data issues
  • Act as a power user of our datasets, assisting the Commercial team with in-depth data analysis and custom requests
  • Develop new datasets and features to support product development at Token Terminal

You will work closely with our senior engineers who handle data extraction and ingestion, building on their normalized blockchain data to deliver project-level insights and metrics.

Expectations

Within your first two months, you will work closely with our team on DeFi project integrations, learning how to analyze smart contracts and map protocol business models to our metric framework. You will begin learning the Token Terminal standard and contribute to a few project integrations.

Within six months, you will independently analyze and integrate new DeFi projects. You'll build chain-level datasets (normalized tables, price feeds) for chains already in our data warehouse. You'll become a key point of contact for select customers, helping them understand and leverage our data.

Within your first year, you will be a DeFi domain expert, confidently scaling our project coverage and contributing insights that improve our metrics methodology. You'll collaborate cross-functionally with Commercial and Product teams to shape our data offerings.

Details of the role

  • Full-time role
  • Location: Remote (EU timezones)
  • Engineering level: Mid-level

Qualifications

Must-haves:

  • Strong SQL skills
  • Understanding of DeFi protocols and how they generate revenue (DEXs, lending, staking, etc.)
  • Ability to read smart contracts and understand how business logic is codified onchain
  • Strong written and verbal communication skills (essential for remote-first collaboration)

Nice-to-haves:

  • Experience with BigQuery, Snowflake, or similar data warehouses
  • Familiarity with EVM and smart contract data structures (events, logs, storage)
  • Background in financial analysis, investment research, or crypto analytics
  • Comfort with basic scripting (Python or TypeScript)

The Token Terminal Stack

We focus on industry-standard tools and prefer boring tech to keep our systems robust and scalable.

Data Pipeline:

  • Ingestion: Node.js and Go, hosted in Google Cloud on Kubernetes (k8s)
  • Data processing: BigQuery SQL managed with dbt, run via in-house tooling and Argo Workflows for daily processing

API:

  • Node.js, Postgres, ClickHouse, Redis
  • Hosted in Google Cloud with k8s

Frontend:

  • Next.js, hosted on Vercel

Tools:

  • GitHub, Slack, Notion, Google Meet

Monitoring:

  • Google Cloud

Benefits

  • Competitive salary and stock options
  • Mentorship from industry leaders, opportunities to expand your skills and network, and the chance to collaborate with some of the brightest minds in the field
  • Flexible vacation policy
  • A global team working across various time zones

Interested in the Blockchain Data Analyst role?

Email our people team and we will reach out with next steps.
