Programming the onchain gigafactory

How we turned our business processes into code we can measure and improve

Token Terminal

Raw blockchain data goes in. Standardized financial & usage metrics come out. We call this pipeline an onchain gigafactory, which runs a data manufacturing process across 100+ blockchains, 1,200+ protocols, and 3,000+ tokenized assets. This gigafactory turns petabytes of raw onchain data into the standardized and comparable metrics that Token Terminal serves.

Every step in the manufacturing process requires specific domain knowledge. Which objects of a raw RPC response are flattened in the data warehouse. Which field contains the data for decoding applications on a specific chain. Which smart contracts are associated with a protocol's fee generation.

We encode that knowledge into structured context that agents execute. The specifications are called skills.

From individual skills to a workflow

We started encoding each step of the gigafactory as a skill. A skill is a specification file that lives in the codebase. It tells an agent how to execute one step of the manufacturing process: the context it needs, the tools it uses, and how to verify the result.

For example, the decode-contract skill takes a smart contract address and produces SQL models that make the contract's events queryable. It handles proxy detection, ABI fetching, code generation, and cost estimation.

.claude/skills/decode-contract/SKILL.md

decode-contract — decodes onchain events end-to-end, from contract address to generated SQL decoder models. Handles EVM (proxy detection, ABI fetching) and non-EVM (Algorand, Aptos, Cosmos).

Example: /decode-contract uniswap arbitrum 0x1F98...F984

Stages: Discovery → Generation → Validation → Cost estimation

Part of the decode-contract skill's SKILL.md. One of ~100 skills that define our manufacturing process.

An individual skill automates one step of the process. The natural next move is combining them. For example, normalizing raw data of a new blockchain into queryable tables takes four skills:

  • research-chain researches the chain's architecture
  • scrape-chain sets up the data pipeline
  • validate-chain validates the raw data against the chain's RPC
  • deploy-scraper deploys the scraper to production
Normalize raw blockchain data into queryable tables:
research-chain → scrape-chain → validate-chain → deploy-scraper

A single workflow from the blockchain integration layer. Each green pill is a skill that an agent executes end-to-end.
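The composition above can be sketched as an ordered pipeline of skills sharing one context. The runner, the `Skill` type, and the lambda stand-ins are hypothetical; the skill names are the real ones from the workflow.

```python
# Minimal sketch of a workflow as an ordered composition of skills.
# Skill names come from the post; the runner itself is illustrative.
from typing import Callable

Skill = Callable[[dict], dict]

def run_workflow(steps: list[tuple[str, Skill]], context: dict) -> dict:
    """Execute each skill in order, threading shared context through."""
    for name, skill in steps:
        context = skill(context)  # an agent executes the step end-to-end
        context.setdefault("completed", []).append(name)
    return context

# The blockchain-normalization workflow, with toy stand-ins for each skill:
integrate_chain = [
    ("research-chain", lambda ctx: {**ctx, "architecture": "researched"}),
    ("scrape-chain",   lambda ctx: {**ctx, "pipeline": "configured"}),
    ("validate-chain", lambda ctx: {**ctx, "validated": True}),
    ("deploy-scraper", lambda ctx: {**ctx, "deployed": True}),
]
```

Because each step is a self-contained unit with the same shape, reordering or swapping skills is a data change, not a reorganization of people.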

Prior to agents, a single workflow was managed by multiple people, each owning a step based on their experience. With composable skills, we can reorganize and optimize workflows around the desired output, rather than around who historically owned each step.

Within weeks we had created nearly 100 skills across dozens of workflows. The natural next step was to formalize how they contribute to the full manufacturing process.

Context first, then build

Instead of writing skills bottom-up, we converted our business map into structured context: a directed graph of agent workflows. This graph covers every step from scraping raw blockchain data to producing the final standardized metric. We then built skills until every step had an implementation.

The graph organizes workflows into layers, each feeding the next. Three layers make up the core of the data pipeline:

  • Layer 1: Blockchain integration makes a blockchain's raw data queryable, decoded, and metric-ready. From normalizing blocks and transactions to building the cross-chain abstractions that everything downstream depends on.

  • Layer 2: Metric definition specifies what we measure and how. Revenue for a lending protocol is calculated differently than revenue for a DEX. This layer produces the methodology that all integrations follow.

  • Layer 3: App integration interprets a protocol's smart contracts and business logic. It maps contract events to standardized financial and usage metrics by using the tooling and methodology that the layers above provide.
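The layer dependencies described above can be sketched as a small adjacency map, with each layer listing the layers it consumes. The representation and the `upstream_of` helper are illustrative assumptions, not the actual graph implementation.

```python
# Sketch of the workflow graph's layer structure as an adjacency map.
# Each layer lists its upstream dependencies, per the description above:
# blockchain integration feeds metric definition, and app integration
# builds on both.

LAYERS: dict[str, list[str]] = {
    "blockchain-integration": [],
    "metric-definition": ["blockchain-integration"],
    "app-integration": ["metric-definition", "blockchain-integration"],
}

def upstream_of(layer: str) -> set[str]:
    """All layers a given layer transitively depends on."""
    seen: set[str] = set()
    stack = list(LAYERS[layer])
    while stack:
        dep = stack.pop()
        if dep not in seen:
            seen.add(dep)
            stack.extend(LAYERS[dep])
    return seen
```

Walking the graph upstream is what makes gap analysis mechanical: a missing workflow in one layer immediately identifies every downstream layer it blocks.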

Blockchain integration — 6 / 9 covered
  • Normalize raw blockchain data into queryable tables
  • Build data freshness alerting
  • Decode smart contract events into readable tables
  • Build cross-chain abstractions for token transfers and flows
  • Read onchain contract state for balances and supplies
  • Calculate financial metrics from decoded data
  • Validate metrics against external benchmarks
  • Run end-to-end integration tests automatically
  • Confirm data accuracy with the project team

Metric definition — 5 / 7 covered
  • Research the traditional finance equivalent
  • Map the concept to onchain mechanics
  • Define the target metric specification
  • Validate against top projects in each sector
  • Check if the metric follows expected distributions
  • Connect the definition to concrete onchain events
  • Generate methodology documentation automatically

App integration — 8 / 11 covered
  • Define which metrics to build for this protocol
  • Write the methodology for each metric
  • Research the protocol's business model and fee structure
  • Identify which smart contracts implement the business logic
  • Map contract events to financial metrics
  • Determine how onchain data is stored and accessed
  • Write SQL models that extract data from decoded events
  • Aggregate extracted data into standardized metrics
  • Compare results against the target specification
  • Validate against external benchmarks
  • Review with the protocol team for accuracy

Legend: Covered · Partial · Missing

The first three core layers of the gigafactory. Each row is a workflow, each dot is a coverage grade. The full map includes additional layers for serving, research, and distribution, which are not shown here.

Every workflow gets a coverage grade. Full coverage means an agent has the skills, data access, and tooling to execute it end-to-end. Partial means the agent runs the workflow but requires a human for a specific step. Missing means no skill exists yet.

The graph makes implicit knowledge explicit. App integration has the highest coverage: eight of eleven workflows running on skills. But it depends on metric definition to specify what each metric means. Gaps in one layer cascade downstream.
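The coverage grades described above map naturally onto a small enum plus a per-layer summary, which is how a "6 / 9" count can be derived. The `Workflow` type and `summarize` helper are illustrative, not the production schema.

```python
# Sketch of coverage grading as described above. The three grades are
# from the post; the Workflow type and summarize helper are illustrative.
from dataclasses import dataclass
from enum import Enum

class Coverage(Enum):
    COVERED = "covered"   # agent executes the workflow end-to-end
    PARTIAL = "partial"   # agent runs it, but one step needs a human
    MISSING = "missing"   # no skill exists yet

@dataclass
class Workflow:
    name: str
    grade: Coverage

def summarize(layer: list[Workflow]) -> str:
    """Render a layer's coverage as 'covered / total', e.g. '8 / 11'."""
    covered = sum(1 for w in layer if w.grade is Coverage.COVERED)
    return f"{covered} / {len(layer)}"
```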

Measure, then improve

Skill coverage regenerates on every pull request. Every integration is also a test run. We track how long each step takes and whether the agent completes the workflow in a single pass without operator intervention. That success rate tells us whether a step is truly covered or just nominally automated.
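The single-pass success rate described above reduces to a simple computation over run logs: the share of runs with zero operator interventions. The run-record fields here are hypothetical stand-ins for whatever the telemetry actually captures.

```python
# Sketch of the single-pass success-rate metric. The Run fields
# (duration, intervention count) are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Run:
    workflow: str
    duration_s: float
    interventions: int  # operator touches during the run

def single_pass_rate(runs: list[Run]) -> float:
    """Share of runs the agent completed without operator intervention."""
    if not runs:
        return 0.0
    clean = sum(1 for r in runs if r.interventions == 0)
    return clean / len(runs)
```

A workflow with full skill coverage but a low single-pass rate is the "nominally automated" case: the skill exists, but the process still leans on a human.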

The idea isn't new. Toyota calls it kaizen.

Kaizen (改善) is a core principle of the Toyota Production System. Every step in a production line is documented, measured, and continuously improved. The system gets better with every cycle because the process itself is designed to be rewritten.

Coverage data and success rates feed into a priority list: what to build next, ranked by how much downstream work it unblocks and how often it currently requires human effort.
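A ranking like this can be sketched as a score over two inputs: downstream work unblocked and current human effort. Both the scoring formula and the candidate numbers below are invented for illustration; only the skill names come from the priority list.

```python
# Sketch of priority ranking: skills sorted by downstream work unblocked,
# weighted by the human effort they currently consume. The formula and
# the example figures are illustrative assumptions, not actual data.

def priority_score(downstream_unblocked: int, human_hours_per_week: float) -> float:
    """Higher score = build sooner."""
    return downstream_unblocked * human_hours_per_week

# (workflows unblocked downstream, current human hours per week) — invented
candidates = {
    "validate-integration": (5, 8.0),
    "read-state": (4, 6.0),
    "build-sheets": (1, 1.0),
}

ranked = sorted(
    candidates,
    key=lambda s: priority_score(*candidates[s]),
    reverse=True,
)
```

Because the inputs regenerate with every pull request and every integration run, the ranking stays current without anyone curating it.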

Priority — what to build

P0 — build now: validate-integration · read-state
P1 — build soon: validate-partner-data · develop-api · develop-mcp
P2 — backlog: build-sheets · serve-queries · automate-hr · automate-finance
Extensions: integrate-chain + read-state · integrate-app + validate-integration · fill-gaps + add-ranges

Skills ranked by downstream impact and current human effort. The list updates continuously as new skills ship and integration data accumulates.

What this unlocked

Within months of mapping the gigafactory, integrating a new blockchain end-to-end on Token Terminal takes a single agent session, down from the one to two weeks it used to take.

The speed is a byproduct. The structural wins matter more:

  • A living map of our manufacturing process. The entire business process is code we can measure and improve.
  • A priority system that tells us which skills to build next, based on what unblocks the most downstream work.
  • A review process that surfaces which existing skills need improvement, using success rate and operator interventions from every integration.

The role of our data team has shifted from executing repetitive tasks to two core responsibilities: programming business processes and assuring their quality. The time freed up goes to conversations with customers and protocol teams, deciding what to build next instead of operating what already exists.

Some workflows are already fully automated. Coverage metrics, success rates, and operator intervention logs show which workflows are ready to run autonomously, in what order to deploy them, and whether quality holds.

We are not guessing which agents to deploy. The gigafactory tells us.

The authors of this content, or members, affiliates, or stakeholders of Token Terminal may be participating or are invested in protocols or tokens mentioned herein. The foregoing statement acts as a disclosure of potential conflicts of interest and is not a recommendation to purchase or invest in any token or participate in any protocol. Token Terminal does not recommend any particular course of action in relation to any token or protocol. The content herein is meant purely for educational and informational purposes only, and should not be relied upon as financial, investment, legal, tax or any other professional or other advice. None of the content and information herein is presented to induce or to attempt to induce any reader or other person to buy, sell or hold any token or participate in any protocol or enter into, or offer to enter into, any agreement for or with a view to buying or selling any token or participating in any protocol. Statements made herein (including statements of opinion, if any) are wholly generic and not tailored to take into account the personal needs and unique circumstances of any reader or any other person. Readers are strongly urged to exercise caution and have regard to their own personal needs and circumstances before making any decision to buy or sell any token or participate in any protocol. Observations and views expressed herein may be changed by Token Terminal at any time without notice. Token Terminal accepts no liability whatsoever for any losses or liabilities arising from the use of or reliance on any of this content.
