LVC Series: Part 3 - Process & Market Segmentation

February 21, 2024

Disclaimer: This is not financial advice. Anything stated in this article is for informational purposes only and should not be relied upon as a basis for investment decisions. Triton may maintain positions in any of the assets or projects discussed on this website.

This is a multi-part series on Triton’s investment process for liquid crypto:

Liquid Crypto Market Segmentation

As discussed in our last post, liquid venture capital provides a more flexible investment paradigm for the emerging crypto asset class and combines venture-style returns with differentiated risk and a shorter commitment.

Our goal as liquid VC investors is to take as an input the total investable landscape of crypto assets in the space and produce as an output the best possible diversified portfolio of high quality, early-stage crypto projects and protocols. Over the next few posts, we will discuss how we approach this process.

Our process consists of three distinct phases: 

  • Research (purple)
  • Data Due Diligence (blue)
  • Investment (green)

In order to accomplish our end goal, we must first start by segmenting the market into categories with similar characteristics.

Currently there are roughly 23,000 unique crypto assets, the vast majority of which are entirely worthless. To reduce this total investable landscape, we first apply a coarse filter that truncates the list based on quantitative metrics such as market cap, FDV, token issuance, trading volume, and token distribution. We then qualitatively backfill this list with compelling early-stage tokens that might have been preemptively removed by our coarse quantitative filter.
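The coarse filter plus qualitative backfill described above can be sketched as follows. This is a minimal illustration, not Triton's actual pipeline: the column names, thresholds, and token symbols are all assumptions made for the example.

```python
import pandas as pd

# Hypothetical token universe; columns and values are illustrative only.
tokens = pd.DataFrame({
    "symbol":      ["AAA", "BBB", "CCC", "DDD"],
    "market_cap":  [5e8, 2e6, 1.2e9, 8e5],    # USD
    "fdv":         [1e9, 5e7, 3e9, 1e7],      # fully diluted valuation, USD
    "volume_24h":  [2e7, 1e4, 9e7, 5e2],      # USD traded per day
    "top10_share": [0.35, 0.92, 0.28, 0.95],  # fraction held by top 10 wallets
})

def coarse_filter(df, min_mcap=1e7, min_volume=1e5, max_top10=0.5):
    """Truncate the universe on simple quantitative thresholds."""
    mask = (
        (df["market_cap"] >= min_mcap)
        & (df["volume_24h"] >= min_volume)
        & (df["top10_share"] <= max_top10)  # crude proxy for token distribution
    )
    return df[mask]

shortlist = coarse_filter(tokens)
# Early-stage names cut by the quantitative filter can be backfilled manually:
backfill = tokens[tokens["symbol"].isin(["BBB"])]  # hypothetical override list
universe = pd.concat([shortlist, backfill]).drop_duplicates("symbol")
```

The backfill step matters because a purely quantitative cut is biased against young projects that have not yet accumulated market cap or volume.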

We then segment the token landscape into distinct verticals, around which we have an internal thesis. For example, if we are interested in exposure to DeFi DEXs, we will aggregate projects such as Uniswap, Curve, Sushiswap, Orca, Pancakeswap, and Balancer, amongst others. This delineation accomplishes a few goals:

  • It lets us look at a vertical in aggregate, and then select the most compelling projects in each vertical
  • It provides the competitive landscape against which we can track growth of specific projects, intra-vertical
  • It provides the basis of our regression analysis and data dashboarding (more on this in future posts)

A major limitation of running simple regression models for each vertical is the limited history of most projects, or of the vertical itself. Many projects are less than a year old, so running separate regressions for 19 unique verticals is too fine-grained to identify meaningful vertical-specific independent variables that impact price. As such, we define a higher-order grouping for the categories we invest in, using the following nomenclature:

  • Ecosystem Tokens - Layer 0, Layer 1, and Layer 2 assets that accrue value by selling blockspace.
  • App Tokens - Project-specific tokens that accrue value through cash flows, buy-and-burn mechanisms, usage, or governance.
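Collapsing the fine-grained verticals into these two groups is a simple lookup. The vertical names and their assignments below are assumptions for the sketch, not Triton's actual taxonomy:

```python
# Illustrative mapping from fine-grained verticals to higher-order groups.
VERTICAL_TO_GROUP = {
    "layer0": "ecosystem", "layer1": "ecosystem", "layer2": "ecosystem",
    "dex": "app", "lending": "app", "liquid_staking": "app",
}

def classify(vertical: str) -> str:
    """Collapse a vertical label into its regression group."""
    return VERTICAL_TO_GROUP.get(vertical, "unclassified")
```

With this grouping, each regression pools many more observations than any single vertical could supply on its own.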

The first (and most important) task in any regression pipeline is producing a clean and expressive dataset. From Token Terminal, we retrieved a dataset containing price and 35 other fundamental variables (e.g. active developers, earnings, treasury holdings, etc.). We then merged this dataset with a set of 9 macro variables for each day, including the employment rate, GDP growth forecast, S&P 500, and US Treasury holdings. After processing this data, we had a rich set of over 150,000 rows of financial data to mine.
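Joining the per-token fundamentals to daily macro series reduces to a left join on the date column. A minimal sketch, assuming hypothetical column names and fabricated values (the real dataset has 35 fundamental and 9 macro variables):

```python
import pandas as pd

# Hypothetical daily token fundamentals (Token Terminal-style).
tokens = pd.DataFrame({
    "date":  pd.to_datetime(["2024-01-02", "2024-01-02", "2024-01-03"]),
    "token": ["ETH", "BTC", "ETH"],
    "price": [2350.0, 44800.0, 2290.0],
    "active_developers": [220, 95, 218],
})

# Hypothetical daily macro series.
macro = pd.DataFrame({
    "date": pd.to_datetime(["2024-01-02", "2024-01-03"]),
    "sp500": [4742.8, 4704.8],
    "employment_rate": [0.962, 0.962],
})

# Left-join macro variables onto every token-day row, so each observation
# carries both token-specific and economy-wide features.
panel = tokens.merge(macro, on="date", how="left")
```

A left join keeps every token-day row even when a macro series is missing for that date; those gaps then surface as NaNs to handle during processing.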

As shown in the data availability plot below for ecosystem tokens, this dataset extends as far back as 2013 with the inclusion of Bitcoin prices. More recent ecosystem tokens such as Sui and Arbitrum are present as well, with much less data. Bitcoin and Ethereum have the largest representation in this dataset, but other ecosystem tokens such as Fantom, Loopring, and Algorand help add diversity and robustness to our models.


As shown below, the daily returns we find in the dataset vary wildly, with a mean of 0.35% and a standard deviation of 6.8%. For comparison, the S&P 500's day-over-day returns have a mean of 0.0327% and a standard deviation of 0.975%. This demonstrates the extreme volatility present in the emerging crypto asset class.
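The return statistics quoted above come from the full dataset; the mechanics of computing them from a price series are simple. A sketch on a toy series:

```python
import pandas as pd

# Toy price series; values are fabricated for illustration.
prices = pd.Series([100.0, 104.0, 98.0, 105.0, 112.0])

# Simple day-over-day returns: (p_t - p_{t-1}) / p_{t-1}.
returns = prices.pct_change().dropna()

mean, std = returns.mean(), returns.std()
```

The same two summary statistics, computed per asset class, are what make the volatility comparison against the S&P 500 possible.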

Within the ecosystem tokens, we performed a correlation analysis of our daily variables, which reveals many interesting patterns. As one would expect, the number of active developers correlates strongly with the number of code commits and contract deployers.

Additionally, as one would expect, the US employment rate correlates positively with the total crypto market cap as well as with other macro variables such as household assets, implying that economic conditions that engender excess disposable income will likely result in additional allocation toward assets like crypto. Given that crypto represents the long tail of risk-on assets, this confirms intuitive reasoning.

Some counterintuitive trends are present, but upon closer inspection we can begin to understand them. For example, the large streak of negative correlations for earnings stems from Token Terminal's definition of earnings, which counts token issuance as a negative; thus, ecosystem tokens often show negative earnings because of their high token issuance.

We also find that the S&P 500 correlates positively with token prices and the total crypto market cap, reflecting general macro risk-on vs. risk-off sentiment.
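The pairwise analysis described above amounts to computing a correlation matrix over the daily variables. A minimal sketch with fabricated values, chosen only to illustrate the mechanics rather than reproduce the reported results:

```python
import pandas as pd

# Toy panel of daily variables; values are fabricated for illustration.
df = pd.DataFrame({
    "active_developers": [100, 110, 120, 115, 130],
    "code_commits":      [400, 450, 500, 470, 560],
    "token_price":       [10.0, 10.5, 11.2, 10.9, 11.8],
    "sp500":             [4700, 4710, 4730, 4725, 4760],
})

# Pairwise Pearson correlation matrix across all variables.
corr = df.corr(method="pearson")
```

Scanning this matrix for unexpectedly strong (or strongly negative) entries is what surfaces both the intuitive relationships and the counterintuitive ones, like the negative earnings correlations discussed above.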

The next post in this series will turn to the feature importance differences between ecosystem tokens and appchains, which serve as the basis of our data dashboarding and portfolio tracking process.
