DeAI Compressed

A large exchange reached out regarding Delphi’s DeAI thesis and a few other prompts. If you missed our DeAI series, the below may serve as a helpful synopsis of the thesis and the core areas we are excited about.

Given crypto is essentially open source software with baked-in financial incentives – and AI is disrupting how software is written – it stands to reason AI will have a massive impact on the blockchain space across the stack.

DeAI: Challenges & Opportunities

To me, the biggest challenges facing DeAI are in the infrastructure layer, given the capital intensity of building foundational models and the returns to scale in data and compute.

[Figure: Capex]

Given the scaling laws, Big Tech has a pronounced advantage. Having amassed colossal war chests from monopoly profits aggregating consumer demand during the second generation of the internet, and having reinvested them in cloud infrastructure during a decade of artificially low rates, the hyperscalers are now attempting to capture the market for intelligence by cornering data and compute – the key ingredients of AI:

[Figure: Size Matters]

Due to the capital intensity and high-bandwidth requirements of large training runs, unified superclusters remain optimal – providing Big Tech with the most performant (and closed source) models, which they plan to rent out at oligopoly-esque margins, reinvesting the proceeds in each subsequent generation.

However, the moats in AI have proven shallower than web2 network effects, with leading frontier models depreciating rapidly relative to the field – particularly with Meta going “scorched earth” and committing tens of billions to open source frontier models like Llama 3.1 with SOTA-level performance.

[Figure: SOTA]

This, along with emerging research into latency-tolerant decentralized training methods, may (partially) commoditize frontier business models – shifting at least some of the competition from hardware superclusters (favoring Big Tech) to software innovation (marginally favoring open source / crypto) as the price of intelligence falls.

[Figure: Q vs P]

Given the compute efficiency of mixture-of-experts architectures and LLM synthesis / routing, it seems likely we are heading not for a world of 3–5 mega-models, but a tapestry of millions of models with different cost / performance tradeoffs. A network of intertwined intelligence. A hivemind.
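
For intuition on why mixture-of-experts architectures are compute-efficient, here is a toy sketch (pure NumPy, with illustrative names and shapes – not any production gating implementation): a learned gate scores all experts, but only the top-k actually run per input.

```python
import numpy as np

def topk_moe(x, gate_w, experts, k=2):
    """Route input x to the top-k of n experts (toy sketch).

    x       : (d,) input vector
    gate_w  : (n, d) gating weights
    experts : list of n callables, each (d,) -> (d,)
    """
    logits = gate_w @ x                      # score each expert
    topk = np.argsort(logits)[-k:]           # pick the k best
    weights = np.exp(logits[topk])
    weights /= weights.sum()                 # softmax over the chosen k
    # Only the selected experts execute -- the source of MoE's compute savings.
    return sum(w * experts[i](x) for w, i in zip(weights, topk))

# Illustrative usage: 8 random linear "experts", only 2 execute per call.
rng = np.random.default_rng(0)
d, n = 16, 8
experts = [lambda x, W=rng.normal(size=(d, d)): W @ x for _ in range(n)]
out = topk_moe(rng.normal(size=d), rng.normal(size=(n, d)), experts, k=2)
```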

A tapestry of millions of models becomes a massive coordination problem: the type that blockchains and crypto incentives should be well equipped to address.

Core DeAI Investment Thesis

Software is eating the world. AI is eating software. And AI is basically just data and compute.

Anything that can most efficiently source the above two inputs (infrastructure), coordinate them (middleware), and meet user demands (apps) will be valuable.

Delphi is bullish on various components across the stack:

[Figure: Stack]

Infrastructure

Given AI is fueled by data and compute, DeAI infrastructure is dedicated to sourcing both as efficiently as possible, typically using crypto incentives. As we mentioned earlier, this is the most challenging part of the stack on which to compete, but also potentially the most rewarding given the size of the end markets.

Compute

While so far held back by latency, decentralized training protocols and GPU marketplaces hope to orchestrate idle, heterogeneous hardware to provide lower-cost, on-demand compute for those priced out of Big Tech’s integrated solutions. Players like Gensyn, Prime Intellect, and Neuromesh are pushing the frontiers of distributed training, while io.net, Akash, Aethir, and others are enabling lower-cost inference closer to the edge.
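
The general pattern behind most low-communication distributed training research – sketched generically below, not as the specific protocol of any project named above – is to let each node take many local steps and synchronize only occasionally, trading some convergence quality for orders of magnitude less bandwidth.

```python
import numpy as np

def local_sgd(nodes, steps, sync_every, lr=0.1):
    """Toy local-SGD: each node trains locally, averaging parameters only
    every `sync_every` steps instead of every gradient (low bandwidth).

    nodes : list of (params, grad_fn) pairs; grad_fn(params) -> gradient
    """
    for t in range(steps):
        for i, (params, grad_fn) in enumerate(nodes):
            nodes[i] = (params - lr * grad_fn(params), grad_fn)
        if (t + 1) % sync_every == 0:
            avg = sum(p for p, _ in nodes) / len(nodes)  # the only comms step
            nodes = [(avg.copy(), g) for _, g in nodes]
    return nodes[0][0]

# Illustrative usage: 4 nodes minimizing a shared quadratic, syncing every 50 steps.
target = np.ones(8)
grad = lambda p: 2 * (p - target)
nodes = [(np.random.default_rng(i).normal(size=8), grad) for i in range(4)]
final = local_sgd(nodes, steps=500, sync_every=50)
```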

[Figure: Market Positioning]

Data

In a world of ubiquitous intelligence based on smaller, specialized models, data assets are increasingly valuable and monetizable.

[Figure: Data Economy]

To date, DePINs (decentralized physical infrastructure networks) have largely been lauded for their ability to build out lower-cost hardware networks vs. capital-intensive incumbents (e.g. telcos). However, DePIN’s largest market may well emerge in gathering novel data sets which flow into on-chain intelligences: agentic protocols (to be discussed later).

[Figure: Data DePIN]

In a world where labor – the world’s largest TAM? – is being replaced by a combination of data and compute, DeAI infrastructure provides a way for those who aren’t Tech Barons to seize the means of production and contribute to the coming networked economy.

Middleware

DeAI’s end goal is effectively composable compute. Like DeFi’s money Legos, decentralized AI makes up for a lack of absolute performance today with permissionless composability – incentivizing an open ecosystem of software and compute primitives which compound over time to (hopefully) surpass the incumbents.

If Google is the “integrated” extreme, then DeAI represents the “modular” extreme. As Clayton Christensen reminds us, integrated approaches tend to lead in newly emerging industries by reducing friction in the value chain, but as the space matures, modularized value chains take share through greater competition and cost efficiencies within each layer of the stack:

[Figure: Integrated vs. Modular Stack]

We are quite bullish on several categories essential to enabling this modular vision:

  1. Routing

In a world of fragmenting intelligence, how can one choose the right model at the right time and at the best possible price? Demand-side aggregators have always captured value (see Aggregation Theory), and the routing function is essential to navigating the Pareto curve between performance and cost in the world of networked intelligence:

[Figure: Routing]

Bittensor has been the leader here in generation 1, but a host of dedicated competitors are emerging.

Allora hosts competitions between different models in various “topics” in a way that is “context aware” and self-improving over time – informing future predictions based on historical accuracy under specific conditions. Morpheus aims to become the “demand-side router” for web3 use cases – essentially an “Apple intelligence” with an open source, local agent that has a user’s relevant context and can route queries efficiently through DeFi or the emerging building blocks of web3’s “composable compute” infrastructure. Agent interoperability protocols like Theoriq and Autonolas aim to push modular routing to the extreme by enabling composable, compounding ecosystems of flexible agents or components that combine into fully-fledged on-chain services.

In short, in a world of rapidly fragmenting intelligence, supply- and demand-side aggregators will play an extremely powerful role. If Google became a US$2T company by indexing the world’s information, then the winner in demand-side routing – whether Apple, Google, or a web3 solution – indexing agentic intelligence should be even larger.
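
To make the routing function concrete, here is a minimal sketch (hypothetical model names, prices, and quality scores – not any live protocol’s registry or API): given benchmarked options, serve each query with the cheapest model that clears its quality bar, i.e. walk the cost / performance Pareto frontier from below.

```python
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    cost_per_1k_tokens: float   # in USD (illustrative)
    quality: float              # benchmarked score in [0, 1]

def route(registry, min_quality):
    """Pick the cheapest model whose quality clears the task's bar."""
    eligible = [m for m in registry if m.quality >= min_quality]
    if not eligible:
        raise ValueError("no model meets the quality bar; relax it or pay up")
    return min(eligible, key=lambda m: m.cost_per_1k_tokens)

# Hypothetical registry: a frontier model, a mid-tier, and a cheap specialist.
registry = [
    Model("frontier-xl", 0.060, 0.95),
    Model("mid-7b",      0.004, 0.80),
    Model("edge-1b",     0.0005, 0.55),
]
print(route(registry, min_quality=0.75).name)  # -> mid-7b
```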

  2. Co-Processors

Given their decentralization, blockchains are highly constrained in both data and compute. How do you bring the compute- and data-intensive AI applications which users will come to demand on-chain?

Co-processors!

[Figure: Co-processors. Source: Florin Digital]

These are effectively “oracles” which offer different techniques for “verifying” the underlying data or model being used, in a way which minimizes new trust assumptions on-chain while offering substantial capability increases. To date, a host of projects have used zkML, opML, TeeML, and cryptoeconomic approaches – all with varying pros and cons:

[Figure: Co-processor Comparison]

For a more in-depth review, please check out our DeAI part III report.

At a high level, co-processors are essential to making smart contracts, well… smart – providing “data warehouse”-like solutions to query for more personalized on-chain experiences, or providing verification that a given inference was completed correctly.

TEE networks like Super, Phala, and Marlin in particular have been rising in popularity recently, given their practicality and readiness to host scaled applications today.

On the whole, co-processors are essential to merging highly deterministic yet low-performance blockchains with highly performant yet probabilistic intelligences. Without co-processors, AI would not be coming to this generation of blockchains.
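
One primitive all four approaches share is a commitment binding a specific model, input, and output, so a claimed inference can be referenced and checked on-chain. Below is a toy sketch of just that commitment step (illustrative function and field names); how correctness is actually proven – a zk proof, a fraud-proof window, or an enclave attestation – differs by approach.

```python
import hashlib, json

def inference_commitment(model_id, input_data, output_data):
    """Hash binding a specific (model, input, output) triple.

    Posting this digest on-chain lets any verifier later check that a
    claimed inference matches what the worker committed to. Proving the
    inference itself was computed correctly is the job of zkML, opML,
    TeeML, or cryptoeconomic schemes layered on top.
    """
    payload = json.dumps(
        {"model": model_id, "input": input_data, "output": output_data},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode()).hexdigest()

# Illustrative usage: a worker commits, a verifier recomputes and compares.
digest = inference_commitment("llama-3.1-8b", {"prompt": "gm"}, {"text": "gm ser"})
assert digest == inference_commitment("llama-3.1-8b", {"prompt": "gm"}, {"text": "gm ser"})
```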

  3. Developer Incentives

One of the biggest issues with open source development in AI has been the lack of incentives to make it sustainable. AI development is highly capital intensive, and the opportunity cost of both compute and AI knowledge work is very high. Without proper incentives to reward open source contributions, the space will inevitably lose to hyper-capitalist hyperscalers.

A slew of projects, from Sentient to Pluralis to Sahara to Mira, are aiming to kick-start networks which properly enable and reward contributions to networked intelligence from fragmented networks of individuals.

By fixing the business model, the compounding of open source should accelerate – giving developers and AI researchers an option outside of Big Tech that is global in nature and, hopefully, also well-compensated based on value created.

While very tricky to get right and increasingly competitive, the TAM here is enormous.

  4. GNN Models

Where LLMs discern patterns in large corpora of text and learn to predict the next word, graph neural networks (GNNs) process, analyze, and learn from graph-structured data. As on-chain data primarily consists of complex interactions between users and smart contracts – in other words, a graph – GNNs appear a logical choice to underpin AI use cases on-chain.
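
For intuition, here is a minimal message-passing layer (pure NumPy, toy shapes and data) – the basic operation every GNN builds on – in which each address’s features are updated from the features of the counterparties it transacts with.

```python
import numpy as np

def message_passing_layer(H, A, W):
    """One GNN layer over a transaction graph (toy sketch).

    H : (n, d) node features, e.g. one row per wallet address
    A : (n, n) adjacency matrix, A[i, j] = 1 if i transacted with j
    W : (d, d) learnable weights
    """
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    msgs = (A @ H) / deg          # mean of neighbors' features
    return np.tanh(msgs @ W)      # transform + nonlinearity

# Illustrative: 4 wallets, 8-dim features, one round of neighbor aggregation.
rng = np.random.default_rng(0)
A = np.array([[0,1,1,0],[1,0,0,1],[1,0,0,1],[0,1,1,0]], dtype=float)
H = rng.normal(size=(4, 8))
H1 = message_passing_layer(H, A, rng.normal(size=(8, 8)) * 0.1)
```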

Projects like POND and RPS are trying to build foundational models for web3 – potentially transformative in trading, DeFi, and even social use cases such as:

  • Price prediction: on-chain behavioral models to predict prices
    • Automated trading strategies
    • Sentiment analysis
  • AI finance: integrations into existing DeFi applications
    • Advanced yield strategies and liquidity utilization
    • Better risk management / governance
  • On-chain marketing
    • More tailored airdrops / targeting
    • Recommendation engines based on on-chain behavior

These models will lean quite heavily on data warehousing solutions like Space and Time, Subsquid, Covalent, and Hyperline, which I’m also quite bullish on.

GNNs could prove to be the LLMs of blockchains, and web3 data warehouses are their essential enablers – providing OLAP functionality to web3.
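
To make “OLAP functionality for web3” concrete, here is a toy sketch (hypothetical schema, made-up rows) of the kind of analytical rollup such a warehouse answers – in this case over an in-memory SQLite table of token transfers.

```python
import sqlite3

# Hypothetical schema: one row per token transfer, as a web3 data
# warehouse might expose after indexing raw chain data.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE transfers (day TEXT, sender TEXT, amount REAL)")
db.executemany(
    "INSERT INTO transfers VALUES (?, ?, ?)",
    [
        ("2024-08-01", "0xabc", 120.0),
        ("2024-08-01", "0xdef", 40.0),
        ("2024-08-02", "0xabc", 75.0),
        ("2024-08-02", "0x123", 310.0),
    ],
)

# A typical OLAP-style rollup: daily volume and unique active senders --
# the aggregate view a GNN or trading model would consume as features.
for row in db.execute(
    "SELECT day, SUM(amount), COUNT(DISTINCT sender) "
    "FROM transfers GROUP BY day ORDER BY day"
):
    print(row)
```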

Applications

In my opinion, on-chain agents may be the key unlock for crypto’s notoriously bad UX and, more importantly, the missing demand side for the billions of dollars we have poured into web3 infrastructure over the last decade – capacity which today sits pitifully underutilized.

Make no mistake, the agents are coming…

[Figure: Smarts]

And it seems logical these agents would leverage the open, permissionless infrastructure – across payments and composable compute – to accomplish ever more complex end goals.

In the coming economy of networked intelligence, perhaps economic flows are much less B -> B -> C and much more user -> agent -> compute network -> agent -> user.

Agentic protocols are the end result: applications or service businesses with limited overhead which run primarily on on-chain resources, meeting end users’ (or each other’s) demands in composable networks at much lower cost than traditional enterprises.

Just as the application layer captured the majority of the value in web2, I’m a fan of the “fat agentic protocols” thesis in DeAI. Value capture should shift up the stack over time.

[Figure: Value Accrual]

The next Google, Facebook, and BlackRock may well be agentic protocols, and the components to enable them are being born right now.

DeAI IV, coming out in October, will cover this in depth.

***

AI will change the shape of our economies. Today, the market expects that value capture to reside within the confines of a few large corporations in the Pacific Northwest of the United States. DeAI represents a different vision.

A vision of open, composable networks of intelligence, with incentives and remuneration for even small contributions, and more collective ownership / governance.

While certain narratives in DeAI get ahead of themselves, and many projects trade significantly above current traction, the size of the opportunity is large indeed. For those that are patient and discerning, DeAI’s end vision of truly composable compute may prove the very justification for blockchains themselves.
