DL Research Content

Building the AI-ready web3 stack: Compute, data, and intelligence

Credit: Client. Illustration: Andrés.

Interviewees:

Clément Fermaud, Head of Marketing at Aleph Cloud

Clément leads global marketing strategy at Aleph Cloud, helping drive adoption of decentralised computing solutions. He previously worked in consulting, founded a web3 media outlet, and invests in crypto start-ups.

Nick Hansen, Team Lead of The Graph Foundation

Nick guides strategy and ecosystem priorities at The Graph Foundation, one of web3’s core infrastructure protocols. He also hosts the GRTiQ Podcast, highlighting founders and researchers shaping the next generation of the internet.

Michael Heinrich, Co-founder and CEO of 0G Labs

Michael is the co-founder and CEO of 0G Labs, a blockchain network focused on decentralised AI infrastructure. A former Bridgewater strategist and serial entrepreneur, he previously built Garten into a 650-person company and has invested in projects such as Filecoin, Uniswap, and Anthropic.

  • Artificial intelligence is no longer just about training large models in isolated silos.
  • The next frontier is agents: systems that can think, act, and transact on their own. To make that leap, they need infrastructure for real-time decisions, secure execution, and easy access to data.
  • In this Roundtable, Aleph Cloud, 0G Labs, and The Graph share how their networks are wiring together the backbone for autonomous AI with private compute, high-throughput data pipelines, and structured, queryable knowledge that agents can tap in real time.

Wiring autonomous AI

Powering autonomous agents requires collaboration between compute, data availability, and indexing layers, each playing a different role in helping agents act effectively.

“Compute, data, and indexing layers must co-evolve and interoperate to support autonomous agents,” said Aleph Cloud’s head of marketing, Clément Fermaud. “That would ensure that they can dynamically find data and compute resources without central bottlenecks, enabling them to operate securely with constant access to fresh and relevant information.”

For 0G Labs, the opportunity lies in enabling agent swarms to coordinate at scale.

“DA [data availability] acts as the scalable and reliable storage foundation, compute handles verifiable inference tasks, and indexing facilitates memory and state recall,” Michael Heinrich, co-founder and CEO of 0G Labs, explained. “Together, these layers ensure agents can instantly access trusted context, perform computations locally or across GPU networks, and share results smoothly.”

The Graph, meanwhile, focuses on enabling unified workflows for agents.

“Agents need to read data, process it, and write results back seamlessly,” said Nick Hansen, team lead at The Graph Foundation. “Our role is to let agents query indexed data, act on it, and contribute insights into datasets that remain consistent across applications.”

The trust layer

As AI-native systems evolve, security and performance become equally critical for scaling agent ecosystems.

“Confidential computing is needed because protecting data, models, and logic is fundamental,” said Fermaud. “AI agents handle highly sensitive inputs, so in a decentralised network of untrusted nodes, TEEs [trusted execution environments] are essential to keep those secrets protected during processing. Hardware enclaves isolate code and data, ensuring private inputs or proprietary logic are never exposed to the node operator.”

0G Labs has tackled the performance bottleneck.

“AI swarms generate and exchange huge traces of models, inputs, and outputs,” Heinrich explained. “Traditional blockchains can’t handle GB/s-scale streaming. Our architecture separates publishing and storage lanes, uses GPU-accelerated erasure coding, and supports multi-consensus shards for infinite horizontal scalability, letting agents fetch and confirm data with near real-time latency.”
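The erasure coding Heinrich mentions can be illustrated with a toy example. The sketch below splits a blob into data shards plus a single XOR parity shard, so any one lost shard can be rebuilt from the survivors. The function names are illustrative; production data-availability layers use far stronger Reed-Solomon-style codes (GPU-accelerated, as 0G describes) that tolerate many simultaneous losses.

```python
from functools import reduce

def encode(blob: bytes, k: int) -> list[bytes]:
    """Split blob into k equal-size data shards plus one XOR parity shard."""
    size = -(-len(blob) // k)               # ceiling division for shard size
    blob = blob.ljust(size * k, b"\0")      # pad so all shards are full
    shards = [blob[i * size:(i + 1) * size] for i in range(k)]
    # Parity byte i is the XOR of byte i across all data shards.
    parity = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*shards))
    return shards + [parity]

def recover(shards: list[bytes], lost: int) -> bytes:
    """Rebuild the shard at index `lost` by XOR-ing all the other shards."""
    survivors = [s for i, s in enumerate(shards) if i != lost]
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*survivors))

# Lose one data shard and rebuild it from the remaining shards + parity.
shards = encode(b"agent inference trace", k=4)
rebuilt = recover(shards, lost=2)
```

Because parity = s0 ^ s1 ^ s2 ^ s3, XOR-ing the survivors with the parity shard cancels everything except the missing shard.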

For The Graph, trust comes from verifiable and structured data.

“AI agents need more than just raw data. They need reliable, structured, and context-rich datasets,” said Hansen. “By indexing blockchain information into deterministic Subgraphs and Substreams, we provide datasets that allow agents to make reproducible, verifiable decisions instead of relying on opaque APIs.”
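To make "querying indexed data" concrete, here is a minimal sketch of an agent issuing a GraphQL query to a subgraph over HTTP using only the Python standard library. The endpoint placeholder and the `swaps` entity are illustrative assumptions, not a specific deployed subgraph.

```python
import json
from urllib import request

# Placeholder endpoint -- real subgraphs are served through The Graph's
# gateway; the URL and the entity schema below are illustrative only.
SUBGRAPH_URL = "https://gateway.thegraph.com/api/<api-key>/subgraphs/id/<id>"

# Subgraphs expose deterministic, indexed data through GraphQL, so an
# agent can ask a structured question instead of scraping raw blocks.
QUERY = """
{
  swaps(first: 5, orderBy: timestamp, orderDirection: desc) {
    id
    amountUSD
    timestamp
  }
}
"""

def query_subgraph(url: str, query: str) -> dict:
    """POST a GraphQL query and return the decoded JSON response."""
    payload = json.dumps({"query": query}).encode()
    req = request.Request(url, data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)

# Usage: query_subgraph(SUBGRAPH_URL, QUERY)["data"]["swaps"]
```

Because the same query against the same indexed block range returns the same result, an agent's decision based on it is reproducible, which is the verifiability Hansen points to.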

Data markets, incentives, and composability

Building sustainable AI ecosystems requires aligning incentives for networks, node operators, and agents.

The Graph is expanding its infrastructure with Hypergraph, which allows agents to maintain private or shared composable data spaces.

“It’s the difference between reading from a library and writing your own encyclopedia,” Hansen explained. “Hypergraph lets agents blend public blockchain data with proprietary insights and contribute back to shared knowledge in a permissioned way.”

For 0G Labs, scaling relies on rewarding reliability. “Proof of Random Access incentivises nodes for serving data quickly and accurately,” said Heinrich. “We focus on positive reinforcement rather than punishment, rewarding bandwidth, speed, and low-latency performance.”
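The incentive Heinrich describes can be sketched as a simple storage challenge. The code below is a toy stand-in for Proof of Random Access, not 0G's actual protocol: a verifier precomputes per-chunk hashes, then challenges a node for a pseudo-randomly chosen chunk; only a node actually holding the data can answer correctly, and fast, correct answers are what get rewarded.

```python
import hashlib
import random

CHUNK = 256  # bytes per challenge (illustrative size)

def commitments(data: bytes) -> list[bytes]:
    """Verifier precomputes a hash commitment for each fixed-size chunk."""
    return [hashlib.sha256(data[i:i + CHUNK]).digest()
            for i in range(0, len(data), CHUNK)]

def challenge(n_chunks: int, seed: int) -> int:
    """Pick a pseudo-random chunk index for this challenge round."""
    return random.Random(seed).randrange(n_chunks)

def respond(data: bytes, idx: int) -> bytes:
    """An honest node returns the requested chunk from its local copy."""
    return data[idx * CHUNK:(idx + 1) * CHUNK]

def verify(coms: list[bytes], idx: int, chunk: bytes) -> bool:
    """Check the node's response against the precomputed commitment."""
    return hashlib.sha256(chunk).digest() == coms[idx]

data = bytes(range(256)) * 8            # 2 KiB of "stored" data
coms = commitments(data)
idx = challenge(len(coms), seed=42)
ok = verify(coms, idx, respond(data, idx))
```

A node that discarded the data cannot fabricate a passing response, and measuring response latency gives the network a natural signal for rewarding bandwidth and speed, in line with the positive-reinforcement framing above.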

Aleph Cloud is making privacy-preserving compute more accessible.

“High-performance confidential computing shouldn’t be an enterprise-only capability,” said Fermaud. “Our pay-as-you-go architecture and modular resource options make secure compute affordable for startups, open-source teams, and smaller AI projects.”

The road ahead

As autonomous AI ecosystems grow, interoperability and open standards will shape their scalability.

0G Labs sees the biggest scaling pressure coming from interactions between agents rather than model size.

“A million lightweight agents coordinating in swarms creates higher data throughput demands than a single large model,” Heinrich explained. “Our focus is on enabling modular scalability through multi-consensus DA and GPU acceleration.”

The Graph is working on standards to help agents share and build knowledge collaboratively. “With GRC-20, we’ve introduced schemas that give agents native identities and composable data spaces,” said Hansen. “This enables a world where agents don’t just consume data but contribute insights, annotations, and new knowledge into shared evolving graphs.”

For Aleph Cloud, the future lies in combining different approaches to trust and privacy.

“TEEs are ideal for confidentiality and speed today, but zero-knowledge proofs and fully homomorphic encryption will play bigger roles over time,” said Fermaud. “We envision hybrid systems where agents run inside enclaves for efficiency, while publishing proofs externally for verifiability.”

Foundations for the future

As autonomous agents move from concept to reality, the challenge lies in balancing scale with sustainability. Each layer of the AI stack is adapting to ensure these systems can thrive in the long term.

“At Aleph Cloud, we address sustainability on both ecological and economic fronts,” said Fermaud. “Dynamic resource allocation reduces energy waste, and our pay-as-you-go model makes confidential computing accessible far beyond large enterprises.”

For Heinrich, sustainable growth comes down to incentives. “Proof of Random Access and restaking mechanisms reward nodes for speed and reliability,” he said. “By shifting from punishment to positive reinforcement, we create the conditions for a resilient, global data availability layer.”

Hansen pointed to how demand itself will evolve. “In an AI-native world, query traffic may come from thousands of small agents rather than a handful of big apps,” he said. “That means our incentive structures must adapt so indexers can continue to prioritise the most useful, high-frequency datasets.”