
ZK proofs seen as key to DePIN AI growth
Goldman Sachs said artificial intelligence infrastructure spending could reach $7.6 trillion under its baseline forecast, although the final cost will depend heavily on factors including how quickly AI chips become obsolete and whether older hardware can be reused for simpler inference tasks.
The report said shortages in power infrastructure, skilled labour and electrical equipment could also slow AI data centre expansion as companies race to build the computing capacity required for large-scale AI systems.
Vadim Taszycki, head of growth at StealthEX, said decentralised physical infrastructure networks can offer substantial cost savings compared with hyperscalers such as Amazon Web Services but remain disadvantaged in low-latency workloads because globally distributed GPUs cannot match the microsecond-level speeds of centralised data centres.
Taszycki said decentralised networks may still compete effectively in batch processing and AI fine-tuning tasks where speed is less critical, despite latency limitations that make them unsuitable for real-time chatbot applications requiring near-instant responses.
Leo Fan, founder of Cysic, argued that verifiability rather than raw performance will ultimately determine whether decentralised AI infrastructure gains long-term traction in enterprise and financial applications.
“The hard problem isn’t distributed compute but discovery, scheduling, and attestation. The wedge isn’t price-per-token; it’s verifiability,” Fan said.
Industry participants also pointed to growing interest in onchain credit markets as a funding mechanism for AI infrastructure, with platforms such as Maple and Centrifuge potentially enabling the smaller syndicated AI infrastructure loans that large private credit firms often overlook.