The race to build AI infrastructure is no longer just a supply challenge – it’s becoming a capital markets problem. A new financing model from Forum Markets, which plans to deploy capital into short-term loans for Nvidia GPU purchases and potentially tokenize those loans on-chain, highlights how far the industry is moving beyond traditional funding structures.
The company is targeting annualized returns in the mid-teens on 60- to 120-day bridge loans tied to AI chip deployments. That mix of high yields, short durations, and alternative distribution via tokenization signals a market still struggling to align surging demand for AI compute with stable, long-term financing.
The shift builds on emerging signs of strain in AI infrastructure funding. In a previous Data Center Knowledge report, lenders and brokers described neocloud deals slowing or stalling as credit concerns mounted, particularly for newer operators without long-term customer contracts. The rise of short-term, high-yield bridge loans suggests the market is adapting to those constraints rather than resolving them.
That financing strain is unfolding alongside what HyperFrame Research CEO Steven Dickens describes as an “execution gap” in enterprise AI.
“We are currently witnessing what we call the ‘execution gap’ – the chasm between having the raw silicon to experiment and the infrastructure velocity required to reach production,” Dickens told Data Center Knowledge.
His firm’s recent analysis found only 22.8% of enterprise AI projects launched in the past 12 months have reached production and met their original return-on-investment goals.
Bridge Loans Fill a Growing Gap
AI infrastructure operators often need to secure and deploy GPUs before facilities are fully operational and generating revenue. This creates demand for short-term bridge loans that are repaid once longer-term financing is in place.
Forum, a digital asset platform that is moving into AI chip infrastructure financing, said it plans to commit up to $50 million to an initial US neocloud deal, with each loan structured around a pre-arranged takeout from a term lender once infrastructure is live.
Forum CEO McAndrew Rudisill said the structure is designed to reduce risk by ensuring repayment is effectively pre-arranged before capital is deployed.
“Forum does not deploy capital without a defined, pre-arranged exit already in place,” Rudisill told Data Center Knowledge. He noted that each transaction includes a pre-committed takeout from an institutional lender, with funds held in escrow and released automatically once the GPU infrastructure is operational.
Rudisill added that borrowers prepay interest and contribute equity to the transaction, creating additional safeguards during the short bridge period.
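The safeguards described above can be reduced to simple arithmetic: prepaid interest and a borrower equity contribution both shrink the capital a lender actually has at risk during the bridge. The sketch below is illustrative only; the 15% rate, 90-day term, and 10% equity figure are assumptions, not terms disclosed by Forum.

```python
# Hedged sketch: how upfront safeguards reduce lender exposure on a
# short bridge loan. All figures are hypothetical for illustration.

def net_exposure(loan: float, prepaid_interest: float,
                 borrower_equity: float) -> float:
    """Lender capital at risk after upfront safeguards are collected."""
    return loan - prepaid_interest - borrower_equity

# Assumed example: a $50M commitment, 90 days of interest prepaid at a
# 15% annualized rate, plus a hypothetical 10% borrower equity stake.
loan = 50_000_000
prepaid = loan * 0.15 * 90 / 365   # simple (non-compounding) interest
equity = loan * 0.10

print(f"net exposure: ${net_exposure(loan, prepaid, equity):,.0f}")
```

Even under these assumed numbers, the lender's at-risk capital only falls by roughly an eighth; the structure's real protection is the escrowed takeout, not the upfront payments.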
Bridge financing is not a new concept. Its expanding role in AI buildouts reflects a deeper issue: permanent capital is not keeping pace with the speed (or perceived risk) of GPU deployments.
Dickens said the constraint has already shifted.
“In 2024, ‘access’ was the only metric that mattered,” he said. “Today, we see a massive pivot toward infrastructure velocity.”
Enterprises are now prioritizing not just whether GPUs are available, but also how quickly they can provision, iterate, and scale AI workloads.
High Yields Reflect Rising Risk
The mid-teen returns Forum is targeting stand in sharp contrast to traditional infrastructure debt, which typically lands in the mid-single digits. That spread reflects more than short durations: lenders are also pricing in risks tied to borrower credit quality, deployment timelines, and GPU collateral value.
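As rough arithmetic, a mid-teens annualized rate on a 60- to 120-day loan translates into a fairly modest yield over the actual holding period. The sketch below uses an assumed 15% annualized rate and simple (non-compounding) interest; neither figure is a disclosed deal term.

```python
# Illustrative arithmetic only: convert a hypothetical mid-teens
# annualized rate into the simple yield earned over a short bridge term.

def bridge_period_yield(annual_rate: float, term_days: int) -> float:
    """Simple (non-compounding) yield over the loan term, ACT/365."""
    return annual_rate * term_days / 365

for days in (60, 90, 120):
    y = bridge_period_yield(0.15, days)
    print(f"{days}-day bridge at 15% annualized -> {y:.2%} over the term")
```

The per-loan yield is only a few percent; the mid-teens figure depends on redeploying capital into successive bridges throughout the year, which is itself an execution risk.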
Unlike traditional data center assets, GPUs depreciate quickly, which makes them dependent on rapid deployment into revenue-generating environments. Delays in power, construction, or customer onboarding can quickly erode expected returns.
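The depreciation point can be made concrete with a back-of-the-envelope model. The sketch below assumes straight-line depreciation at a hypothetical 30% annual rate and a loan advanced at 80% of hardware value; both parameters are illustrative assumptions, not market data.

```python
# Hedged sketch: how GPU depreciation can erode collateral coverage
# during a bridge loan. The 30% annual depreciation rate and 80%
# loan-to-value are hypothetical assumptions for illustration.

def collateral_coverage(hw_value: float, loan: float,
                        annual_depreciation: float, days: int) -> float:
    """Collateral value / loan balance after `days`, assuming
    straight-line depreciation and no principal amortization."""
    depreciated = hw_value * (1 - annual_depreciation * days / 365)
    return depreciated / loan

# A loan at 80% of hardware value, hardware losing ~30% per year:
for days in (0, 60, 120):
    cov = collateral_coverage(100.0, 80.0, 0.30, days)
    print(f"day {days}: coverage = {cov:.2f}x")
```

Under these assumptions coverage erodes only modestly over a 120-day bridge, which suggests the larger risk is a delayed takeout stretching the loan well past its intended term.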
Rudisill said the loans are structured as senior secured, with Nvidia hardware held in special purpose vehicles to isolate collateral and simplify enforcement.
“GPU assets are controlled at the deal level rather than commingled with the borrower’s broader balance sheet,” he said.
Dickens said many enterprises are still struggling with foundational gaps that complicate scaling.
“The biggest roadblocks aren’t the models themselves, but the foundations,” he said, pointing to fragmented data architectures and governance gaps that slow production deployment.
Takeout Financing Becomes the Pressure Point
The viability of these bridge loans depends on the assumption that permanent financing will be available when needed. Each transaction in Forum’s model includes a pre-committed takeout from a term lender. But across the market, that assumption is under increasing scrutiny.
As previously reported, some AI infrastructure deals – particularly those involving newer neocloud operators – have slowed or stalled as lenders take a more cautious view of credit risk and long-term utilization. If that caution persists, the bridge-to-permanent financing pipeline could become a bottleneck, leaving lenders exposed to assets that may be difficult to refinance.
Forum also plans to tokenize portions of these loans, offering exposure to AI infrastructure credit through its platform. This approach could expand the investor base for infrastructure-linked yield, but it also introduces new questions around transparency, liquidity, and risk distribution.
Rudisill said tokenization allows the company to convert its position into tradable digital instruments, giving accredited investors access to short-duration infrastructure credit with a defined maturity profile.
The 60- to 120-day structure and pre-committed takeout create a predictable cash-flow timeline, offering more certainty than longer-term credit structures. Even so, investors are effectively underwriting short-term credit tied to rapidly depreciating hardware and dependent on execution timelines and follow-on financing.
A Market Under Pressure to Mature
The emergence of high-yield bridge loans and tokenized credit structures suggests the AI infrastructure market is entering a new phase – less defined by hardware scarcity and more by execution and capital efficiency.
Dickens said that shift is already driving consolidation, and providers that fail to move up the stack into full-service platforms risk margin pressure as supply improves.
He added that inference workloads will further raise the bar, requiring low-latency networking and real-time observability as enterprises move toward more dynamic AI applications. The result is a market where both infrastructure and financing must evolve in tandem.
If long-term capital fails to keep pace, the industry may continue to rely on short-term, higher-cost funding to sustain growth – exposing deeper fragilities in how AI infrastructure is being built and financed.
The challenge is no longer just deploying GPUs, but also building a capital stack that can support them at scale.
“The era of the ‘naked GPU reseller’ is effectively over,” Dickens said.
