OpenAI Chief Executive Sam Altman said artificial intelligence could eventually be sold like electricity or water, with users paying based on how much they consume, as the industry races to expand the computing power needed to meet rising demand.
Speaking at BlackRock’s infrastructure summit in Washington, Altman described a future in which AI is delivered on demand rather than sold as a fixed product, with access measured through usage much like a metered utility.
A Meter Behind Every AI Task
Altman said the long-term business model for AI providers may resemble the way utilities are sold, with customers billed according to consumption.
He noted that companies like OpenAI already operate in a similar way by charging through digital tokens, the units used to measure how much text or data an AI system processes. That token-based structure, he suggested, could become the foundation for how AI is bought and sold more broadly.
In that model, intelligence would become a service people and businesses draw on whenever needed, paying only for what they use.
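The metering model described above boils down to simple arithmetic: a bill proportional to the tokens consumed, much like a utility reading. The sketch below is illustrative only; the function name and the per-token rates are hypothetical and do not reflect any provider's actual pricing.

```python
def metered_cost(input_tokens: int, output_tokens: int,
                 input_rate: float, output_rate: float) -> float:
    """Compute a usage-based charge, like reading a utility meter.

    Rates are expressed per 1,000 tokens. All names and figures
    here are hypothetical, not any provider's actual pricing.
    """
    return (input_tokens / 1000) * input_rate \
         + (output_tokens / 1000) * output_rate

# Hypothetical rates: $0.002 per 1K input tokens, $0.006 per 1K output tokens.
bill = metered_cost(input_tokens=50_000, output_tokens=10_000,
                    input_rate=0.002, output_rate=0.006)
print(f"${bill:.2f}")  # → $0.16
```

As with electricity or water, the customer pays nothing when idle and the charge scales linearly with consumption.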
The Real Constraint Is Infrastructure
Altman said the spread of AI would depend heavily on whether companies can build enough computing capacity to keep up with demand.
That capacity includes the chips, data centers, and electricity needed to train and run large AI models. As demand for AI continues to rise, those physical constraints could shape not only pricing but also who gets access.
He warned that if OpenAI and other model providers fail to build enough compute infrastructure, they may either be unable to offer the service at scale or be forced to charge much higher prices.
That, he said, could leave advanced AI tools concentrated in the hands of those who can afford them, or create a situation where governments may have to decide how limited computing resources are allocated.
Demand Rises Faster Than Supply
Altman’s comments come as technology companies are investing heavily in expanding AI infrastructure, with demand for computing power rising across the industry.
Inside tech companies, access to GPUs and compute budgets has become increasingly valuable as engineers and researchers compete for the resources needed to train and test models.
The broader industry also faces a growing power-supply challenge. AI data centers require enormous amounts of electricity, and concerns are mounting that grid capacity, transformer shortages, and slow transmission development could become serious obstacles to future expansion.
Altman said the goal is to move beyond a world constrained by compute shortages and power limits, allowing AI to become a more broadly available service used in everyday life.