This is a short set of predictions about where AI adoption could go next year. I’m sharing it because most teams are still planning around capabilities, while the real bottlenecks are starting to show up elsewhere.
I think 2026 is the year AI collides with real constraints.
The story shifts from model capability to capital discipline, power costs, and inference economics. This is where vague AI ambitions start to lose funding to programs that can demonstrate operational value.
Start with capital.
Capital can fund the buildout, but one question still needs an answer: who underwrites massive, rapidly depreciating AI infrastructure if returns don’t materialize on time? Expect more AI-linked financing and more creative structures designed to keep those liabilities off balance sheets.
Now add electricity.
Power costs are rising for structural reasons, not slogans. Underinvestment in grid infrastructure, growing data center demand, and climate-related losses are all transferring costs to ratepayers. On top of that, coming demand from EVs and heat pumps will put further stress on the system. Higher prices make the most compute-intensive approaches less economical, and that pressure will shape the architecture decisions companies make in 2026.
This is where the LLM business model question becomes unavoidable.
Traditional software scales with near-zero marginal costs. LLM economics differ: more usage means more compute and more electricity, so costs scale roughly linearly with usage even though the product is priced like software. If providers raise prices to protect margins, enterprises will stop delegating every simple task to a general-purpose model. They will become more selective and pragmatic.
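To make the margin point concrete, here is a minimal sketch of the two cost structures. Every number in it is a hypothetical assumption chosen to show the shape of the problem, not real vendor pricing or usage data:

```python
# Illustrative unit-economics sketch. All figures are hypothetical
# assumptions, used only to contrast cost structures.

def gross_margin(price_per_user: float, marginal_cost_per_user: float) -> float:
    """Gross margin as a fraction of revenue for one user."""
    return (price_per_user - marginal_cost_per_user) / price_per_user

# Traditional SaaS: the marginal cost of serving one more user is near zero.
saas_margin = gross_margin(price_per_user=20.0, marginal_cost_per_user=0.50)

# LLM-backed product: every request burns compute and electricity,
# so marginal cost grows with usage instead of staying flat.
requests_per_user = 300   # hypothetical monthly usage per user
cost_per_request = 0.03   # hypothetical inference cost (compute + power)
llm_margin = gross_margin(20.0, requests_per_user * cost_per_request)

print(f"SaaS gross margin: {saas_margin:.0%}")
print(f"LLM gross margin:  {llm_margin:.0%}")
```

Under these made-up assumptions the LLM product keeps roughly half its revenue as gross margin while the SaaS product keeps nearly all of it, and the gap widens as usage per user grows. That is the linear cost issue in one picture.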
This is why I expect a shift toward simpler, purpose-built AI.
Smaller, focused models and mixture-of-experts designs will capture more of the work. They cost less, deploy faster, and fit specific workflows.
We will see more companies run open-source models in their own cloud accounts, right next to their systems of record. If you want a practical starting point, avoid tackling your biggest business challenge first. Instead, choose an important operational goal with a clear feedback loop – something that won’t hurt the company if it goes wrong. Focus on work you know is important but often gets delayed because it isn’t urgent.
This approach helps teams build real confidence with AI. Start with a single, well-defined workflow. Make sure the inputs and outputs are clear, assign a clear owner, and track changes in cost, time, quality, or risk.
The next wave won’t be led by the loudest voices, but by people who choose a workflow, use a model, and show it improves decisions, costs, or reliability.
My four predictions for 2026 are straightforward.
- First, rising electricity prices become a central 2026 election issue. The drivers include underinvestment in grid infrastructure, climate losses, long interconnect queues, and higher gas prices from LNG exports. Large tech companies building AI data centers become visible targets for voters and politicians.
- Second, physical and financial constraints mean the AI buildout misses the 2026 and, especially, 2027 supply addition projections. Projects slip or shrink because power, permits, and capital do not line up.
- Third, unit economics force a reset. Frontier model providers push pricing and limits toward what the infrastructure can sustain. At the same time, more workloads move to smaller models, mixture-of-experts setups, and open-source deployments that run inside a company’s own cloud environment, closer to its system of record.
- Fourth, a handful of CEOs in unglamorous sectors publicly question the ROI of broad AI programs and cut a slate of pilots. That forces boards and investors to rethink rich AI valuations and pushes AI within companies toward practical operational use rather than signaling.
Final Word
In the new year, we will talk about what is working, and the mistakes we’ve made, in plain terms. You’ll get the practical takeaways too: what we changed, what improved, and how you can apply the same lessons quickly.
Will