Google flags 12-year power grid waits as the new choke point for its AI data centers

SAN FRANCISCO, Jan 15, 2026, 08:32 PST

  • Google now identifies U.S. transmission delays as its biggest hurdle in expanding new data centers
  • A senior executive reveals that certain utilities are quoting wait times exceeding ten years for interconnection studies
  • The company is exploring “co-location” sites adjacent to power plants to reduce wait times

Google points to the U.S. transmission system as the biggest bottleneck for launching new data centers, with wait times for grid connections stretching beyond a decade in some regions. “Transmission barriers are the number one challenge we’re seeing on the grid,” said Marsden Hanna, Google’s global head of sustainability and climate policy. He noted one utility citing “12 years to study the interconnection timeline.” To dodge some of the delays, Google is looking into “co-location” — situating data centers adjacent to power plants, Hanna added. 1

The timing couldn’t be more challenging for the industry. The Energy Information Administration said this week that U.S. power consumption is set to break records in 2026 and 2027, driven by data centers supporting artificial intelligence and cryptocurrency. The agency projects total electricity demand will climb to 4,256 billion kilowatt-hours in 2026, up from a record 4,198 billion kWh in 2025. 2
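For scale, a quick back-of-envelope sketch of what those EIA figures imply year over year (the variable names are illustrative, not from any EIA dataset):

```python
# Back-of-envelope check of the EIA figures cited above:
# 4,198 billion kWh (2025 record) vs. 4,256 billion kWh projected for 2026.
demand_2025 = 4198  # billion kWh, record set in 2025
demand_2026 = 4256  # billion kWh, EIA projection for 2026

growth = demand_2026 - demand_2025        # absolute increase, billion kWh
growth_pct = 100 * growth / demand_2025   # year-over-year growth rate

print(f"Projected increase: {growth} billion kWh ({growth_pct:.1f}%)")
# → Projected increase: 58 billion kWh (1.4%)
```

A roughly 1.4% jump sounds modest, but 58 billion kWh is on the order of the annual output of several large power plants, and it lands on a grid already congested near data center hubs.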

Google and its rivals are now scrambling for power in a grid designed for gradual expansion. Building new transmission lines involves lengthy permitting and construction processes, while the grid is already congested near major data center hubs. The outcome? A contest between chip racks and pylons—and right now, the pylons are falling behind.

Hanna emphasized that the solution isn’t a single fix. He pointed to speeding up permitting for new transmission lines and to technologies that boost power capacity on existing lines, among other measures. Still, he noted Google usually favors linking to the wider grid rather than running independently.

Co-location is emerging as the go-to workaround. Simply put, it involves placing a major electricity consumer right next to a generator, allowing direct power draw and potentially avoiding some costly transmission upgrades. This approach can speed things up, but it also sparks a tricky debate: who foots the bill, and who ends up sidelined?

Regulators are pushing ahead with new rules, at least within the mid-Atlantic PJM Interconnection region. The U.S. energy regulator has instructed PJM to establish standards for large loads situated near power plants. FERC Chair Laura Swett described the order as a “monumental step” for the AI era, while PJM confirmed it is currently reviewing the decision. 3

Other major tech companies are also moving to address concerns about the rising costs of powering data centers. On Tuesday, Microsoft announced a new initiative aimed at reducing water consumption at its U.S. data centers and preventing the public from bearing the brunt of any spike in electricity prices. The company pledged to pay utility rates that fully cover its own power expenses. Brad Smith, Microsoft’s Vice Chair and President, described it as “unfair and politically unrealistic” for the industry to expect consumers to absorb additional electricity costs driven by AI. 4

Earthjustice has urged Louisiana’s utility regulators to scrutinize the financing behind Meta’s $27 billion data center project, warning that local residents and businesses might end up footing the bill if Meta pulls out early. “If Meta ends the lease after four years,” said Earthjustice attorney Susan Stevens Miller, “almost none of the costs” would have been covered by the company at that point. Meta and its financing partner Blue Owl did not immediately respond to requests for comment, Reuters reported. 5

Speed is the immediate challenge for Google. The data centers powering AI training and operations demand a constant, high supply of power, and any hold-ups in the grid queue can delay adding more computing capacity, a serious concern as competitors ramp up rapidly.

But the co-location approach isn’t without risks. In areas already concerned about reliability and rising power bills, it’s a politically charged issue. New regulations might make it harder for big customers to sidestep the shared grid, or change how costs are allocated. If regulators balk or utilities resist, Google could end up stuck in the same queue, just facing a different obstacle.