But for the ratepayers left to finance the grid upgrades needed to support the power-hungry data centers, the road ahead lacks policy direction, experts warn.
The Little Hoover Commission, an independent oversight body that advises the governor and Legislature on government operations and policy, has opened a study and a series of hearings on these impacts. The first hearing was held Nov. 20.
During the proceedings, three witnesses gave testimony about the short- and long-term impacts data centers will have on Californians.
Elise Torres, managing attorney with The Utility Reform Network (TURN), highlighted PG&E projections that grid demand in its service area will grow by 85 percent in the next few years, adding an estimated 10 gigawatts on top of the 11.3 gigawatts seen in 2024.
“We're, at TURN, really concerned about the potential for rate increases, particularly because there's just such a big increase in load, and that load is going to require a lot of new infrastructure investment, and all of that investment will be capital spending that will go into the utilities' rate base and will be paid for by all customers for the entire life of the infrastructure,” Torres said. “For transmission infrastructure, that's 30 to 50 years that we'll be paying for that and also paying the utilities, if they're investor-owned utilities, a rate of return on it.”
Part of the problem, Torres noted, is the lack of a scalable rate structure for these types of operations. Currently, high-energy-consumption industrial operations in PG&E’s service area are subject to B20T commercial rates, but Torres said those rates have not scaled to meet the energy demand seen from data centers.
The utility has been pursuing approval of Rule 30 with the California Public Utilities Commission (CPUC), which, if approved, would establish new tariffs for large transmission-level customers (i.e., those that tie directly into transmission infrastructure). Torres, however, advocated for more data center-specific fee structures.
“We need a data center-specific tariff or rate schedule for these customers. So the Rule 30-type contract, that's important too, we need binding contract terms, but we also need a rate that collects a reasonable share of the costs related to all kinds of societal benefits. We also need to consider … their impact on the grid,” Torres said, noting that California ratepayers are already struggling with energy costs.
While the need for data center operators to pull their financial weight was universally accepted throughout the hearing, Stanford University’s Dr. Liang Min cautioned that other states are aggressively pursuing data center customers with fewer regulatory barriers than California.
“California is the world leader on artificial intelligence. The big question now is, can we also lead energizing and powering AI?” Min said, adding that neighboring states have fewer permitting requirements and faster grid upgrades. “When the data center leaves, [it's] very likely the AI workforce research jobs may leave as well. And this is a real risk for the state of California.”
Min offered recommendations for steps the state and operators could take to solidify California’s innovation lead:
- Streamline interconnection and permitting
- Enhance grid utilization and reliability
- Harness AI demand flexibility
- Accelerate clean, firm generation for data centers
- Advance technologies, business models and financing
Linda Gordon, supervising attorney with the University of California, Berkeley’s Investigations Lab and a climate researcher at the Human Rights Center, and AllAI founder Masheika Allgood testified to some of the societal implications data centers have for the communities around them.
Gordon said that hyperscale data centers come with several concerning implications for residents, including health and air quality concerns, the potential for water shortages, displacement and more.
“Regarding the right of access to information: Information is critical to realize Californians' freedom of expression and meaningful public participation in this data center boom. Allowing facilities to operate without even tracking or reporting their own energy and water consumption makes mitigation harder to plan and makes government oversight and public access to information less effective or even [im]possible,” Gordon said.
Allgood testified that concerning transparency practices around topics such as water and energy use need to be addressed for communities to have a better picture of the impacts posed by these operations. Some operators, she said, use non-disclosure agreements with utilities and local governments to keep this data concealed, then sue if it is released.
“You've got some of your general hyperscalers — your older, more traditional data center operators — who do provide a bit more information, but this is the playbook for the AI data center operators,” Allgood said.
Allgood also noted that any efforts to push operators to scale back energy use during peak times would be difficult, especially in situations where operators are beholden to service demands of outside companies leasing their data center space.
Natalie Frick, with Lawrence Berkeley National Laboratory, testified that there are a number of factors that drive tariff implementation in this space, including resource adequacy, affordability, the desire to attract large load customers and air pollution reduction efforts.
When it comes to reducing the risk to utilities and everyday ratepayers, Frick said several steps can be taken, including upfront operator payments and exit fee agreements, credit rating and collateral requirements, contract duration and size adjustments, and minimum load requirements and demand charges.
Chairman Pedro Nava acknowledged in his comments during the proceedings that the whole environment is subject to substantial lobbying pressure, which complicates the policymaking process.
“I get a little concerned with this rush to abundance that we've been confronted with recently, that the answer to everything is just build it and goddamn it, whatever happens, it's worth it,” Nava said. “And I think what we're learning little by little is there is reason to be somewhat cautious and responsible in a conversation about AI and the centers and how much energy and water it's going to take to build and run those operations.”
Commissioner David Beier said in his comments that policymaking in this space needs to happen within the Legislature, not the CPUC, as it extends far beyond simple rate structure conversations.
“One of the clear messages from today's testimony, for me at least, is that it is the Legislature [that] needs to deal with these issues comprehensively, not the Public Utility Commission, because this is beyond electricity rates and it has broader societal implications,” Beier said.
The second hearing in this series has been scheduled for Dec. 11.