By Jonathan Marshall
PG&E recently received California Public Utilities Commission (CPUC) permission to start the second phase of a multi-year smart grid pilot project that seeks to reduce customer energy usage and losses along electric lines, lower customer bills, support the continued adoption of rooftop solar, and reduce environmental impacts.
Those benefits are a lot sexier than the geeky name of the technology: “voltage and reactive power optimization” or “conservation voltage reduction.” Whatever you call it, “the technology is likely to become one of the most popular energy efficiency and demand response measures among North American utilities before the end of 2020,” according to a recent report by Navigant Research, a market analysis firm.
Navigant’s senior research analyst Kristoffer Torvik said the technology “can unleash unprecedented smart grid benefits,” but that North American utilities have yet to take advantage of its full potential, “which often lies latent in the smart meter functionality.”
PG&E, the leading deployer of smart meters in North America, is setting out to seize that potential—but only after a careful period of planning, testing, and retesting to make sure the technology is ready for prime time.
When deployed throughout PG&E’s service area, this technology could potentially save customers tens of millions of dollars a year in energy costs.
How it works
Here’s the basic idea, for non-engineers.
All U.S. utilities are required to maintain line voltages within a standard range (114 to 126 volts for 120-volt service, per the ANSI C84.1 standard) to ensure the proper operation of electrical devices that rely on the power grid. The challenge is that line voltages inevitably drop from the substation to the points where customers pull electricity off the distribution "feeder" line. The extent of the drop depends on the size and kind of loads placed on the line, and on the length of the line itself.
Utilities place various kinds of equipment at substations and on feeder lines to minimize the voltage loss, but in the end, a common strategy is simply to set the voltage at the head of the line toward the high end of the range so that it doesn't fall below the minimum by the end of the line.
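That head-of-line strategy can be sketched in a few lines of code. This is an illustrative model only; the function name, the expected-drop figure, and the margin are assumptions for illustration, not PG&E engineering practice.

```python
# Simplified sketch of the "set the head of the feeder high so the end of
# the line stays in band" strategy described above. All numbers here are
# illustrative assumptions, not actual utility values.

ANSI_MIN_V = 114.0  # lower service-voltage limit (120 V nominal basis)
ANSI_MAX_V = 126.0  # upper service-voltage limit

def setpoint_for_feeder(expected_drop_volts, margin_volts=1.0):
    """Pick a substation voltage so the end of the line stays above the
    minimum even after the expected drop, without exceeding the maximum."""
    setpoint = ANSI_MIN_V + expected_drop_volts + margin_volts
    if setpoint > ANSI_MAX_V:
        raise ValueError("feeder drop too large to stay within the band")
    return setpoint

# A long feeder expected to lose 8 volts between the substation and the
# last customer must start near the top of the band:
print(setpoint_for_feeder(8.0))  # 123.0
```

The `ValueError` branch captures the real-world limit of the strategy: if the drop along a feeder is too large, no single setpoint keeps every customer in band, and utilities must add regulating equipment along the line instead.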
That conservative set-and-forget approach works, but it can carry hidden costs.
In particular, if line voltages are higher than they need to be, so is the amount of energy used by equipment at the customer end. Typically, every 10 percent increase in voltage increases total energy use by six to nine percent.
That means higher customer bills, higher utility expenses (which get passed on to customers) to meet peak energy demand, and greater impact on the environment.
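The voltage/energy relationship cited above (a 10 percent voltage increase raising energy use by roughly 6 to 9 percent) is often expressed as a "CVR factor" of about 0.6 to 0.9: the percent change in energy per percent change in voltage. The following back-of-the-envelope sketch uses an assumed mid-range factor and an invented feeder load purely for illustration.

```python
# Back-of-the-envelope model of the voltage/energy relationship described
# above, using a linear CVR-factor approximation. The factor of 0.8 and the
# 100,000 MWh/year feeder load are illustrative assumptions.

def energy_change_pct(voltage_change_pct, cvr_factor=0.8):
    """Approximate percent change in energy use for a given percent
    change in voltage."""
    return cvr_factor * voltage_change_pct

# Lowering line voltage by 2.5% on a feeder delivering 100,000 MWh/year:
savings_mwh = 100_000 * energy_change_pct(2.5) / 100
print(savings_mwh)  # 2000.0 MWh/year saved under these assumptions
```

Small percentage reductions compound into large absolute savings when applied across thousands of feeders, which is why the national estimates discussed below run into tens of millions of megawatt-hours.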
Huge potential savings
If utilities could monitor voltages along the line—for example, using smart meters at customer sites—and then remotely lower voltages when possible, they could achieve energy savings of about 57 million megawatt-hours per year nationally with no loss of service quality, according to estimates by the Pacific Northwest National Laboratory in 2010. That’s about two-thirds of the total electricity that PG&E delivers to its customers each year.
There’s another emerging benefit from this kind of smart grid voltage regulation: mitigating service quality problems that result from high concentrations of rooftop solar generation connected to the utility’s distribution system.
Rooftop solar complicates the task of setting line voltages, since local generation can increase line voltages where utilities previously expected them to fall. Moreover, line voltages can swing up or down with the passing of clouds, making set-and-forget strategies inadequate to the task of keeping within the standard voltage range.
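The feedback idea behind this kind of regulation can be sketched as a simple control loop: read end-of-line smart-meter voltages, and lower the substation setpoint by whatever headroom the lowest reading allows, re-evaluating as solar output and loads shift. This is a hypothetical illustration of the concept, not PG&E's actual control logic; the safety margin and meter readings are invented.

```python
# Hypothetical sketch of smart-meter-driven voltage regulation: shrink the
# substation setpoint toward the minimum the lowest meter reading permits,
# or raise it if any reading has sagged too close to the limit. The 1.5 V
# margin and the sample readings are illustrative assumptions.

ANSI_MIN_V = 114.0
ANSI_MAX_V = 126.0

def adjust_setpoint(current_setpoint, meter_voltages, margin=1.5):
    """Lower the setpoint by the headroom the lowest meter reading allows
    (negative headroom raises it), clamped to the allowed band."""
    lowest = min(meter_voltages)
    headroom = lowest - (ANSI_MIN_V + margin)
    new_setpoint = current_setpoint - headroom
    return min(max(new_setpoint, ANSI_MIN_V), ANSI_MAX_V)

# Meters report 118-121 V, leaving 2.5 V of headroom above the margin,
# so the substation can drop from 124 V to 121.5 V:
print(adjust_setpoint(124.0, [118.0, 119.5, 121.0]))  # 121.5
```

Because the loop reacts to measured voltages rather than a fixed forecast, it can also respond when rooftop solar pushes voltages up or clouds pull them down, which is exactly where set-and-forget schemes fall short.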
“Here at PG&E, we have nearly a quarter of all U.S. rooftop solar installations in our service area,” said Russell Griffith, who is leading PG&E’s voltage and reactive power optimization system pilot project. “Ensuring that we can enable customers to install solar as cost effectively as possible while maintaining grid reliability is reason enough for us to evaluate this promising smart grid technology.”
Over the past year and more, PG&E experts interviewed engineers at seven other utilities that have piloted conservation voltage reduction programs. PG&E then sought information from a host of technology vendors, selecting five for more detailed investigation. Early in 2014, PG&E asked two vendors to supply technology for rigorous testing.
Testing at PG&E’s San Ramon lab
PG&E installed the equipment at the utility’s Applied Technology Services laboratory in San Ramon, which has one of the most advanced electric distribution system test facilities in North America. Tests focused on safety, operations, and systems integration under many scenarios, including wide voltage swings, significant load changes, sudden outages, and loss of field communications.
The “torture” tests uncovered operational issues that might otherwise have gone unnoticed until they jeopardized reliable service for customers. The vendors have corrected the issues that PG&E uncovered and are working to make other improvements.
The next phase involves testing in the field. Over the course of the coming year, PG&E plans to deploy this technology on 12 feeder lines serving over 20,000 customers in the Fresno area, to see if it lives up to its promise.
PG&E will continue to report on its findings to the California Public Utilities Commission and share them with the industry, to help disseminate lessons learned and best practices from its pilot study.
Email Jonathan Marshall at firstname.lastname@example.org.