Power Outages and Strategy 1.5
On 3 December 2022 someone attacked two electrical substations in Moore County, North Carolina, putting both out of action for several days and leaving around 40,000 customers without power. The attackers had damaged components that were difficult to repair or replace.
There were claims that the attack was sophisticated, that the attackers knew exactly what to hit with a high-powered rifle to cause maximum disruption. How else to explain the scale and duration of the disruption? There were also claims that the attackers' intention was political: to disrupt a local drag festival.
But this is backward reasoning: these were the effects, therefore this must have been the intention. One plausible theory I read suggested the attack was nothing more than an attempt to cut local power in order to burgle a store, with the wider outage as unfortunate collateral damage. Grady Hillhouse’s Practical Engineering YouTube channel has a good video describing the damage done and the work required to restore the grid.
What is certain is that such attacks are on the increase. In November 2022 the FBI warned of white supremacist plots to take down the US power grid, and noted that the information needed to identify vulnerable substation components was being published by various groups. In 2014 a report by the Federal Energy Regulatory Commission warned that attacking just nine of the 55,000 US electrical substations could cause a national blackout.
Most businesses have some sort of plan for what to do when the power fails. The general assumption is that a power cut will be caused by bad weather and be of limited duration. The five-day power cut in this case reminds us that power utilities keep spares only for common failure modes, not for every possible attack. Large transformers (such as the one damaged here) are built to order and have long lead times, often over a year. If, as some have suggested, attacks on grid equipment really are trending upward, multi-day power cuts are a risk worth planning for.
There are two basic power outage strategies which I’ve seen in business continuity plans:
- Strategy 1: Install a backup generator so normal operations can continue without grid power. This is expensive, and only economic in some circumstances. Backup generators need to be regularly maintained and tested; even stored fuel requires maintenance.
- Strategy 2: Accept the risk and send everybody home after an hour. Half a day’s lost productivity once every few years may be quite acceptable compared to the costs of providing backup power.
These strategies are commonly combined into a hybrid approach: keep critical functions (e.g. call centers, refrigerators and freezers) running on backup power, but send less critical or more power-intensive departments (e.g. marketing, manufacturing) home.
Strategy 2 works well only if we can assume power cuts are rare and of limited duration. But what if, as in this case, they have the potential to last a number of days?
This is where Strategy 1.5 comes in.
- Strategy 1.5: Accept the risk of a short-duration outage, but mitigate the risk of a multi-day outage by making provision to hook up a mobile generator if needed.
A permanent backup generator may be too expensive to install and maintain, but provisioning for a mobile generator might not be. Truck- or trailer-mounted generators can supply up to 2 MW at relatively short notice and can be shipped long distances as needed. I’ve seen this strategy work well: mobile generator provisioning originally installed to mitigate Y2K risks was used years later to limit disruption during a multi-day power outage.
Does Strategy 1.5 make sense for you? It’s worth investigating the costs and doing a few rough calculations to find out.
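To make that concrete, here is a minimal back-of-the-envelope sketch in Python comparing the expected annual cost of the three strategies. Every figure in it (outage frequency and length, hourly cost of downtime, generator and hook-up costs) is an invented placeholder, and the function names are mine; substitute your own numbers before drawing any conclusions.

```python
# Rough annualized cost comparison of the three power outage strategies.
# Every figure below is a placeholder assumption - substitute your own numbers.

HOURLY_COST_OF_DOWNTIME = 5_000      # lost productivity, spoiled stock, etc. (currency/hour)

# Assumed outage profile for the site.
SHORT_OUTAGES_PER_YEAR = 2           # typical weather-related cuts
SHORT_OUTAGE_HOURS = 4
LONG_OUTAGES_PER_YEAR = 0.1          # i.e. roughly one multi-day outage per decade
LONG_OUTAGE_HOURS = 5 * 24

# Strategy 1: permanent backup generator.
PERMANENT_GENERATOR_ANNUAL_COST = 80_000   # amortized purchase + maintenance + testing + fuel

# Strategy 1.5: provision for a mobile generator (switchgear, external hook-up point),
# plus hire and transport costs on the rare occasions it is actually needed.
HOOKUP_PROVISION_ANNUAL_COST = 5_000
MOBILE_HIRE_COST_PER_EVENT = 30_000
MOBILE_MOBILIZATION_HOURS = 24             # downtime before the mobile unit is running


def strategy_1() -> float:
    """Permanent generator: no downtime cost, fixed annual cost."""
    return PERMANENT_GENERATOR_ANNUAL_COST


def strategy_2() -> float:
    """Accept the risk: pay the downtime cost for every outage hour (simplification)."""
    expected_hours = (SHORT_OUTAGES_PER_YEAR * SHORT_OUTAGE_HOURS
                      + LONG_OUTAGES_PER_YEAR * LONG_OUTAGE_HOURS)
    return expected_hours * HOURLY_COST_OF_DOWNTIME


def strategy_1_5() -> float:
    """Ride out short outages; bring in a mobile generator for long ones."""
    short_cost = SHORT_OUTAGES_PER_YEAR * SHORT_OUTAGE_HOURS * HOURLY_COST_OF_DOWNTIME
    long_cost = LONG_OUTAGES_PER_YEAR * (
        MOBILE_MOBILIZATION_HOURS * HOURLY_COST_OF_DOWNTIME + MOBILE_HIRE_COST_PER_EVENT)
    return HOOKUP_PROVISION_ANNUAL_COST + short_cost + long_cost


if __name__ == "__main__":
    for name, cost in [("Strategy 1", strategy_1()),
                       ("Strategy 2", strategy_2()),
                       ("Strategy 1.5", strategy_1_5())]:
        print(f"{name}: expected annual cost ~{cost:,.0f}")
```

With these particular placeholder numbers Strategy 1.5 comes out cheapest, but the point of the exercise is the structure of the comparison: the rare multi-day outage dominates the expected cost of Strategy 2, while the fixed annual cost dominates Strategy 1.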