Aurora’s storm vulnerability

A week after strong winds led to blackouts for approximately 70,000 customers across the state, Aurora Energy is left mulling how and when to overhaul its flawed emergency communications system.

At the heart of Aurora Energy’s problems a little over a week ago was a phone system which crashed after receiving approximately 5,000 calls from customers attempting to report faults.

Aurora Energy’s problems began when unexpected and sustained strong wind gusts swept across Tasmania but did the bulk of their damage in the south of the state. A strong wind warning, cautioning that there could be gusts of up to 110 km/h across Southern Tasmania, was issued a little after 2pm on the Sunday a week ago, but it was too little, too late. As it turned out, the strongest gust recorded at the Bureau of Meteorology’s Battery Point weather station, at 130 km/h, occurred at about the same time the warning was issued. The strong, gusty winds blew branches and trees over, bringing power lines down in many areas.

Tens of thousands of customers were left without power, in some cases for long periods. Aurora crews were stretched to the limit by 1,300 reported faults over a three-day period. And Aurora Energy is left embarrassed by the failure of a core part of its crisis management system.

Could it have been avoided?

In many ways, Aurora was unlucky. While the Sunday had been declared a Total Fire Ban day due to forecast high temperatures and windy conditions, the extreme winds weren’t foreseen. Aurora Energy had crews on standby because of the potential fire risk, but the strong winds, and their extent across the state, took the electricity retailer by surprise. And because the winds hit on the Sunday in the middle of a long weekend, Aurora Energy found itself short-staffed.

But the biggest single problem was the limitation of Aurora Energy’s communications system.

Aurora Energy advises customers that if there is an outage they should “check with neighbours first, then call to hear all current outage information. If your area isn’t listed, stay on the line to report it.”

When the first reports came pouring in, the system coped. Reported faults were logged into the system, which automatically generated a list of outages on Aurora Energy’s website. This in turn fed Aurora Energy’s Twitter feed.
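As described, reported faults feed a single pipeline: each logged fault regenerates the public web list, which in turn drives the Twitter feed. A minimal sketch of that kind of pipeline (the class, field names and print placeholders are illustrative, not Aurora Energy’s actual software):

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Outage:
    locality: str
    reported_at: datetime
    estimate: str  # e.g. "approx. 2 hours"

@dataclass
class OutageBoard:
    """Logs reported faults and publishes them outward: the web list is
    regenerated on each new fault, and each fault also feeds the Twitter account."""
    outages: list = field(default_factory=list)

    def log_fault(self, outage: Outage) -> None:
        self.outages.append(outage)
        self.publish_to_website()
        self.push_to_twitter(outage)

    def publish_to_website(self) -> None:
        # Placeholder for regenerating the public outage page.
        for o in self.outages:
            print(f"[web] {o.locality}: reported {o.reported_at:%H:%M}, {o.estimate}")

    def push_to_twitter(self, outage: Outage) -> None:
        # Placeholder for the automated feed update.
        print(f"[tweet] Outage in {outage.locality}, estimated restoration {outage.estimate}")

board = OutageBoard()
board.log_fault(Outage("Hobart", datetime.now(), "approx. 2 hours"))
```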

As the calls kept coming, the strain on the system became apparent. Long after incidents had been reported to Aurora Energy, the outages still hadn’t appeared on the outage list on the website. With their faults not showing, affected customers decided they needed to call in to report outages which appeared not to have been logged.

When customers call to report an outage, a recorded message is first played listing all the known outages and their estimated durations. What was designed to prevent the same fault being reported numerous times became the Achilles heel of the phone system. As the recorded list of outages became longer and longer, the time callers spent on the line grew, further increasing the strain on the phone system.
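The arithmetic behind that feedback loop follows from Little’s law: the average number of callers tying up lines at once equals the call arrival rate multiplied by the average time each caller stays connected. A rough sketch (the call rate, message lengths and resulting figures are illustrative assumptions, not Aurora Energy’s numbers):

```python
def concurrent_calls(calls_per_hour: float, minutes_per_call: float) -> float:
    """Little's law: average callers connected at once
    = arrival rate x average time each caller spends on the line."""
    return calls_per_hour * (minutes_per_call / 60.0)

# Illustrative only: suppose calls arrive at 600 per hour.
# With a short recorded message a caller might be connected ~3 minutes;
# as the outage list grows, that might stretch to ~10 minutes.
for msg_minutes in (3, 10):
    lines_busy = concurrent_calls(calls_per_hour=600, minutes_per_call=msg_minutes)
    print(f"{msg_minutes} min per call -> ~{lines_busy:.0f} lines busy on average")

# 3 min per call -> ~30 lines busy; 10 min per call -> ~100 lines busy.
# The longer the recorded outage list, the more lines the same call volume
# ties up, which is exactly the squeeze described above.
```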

Nor could the system distinguish where people were calling from and play a message tailored to their locality or region. People in Hobart wanting to report local problems were regaled with a string of faults from around the state. Blacked-out customers elsewhere in the state had to wait through a long list of outages, most of which were in the south.

As the backlog of calls grew, Aurora’s phone system at first responded by dropping out at the end of the recorded message, before the caller could report a fault. Many then simply called back, not knowing that they were pushing the already stretched phone system closer to collapse. Then the phone system collapsed altogether and remained down for just under an hour. Aurora’s phone system has gone down before, though the last occasion was at least several years ago.

All up, on the Sunday the outage hotline received over 5,100 calls, but almost three-quarters of those callers either hung up before they got to the end of the recorded message or had their call go unanswered. Undoubtedly, some of those who dropped off were simply doing what the system had been designed to do. However, many customers were left frustrated at the failure of a basic system at the very time it should have been working.

With the phone system down, the website outage tracking tool quickly became even more out of date.

Down at street level, police and emergency services found themselves reassuring people stopped at roadblocks that Aurora Energy did in fact know about the downed power lines. According to Aurora Energy, it logged 638 faults that day.

Getting the power back on was not as simple as it could have been: the Tasmanian Fire Service’s designation of the Sunday as a total fire ban day constrained how fast power lines could be checked and re-energised.

Stopping the sparks from flying

Aurora Energy’s network is equipped with auto-reclosers, which are designed to automatically reset tripped power lines. While auto-reclosers are effective at quickly restoring supply after a temporary short-circuit, there can be a downside: sparks. The use of auto-reclosers on high fire danger days was one of the factors examined by the Royal Commission into the devastating 2009 Victorian fires as a potential source of risk.

In its findings, the Royal Commission recommended that auto-reclosers be shut off altogether on a particular class of high-risk low-voltage power lines at the height of each summer, and that utilities be limited to one automatic restart attempt on high-voltage power lines on total fire ban days.

With Aurora Energy having voluntarily adopted similar standards, the affected lines had to be manually inspected before the auto-reclosers could be reinstated. The downside of this prudent fire risk-management policy is that it takes considerably longer to restore power supplies. The limits imposed by the policy only eased when the fire ban effectively expired.
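A minimal sketch of the kind of reclose rule described above (the function, line classes and flags are illustrative, not Aurora Energy’s actual protection settings):

```python
def auto_reclose_permitted(line_class: str,
                           total_fire_ban: bool,
                           peak_summer: bool,
                           attempts_so_far: int) -> bool:
    """Illustrative policy check based on the Royal Commission recommendations
    described above, not Aurora Energy's actual control logic:
    - designated high-risk low-voltage lines: auto-reclose disabled at the
      height of summer;
    - high-voltage lines: only one automatic restart attempt on total fire
      ban days. Otherwise, reclosing is allowed."""
    if line_class == "high_risk_low_voltage" and peak_summer:
        return False
    if line_class == "high_voltage" and total_fire_ban:
        return attempts_so_far < 1
    return True

# On a total fire ban day a second automatic restart on a high-voltage line is
# refused; the line then waits for manual inspection before being re-energised.
print(auto_reclose_permitted("high_voltage", total_fire_ban=True,
                             peak_summer=True, attempts_so_far=1))  # False
```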

While Aurora crews worked through the night and had many areas reconnected by the next morning, there was little let-up in the reporting of faults. On the Monday of the long weekend another 500 faults were reported. Again the phone system struggled, with a little over a third of more than 2,800 callers hanging up before getting to the end of the recorded message. While the volume of faults tailed off on the Tuesday, another 185 were logged. All up, over the three days Aurora crews had 1,300 reported faults to check and fix.

Who’ll fix the flawed system?

Of all the individual elements contributing to Aurora Energy’s communications system crash, few are uncommon. What was unusual was that several low-frequency events all occurred on the same day.

But all emergency management planners face similar dilemmas about how best to prepare for low-probability but high-impact events.

Undoubtedly, sooner or later a similar range of factors will coincide once more. High fire-danger days are expected to become more frequent as climate change bites harder; high wind events will still be common and occasionally unexpected; the policy limits on the use of auto-reclosers will remain; and long weekends are here to stay. That leaves Aurora Energy’s communications system as the biggest single controllable factor.

Will it be better able to cope next time a low-frequency but high-impact event such as a storm or a bushfire occurs?

Perhaps. Perhaps not. It all depends on when the next extreme weather event brings the power lines down.

While Aurora Energy is still pondering how best to avoid a repeat of last week’s blackouts, it is far from certain that major changes to its communications system will be implemented before winter.

Undoubtedly, technical changes can be made to overcome the limitations of the existing system. The capacity of the existing phone system could be increased. One possibility Aurora is considering is a phone system which recognises where customers are calling from and plays a shorter message listing just the local outages. Exactly how much upgrading the system would cost is not known at this stage, but it is unlikely to be cheap.
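Filtering the recorded list down to the caller’s region is conceptually simple; a minimal sketch of the idea (the outage records and region tags are hypothetical):

```python
# Hypothetical data: each outage is tagged with the region it affects.
OUTAGES = [
    {"region": "South", "locality": "Hobart",     "estimate": "2 hours"},
    {"region": "South", "locality": "Kingston",   "estimate": "4 hours"},
    {"region": "North", "locality": "Launceston", "estimate": "1 hour"},
]

def outage_message_for(caller_region: str, outages: list) -> str:
    """Build a short recorded message listing only the caller's local outages,
    rather than reading out every fault in the state."""
    local = [o for o in outages if o["region"] == caller_region]
    if not local:
        return ("No outages are currently listed for your area. "
                "Please stay on the line to report one.")
    items = [f"{o['locality']}: estimated restoration {o['estimate']}" for o in local]
    return "Known outages in your area: " + "; ".join(items)

print(outage_message_for("North", OUTAGES))
# -> Known outages in your area: Launceston: estimated restoration 1 hour
```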

There is a countervailing financial factor: under Aurora Energy’s Supply Reliability Guarantee, customers in urban and semi-rural areas disconnected for over eight hours will be paid $80, rising to $160 if the outage lasted over 16 hours. The outage duration thresholds are higher in rural areas. (Payments are made automatically by cheque within three months of the outage.) At this stage the cost of the blackouts is not known, but it is likely to be significant.
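For urban and semi-rural customers the guarantee reduces to a simple step function; a sketch of the payment rule as quoted above (the rural thresholds are higher but not specified here, so they are deliberately left out):

```python
from typing import Optional

def reliability_payment(outage_hours: float, area: str = "urban") -> Optional[int]:
    """Payment under the Supply Reliability Guarantee as quoted above, for urban
    and semi-rural customers: $80 for an outage of more than 8 hours, $160 for
    more than 16 hours. Rural thresholds are higher and not quoted, so rural
    areas return None rather than a guessed figure."""
    if area not in ("urban", "semi-rural"):
        return None
    if outage_hours > 16:
        return 160
    if outage_hours > 8:
        return 80
    return 0

print(reliability_payment(10))  # 80
print(reliability_payment(20))  # 160
print(reliability_payment(6))   # 0
```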

Decisions on changing the system are, however, unlikely to proceed far until after the completion of the July 1 merger between the high-voltage distribution system operator, Transend, and Aurora Energy’s low-voltage distribution business. The new company, TasNetworks Pty Ltd, is likely to have the need to upgrade the phone system ticking away in its in-tray.

As the impact of the meltdown of Aurora Energy’s phone system recedes, the biggest risk is that the new management of TasNetworks will have its hands full bedding down the merger and will defer any significant communications system decisions until much later.

If TasNetworks crosses its fingers and hopes that a low-probability, high-impact event like the recent high winds doesn’t occur any time soon, and one does, it will discover that blacked-out customers won’t be quite so forgiving should the communications system crash again.