Summary: The article A how-to guide for hacking your local utility (and what to do now that it is out there), written by Erich W. Gunther in Smart Grid News, gives a perfect example of the damaging Normalization of Deviance in the power industry.
This article is a perfect complement to the blog post Smart Grid: SoS "… interacting in unpredictable ways that regulators and investors cannot comprehend, far less control.", whose summary reads:
As in financial markets, system crashes are to be expected in the smart grid, because it has been thought of as merely a complex technological system, when it is in fact an ultra-large-scale socio-technical system. The difference between the two kinds of systems is illustrated by "… the story of the London Millennium Bridge, which opened in June 2000 and two days later was closed for two years to remedy destabilizing swaying motions induced when groups of people walked over it." As industry restructuring was flawed, legislators, regulators, and investors have a chance to minimize the damage in the making in the power industry, by learning about their responsibility regarding the now well-known error of the Normalization of Deviance before it is too late.
In the article A how-to guide for hacking your local utility (and what to do now that it is out there), written for Smart Grid News, Erich W. Gunther deserves congratulations! He gives an example of the Normalization of Deviance. To understand the term, see below a segment of an interview I found on the Internet.
Deviance, normalization of deviance. What exactly was the normalization of deviance in the case of NASA?
Diane Vaughan: Social normalization of deviance means that people within the organization become so accustomed to a deviant behaviour that they no longer consider it deviant, despite the fact that it far exceeds their own rules for elementary safety. But it is a complex process with some kind of organizational acceptance. People outside see the situation as deviant, whereas people inside get accustomed to it and do not. The more they do it, the more they get accustomed. For instance, in the Challenger case there were design flaws in the famous « O-rings », although it was considered that by design the O-rings would not be damaged. In fact it happened that they suffered recurrent damage. The first time the O-rings were damaged, the engineers found a solution and decided the space transportation system could keep flying with « acceptable risk ». The second time damage occurred, they thought the trouble came from something else. Because in their minds they believed they had fixed the newest trouble, they again defined it as an acceptable risk and just kept monitoring the problem. And as they recurrently observed the problem with no consequence, they got to the point where flying with the flaw was normal and acceptable. Of course, after the accident, they were shocked and horrified when they saw what they had done.