Aviation writer Ron Rapp (2015) argues that the root cause of many accidents is the normalization of deviance rather than equipment defects, simple complacency, or shortcuts. Dr. Vaughan (2009), a sociologist at Columbia University, defines the normalization of deviance as the gradual process by which an unacceptable standard or practice comes to be accepted. Repeated deviations fail to produce disastrous results, and the deviation is therefore treated as an organizational social norm. BP's Macondo well blowout is a case in point (Meigs, 2016): the crew performed two required pressure tests to check the state of the well. The first test produced alarming results, so they ran the test again through a different pipe, and the second result was satisfactory. They therefore concluded that the first test must have been wrong. Forensic analysis conducted afterwards determined that the pipe used in the second test had probably been blocked. The entire team, it appears, had surrendered to the normalization of deviance, discarding the data that did not match their expectations.
Simply put, when a worker repeatedly uses a ladder with a damaged rung and nothing goes wrong, he grows more comfortable with it; climbing the risky ladder without incident convinces him it is safe. The same normalization can lead an organization to disaster. The crashes of the space shuttles Challenger and Columbia have both been attributed, in part, to this concept. Dr. Vaughan investigated why NASA proceeded with launches even though warning signs were close at hand. Organizational behavior that ignored a glaring mechanical anomaly in the space shuttle gradually came to be regarded as a normal flight risk, one that could doom its crew. Unfortunately, when the shuttle Columbia re-entered the Earth's atmosphere, a damaged heat shield caused the vehicle to break apart, killing all seven crew members. NASA had succumbed to the normalization of deviance once again: shuttles returning with damaged heat shields had become the norm. Nor is this confined to NASA; it applies equally to the BP oil spill. When people hear about the BP explosion, the usual first reaction is that BP must have chosen speed and efficiency over safety and accuracy. Deviations then begin gradually, and it is when shortcuts increasingly become the norm that accidents occur frequently. When luck runs out, people get hurt, as when eleven platform workers were killed and seventeen others injured in the blast. Only then does the public ask how "they" caused the explosion.
Based on the analysis above, it is clear that this incident was no mere accident: latent errors had existed for a long period before they coupled with enabling conditions to produce a massive failure (Tinsley and Madsen, 2011). Yet the failure was entirely preventable had the Macondo well's entire team been guided by an uncompromising commitment to safety first. Their system was not oriented toward maximizing safety; it tended toward a trip-and-fall compliance mentality rather than attending to the big picture. It was a system that "forgot to be afraid". A sound recovery mechanism built on well-informed, regular reporting and a just culture therefore seems necessary for the entire organization. The Macondo well catastrophe was an organizational accident whose causes were deeply rooted in severe imbalances between the system's provisions for production and those for protection.