Matches in DBpedia 2015-10 for { <http://dbpedia.org/resource/System_accident> ?p ?o }
Showing triples 1 to 62 of 62, with 100 triples per page.
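The listing below was produced by a basic graph pattern query against a DBpedia SPARQL endpoint. A minimal sketch of how to build the equivalent request, assuming the public endpoint at `https://dbpedia.org/sparql` (the live endpoint serves the current dataset; the 2015-10 snapshot itself may only be available from archived dumps):

```python
from urllib.parse import urlencode

# Public DBpedia SPARQL endpoint (assumption: serves the current
# dataset, not necessarily the 2015-10 snapshot shown here).
ENDPOINT = "https://dbpedia.org/sparql"

# All predicate/object pairs for the System_accident resource,
# mirroring the triple pattern in the header above.
query = """SELECT ?p ?o WHERE {
  <http://dbpedia.org/resource/System_accident> ?p ?o .
}
LIMIT 100"""

# Encode the query as a GET request; fetching this URL (e.g. with
# urllib.request.urlopen) returns the bindings as SPARQL JSON results.
request_url = ENDPOINT + "?" + urlencode({
    "query": query,
    "format": "application/sparql-results+json",
})
print(request_url.split("?")[0])
```

Fetching the URL requires network access; the sketch stops at constructing the request so it can be inspected or adapted to another endpoint.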
- System_accident abstract "A system accident, or normal accident, is an "unanticipated interaction of multiple failures" in a complex system. This complexity can either be technological or organizational, and often has elements of both. A system accident can be very easy to see in hindsight, but very difficult to see in foresight. Ahead of time, there are simply too many possible action pathways. These accidents often resemble Rube Goldberg devices in the way that small errors of judgment, flaws in technology, and insignificant damages combine to form an emergent disaster. System accidents were described in 1984 by Charles Perrow, who termed them "normal accidents", as having such characteristics as interactive complexity, tight coupling, cascading failures, and opaqueness. James T. Reason extended this approach with human reliability and the Swiss cheese model, now widely accepted in aviation safety and healthcare. Once an enterprise passes a certain point in size, with many employees, specialization, backup systems, double-checking, detailed manuals, and formal communication, employees can all too easily resort to protocol, habit, and "being right." Rather like attempting to watch a complicated movie in a language one is unfamiliar with, the narrative thread of what is going on can be lost. And other phenomena, such as groupthink, can be occurring at the same time, for real-world accidents almost always have multiple causes. In particular, it is a mark of a dysfunctional organization to simply blame the last person who touched something. In a December 2012 article in a popular magazine, Charles Perrow writes, "A normal accident is where everyone tries very hard to play safe, but unexpected interaction of two or more failures (because of interactive complexity), causes a cascade of failures (because of tight coupling)." There is an aspect of an animal devouring its own tail, in that more formality and effort to get it exactly right can make the situation worse.
For example, the more organizational rigmarole involved in adjusting to changing conditions, the more employees will delay in reporting the changing conditions, and the more emphasis on formality, the less likely employees and managers will engage in real communication. New rules can actually make the situation worse, both by adding a new additional layer of complexity and by reminding employees yet again that they are not to think but are just to follow the rules. Regarding the May 1996 crash of ValuJet (AirTran) in the Florida Everglades and the lack of interplay between theory and practice, William Langewiesche writes, "Such pretend realities extend even into the most self-consciously progressive large organizations, with their attempts to formalize informality, to deregulate the workplace, to share profits and responsibilities, to respect the integrity and initiative of the individual. The systems work in principle, and usually in practice as well, but the two may have little to do with each other. Paperwork floats free of the ground and obscures the murky workplaces where, in the confusion of real life, system accidents are born." In a 1999 article primarily focusing on health care, J. Daniel Beckham wrote, "It is ironic how often tightly coupled devices designed to provide safety are themselves the causes of disasters. Studies of the early warning systems set up to signal missile attacks on North America found that the failure of the safety devices themselves caused the most serious danger: false indicators of an attack that could have easily triggered a retaliation. Accidents at both Chernobyl and Three Mile Island were set off by failed safety systems." Perhaps anticipating the concept of system accident, the Apollo 13 Review Board wrote, "It was found that the accident was not the result of a chance malfunction in a statistical sense, but rather resulted from an unusual combination of mistakes, coupled with a somewhat deficient and unforgiving design."".
- System_accident wikiPageExternalLink perrow.PDF.
- System_accident wikiPageID "6759067".
- System_accident wikiPageLength "13567".
- System_accident wikiPageOutDegree "22".
- System_accident wikiPageRevisionID "681820776".
- System_accident wikiPageWikiLink Aviation_safety.
- System_accident wikiPageWikiLink CNN.
- System_accident wikiPageWikiLink Cascading_failures.
- System_accident wikiPageWikiLink Category:Failure.
- System_accident wikiPageWikiLink Category:Safety_engineering.
- System_accident wikiPageWikiLink Category:Systems_engineering.
- System_accident wikiPageWikiLink Charles_Perrow.
- System_accident wikiPageWikiLink Complex_system.
- System_accident wikiPageWikiLink Emergence.
- System_accident wikiPageWikiLink Human_reliability.
- System_accident wikiPageWikiLink Interactive_complexity.
- System_accident wikiPageWikiLink International_Journal_of_Aviation_Psychology.
- System_accident wikiPageWikiLink Journal_of_Contingencies_and_Crisis_Management.
- System_accident wikiPageWikiLink Normal_Accidents.
- System_accident wikiPageWikiLink Nuclear_accident.
- System_accident wikiPageWikiLink Nuclear_and_radiation_accidents_and_incidents.
- System_accident wikiPageWikiLink Opaqueness.
- System_accident wikiPageWikiLink Pearson_Education.
- System_accident wikiPageWikiLink Rube_Goldberg_device.
- System_accident wikiPageWikiLink Rube_Goldberg_machine.
- System_accident wikiPageWikiLink Swiss_Cheese_Model.
- System_accident wikiPageWikiLink Swiss_cheese_model.
- System_accident wikiPageWikiLink The_International_Journal_of_Aviation_Psychology.
- System_accident wikiPageWikiLink Three_Mile_Island_accident.
- System_accident wikiPageWikiLink Tight_coupling.
- System_accident wikiPageWikiLink University_Corporation_for_Atmospheric_Research.
- System_accident wikiPageWikiLinkText "System accident".
- System_accident wikiPageWikiLinkText "system accident".
- System_accident bot "H3llBot".
- System_accident date "October 2010".
- System_accident hasPhotoCollection System_accident.
- System_accident wikiPageUsesTemplate Template:Cite_book.
- System_accident wikiPageUsesTemplate Template:Cite_journal.
- System_accident wikiPageUsesTemplate Template:Cite_news.
- System_accident wikiPageUsesTemplate Template:Cite_paper.
- System_accident wikiPageUsesTemplate Template:Dead_link.
- System_accident wikiPageUsesTemplate Template:Details.
- System_accident wikiPageUsesTemplate Template:Lead_too_long.
- System_accident wikiPageUsesTemplate Template:Over-quotation.
- System_accident wikiPageUsesTemplate Template:Quote.
- System_accident wikiPageUsesTemplate Template:Reflist.
- System_accident wikiPageUsesTemplate Template:Synthesis.
- System_accident subject Category:Failure.
- System_accident subject Category:Safety_engineering.
- System_accident subject Category:Systems_engineering.
- System_accident type Article.
- System_accident type Discipline.
- System_accident comment "A system accident, or normal accident, is an "unanticipated interaction of multiple failures" in a complex system. This complexity can either be technological or organizational, and often has elements of both. A system accident can be very easy to see in hindsight, but very difficult to see in foresight.".
- System_accident label "System accident".
- System_accident sameAs Systeemongeval.
- System_accident sameAs m.0gmhpy.
- System_accident sameAs Q2328179.
- System_accident wasDerivedFrom System_accident?oldid=681820776.
- System_accident isPrimaryTopicOf System_accident.