For Better, ... or Worse ???


  • For Better, ... or Worse ???

    When Making Things Better Only Makes Them Worse
    Our very attempts to stave off disaster make unpredictable outcomes more likely.
    ...
    Accidents are part of life. So are catastrophes. Two of Boeing’s new 737 Max 8 jetliners, arguably the most modern of modern aircraft, crashed in the space of less than five months. A cathedral whose construction started in the 12th century burned before our eyes, despite explicit fire-safety procedures and the presence of an on-site firefighter and a security agent. If Notre-Dame stood for so many centuries, why did safeguards unavailable to prior generations fail? How did modernizing the venerable Boeing 737 result in two horrific crashes, even as, on average, air travel is safer than ever before?

    These are questions for investigators and committees. They are also fodder for accident theorists. Take Charles Perrow, a sociologist who published an account of accidents occurring in human-machine systems in 1984. Now something of a cult classic, Normal Accidents made a case for the obvious: Accidents happen. What he meant is that they must happen. Worse, according to Perrow, a humbling cautionary tale lurks in complicated systems: Our very attempts to stave off disaster by introducing safety systems ultimately increase the overall complexity of the systems, ensuring that some unpredictable outcome will rear its ugly head no matter what. Complicated human-machine systems might surprise us with outcomes more favorable than we have any reason to expect. They also might shock us with catastrophe.
    ...
    https://www.theatlantic.com/ideas/ar...=pocket-newtab
    TANSTAAFL = There Ain't No Such Thing As A Free Lunch

  • #2
    Feces occurs.

    Biplanes crashed with great regularity back when aircraft systems were as simple as they possibly could be. When you build machines to challenge the laws of physics, sooner or later they will fail, at least on an individual basis.

    Human error, system error, negative event convergence...there's no perfect system. Everything is simply playing the odds.
    Any man can hold his place when the bands play and women throw flowers; it is when the enemy presses close and metal shears through the ranks that one can ascertain which are soldiers, and which are not.



    • #3
      Originally posted by G David Bock View Post
      When Making Things Better Only Makes Them Worse
      Our very attempts to stave off disaster make unpredictable outcomes more likely.
      ...
      The same fallacious argument has been made from time to time about software in general. What happens in reality is that errors increase when developers try to produce complex software without adopting better systems and techniques for monitoring design and development, or when they try to bolt extras onto old systems built with earlier techniques and technology. I suspect something similar has happened at Boeing.
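
      A minimal sketch of that "bolt-on" failure mode, in Python. Everything here is hypothetical; the sensor, the trim routine, and the threshold are invented for illustration and do not describe Boeing's actual software. The point is the shape of the change: an automatic correction added on top of a well-understood control loop, keyed to a single sensor, creates a failure path the original loop never had.

        # Hypothetical sketch -- none of these names come from any real aircraft system.

        def read_alpha_sensor():
            """Stands in for a single physical sensor; here it is stuck at a bad value."""
            return 45.0

        def apply_nose_down_trim(units):
            print(f"trim nose down {units} units")

        def legacy_loop(pilot_trim):
            """The original, well-understood behaviour: pilot input drives the trim."""
            apply_nose_down_trim(pilot_trim)

        def augmented_loop(pilot_trim, alpha_limit=15.0):
            """The bolted-on extra: an automatic correction keyed to one sensor.
            If that sensor lies, the correction fires anyway -- a failure path
            the legacy loop simply did not have."""
            legacy_loop(pilot_trim)
            if read_alpha_sensor() > alpha_limit:   # no cross-check against a second sensor
                apply_nose_down_trim(2.5)           # automatic input on top of the pilot's

        if __name__ == "__main__":
            augmented_loop(pilot_trim=0.0)

      The numbers are made up; what matters is that the added feature widens the set of inputs that can move the control surface, so the system now depends on a component the original design never leaned on that heavily.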

      There are many mission-critical systems in the world whose origins go back decades and which today contain whole chunks of code that no one understands and no one dares change. There comes a time when the best course is to review the requirements specifications, update them as necessary, and then rewrite the system from scratch using the latest techniques and technology.
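
      To make the "dare not change" point concrete, here is a small hypothetical Python illustration. The conversion and the cap are invented, not drawn from any real system: the same logic appears once as legacy code full of unexplained constants, and once rewritten from an updated requirements spec.

        # Hypothetical illustration of the "dare not change" problem -- not taken from any real system.

        # Legacy version: unexplained constants and a block nobody can account for.
        def adjust_reading_legacy(raw):
            x = raw * 0.4536        # why 0.4536? (the pounds-to-kilograms factor, but that knowledge is lost)
            if x > 99.9:
                x = 99.9            # DO NOT REMOVE: a downstream report breaks without this. Nobody knows why.
            return x

        # Rewritten from an updated requirements spec: the intent is explicit and testable.
        LB_TO_KG = 0.45359237
        MAX_REPORTABLE_KG = 99.9    # spec: readings are capped at the report field width

        def adjust_reading(raw_lb):
            """Convert a raw reading in pounds to kilograms, capped per the reporting spec."""
            return min(raw_lb * LB_TO_KG, MAX_REPORTABLE_KG)

        if __name__ == "__main__":
            print(adjust_reading_legacy(250.0), adjust_reading(250.0))
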
      Last edited by MarkV; 30 Apr 19, 05:36.
      Human history becomes more and more a race between education and catastrophe (H G Wells)
      Against stupidity the gods themselves contend in vain (Friedrich von Schiller)

