A recent article in The Independent describes a fascinating study.  Researchers found that forcing surgeons in a London hospital to implement a single new procedure caused the death rate after surgery to fall 47 percent.  Complications likewise fell by 36 percent.

These are enormous numbers:

Donald Berwick, the president of the US Institute for Healthcare Improvement, said of the innovation: "I cannot recall a clinical care innovation in the past 30 years that has shown results of [this] magnitude."

The change?  The surgeons were required to run through a simple checklist before and after every procedure.

Now, this checklist [pdf] is not lengthy.  In fact, it contains fewer than twenty questions.  Even more strikingly, the questions are not complicated: before administering anesthesia, has the patient confirmed his identity?  Does the patient have any known allergies?  Have we confirmed that this is the right foot?  I'd wager that the most common reaction of people seeing this checklist is "you mean these things weren't being asked already?"

The thing is, these questions were already being asked.  The difference is that they weren't being asked in a systematic way.  Members of a surgical team assume that these obvious things have been checked because they're so obvious.  "Of course this is the right patient."  "Of course we asked if he's allergic to the anesthetic."  "Of course we marked the right limb for amputation."

The problem is, these things are sometimes done incorrectly, or missed entirely.  Not often, not usually; just sometimes.  Forcing doctors and nurses to use the checklist caught these "sometimes" problems, and dramatically improved patient safety in the process.

From this study, we can draw the following conclusions:

  1. Even obvious errors are easy to commit, and once committed, can have terrible consequences.
  2. Those who overlook these problems can be highly trained and intelligent people.  It appears that you cannot out-train or out-brain the human tendency to err from time to time.
  3. Instituting a rigorous system to check for errors lowers their occurrence dramatically, compared to existing processes that check for them informally.

Laid out this way, these statements are probably starting to look familiar, because they are exactly like the arguments for using unit tests in programming.  As a software developer, I look at this list and think, "We knew this stuff already!"  We knew it, and we came up with a way of mitigating the problems caused by our own flawed humanity.
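The analogy is direct: a unit test is a checklist that the machine runs for you, every time, without getting bored or assuming someone else already checked.  As a purely illustrative sketch (the function and its "checklist" guards below are hypothetical, not drawn from any real system), even the most obvious preconditions are worth encoding:

```python
import unittest

def schedule_amputation(patient, limb):
    """Hypothetical scheduling routine with explicit checklist-style guards."""
    # Obvious check #1: has this patient's identity been confirmed?
    if patient.get("identity_confirmed") is not True:
        raise ValueError("patient identity not confirmed")
    # Obvious check #2: is this the limb that was actually marked?
    if limb != patient.get("marked_limb"):
        raise ValueError("requested limb does not match the marked limb")
    return {"patient": patient["name"], "limb": limb}

class ChecklistTests(unittest.TestCase):
    """Each test asks one 'obvious' question -- systematically, every run."""

    def test_rejects_unconfirmed_identity(self):
        patient = {"name": "A", "identity_confirmed": False,
                   "marked_limb": "left foot"}
        with self.assertRaises(ValueError):
            schedule_amputation(patient, "left foot")

    def test_rejects_wrong_limb(self):
        patient = {"name": "A", "identity_confirmed": True,
                   "marked_limb": "left foot"}
        with self.assertRaises(ValueError):
            schedule_amputation(patient, "right foot")

    def test_accepts_valid_case(self):
        patient = {"name": "A", "identity_confirmed": True,
                   "marked_limb": "left foot"}
        result = schedule_amputation(patient, "left foot")
        self.assertEqual(result["limb"], "left foot")

# run with: python -m unittest <this file>
```

The tests don't make the checks any less obvious; they just guarantee the questions get asked on every run rather than "usually."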

The errors that this checklist catches are the kind that kill people.  Thousands of people, every year.  If we, as an industry, had somehow conveyed the efficacy of systematic testing to the medical field, this checklist might have been implemented sooner, and lives could have been saved.

The fact that ideas like this don't flow across fields is not surprising.  Industry-specific myopia reflects our human tendency to form into groups and distrust outsiders.  Despite this, I do believe it's worth struggling against.  Perhaps the field of architecture could provide useful insights into the construction of software (perhaps it already has).  Maybe the next big idea in architecture will be inspired by chemistry, or biology.  Maybe the next time I open my editor I should think "first, do no harm."

It certainly couldn't hurt.