
Oopsie!


People make mistakes. Lots of them. We’re very good at making them, but not so good at preventing them.

It can be fun to go down the rabbit-hole of variations and corollaries of Murphy’s Law (https://en.wikipedia.org/wiki/Murphy%27s_law), but my favourites are “Finagle’s Law”:

“The perversity of the Universe tends towards a maximum.”

and “Hanlon’s razor”:


“Never attribute to malice that which is adequately explained by stupidity.”

The more we learn about the human mind and how it works (https://www.til-technology.com/post/turtles-all-the-way-down), the more we should recognize that we need systems of thinking and behaving, and must work to develop methods by which we can mitigate the risks of our flawed thinking.


Tools like critical thinking (https://www.til-technology.com/post/the-answer) and the scientific method are a great place to start, but a lot of it is simply about being careful.


The biggest problem is that people can just be so aggressively stupid sometimes, and it can be hard to be sympathetic. Other people, that is... we generally assume stupidity or malice when others make mistakes, and make excuses or conceal it when we make mistakes ourselves.


“If at first you don't succeed, destroy all evidence that you tried.”

What we really need to do is recognize that everyone makes mistakes, sympathize with them, and try to figure out ways to reduce the risk and impact. But how do we do that?


Not really that hard, actually. It’s really all covered by the rules I outlined previously (https://www.til-technology.com/post/everybody-wants-to-rule-the-world). Here’s how:


Rule 1: “Don’t be a jerk”

If you simply recognize that people make mistakes, and are sympathetic to that fact, people will be less resistant to acknowledging an error. This applies to yourself as well, and a good way to demonstrate this is simply to ask others for help. “Hey, would you please have a look at this before I send it out?” That simple request demonstrates that you are aware of your own limitations, that you value the opinion of your associates, and that you trust them enough to potentially expose your own “dumb” mistakes.


When you make mistakes yourself, learn from them and move on. Don’t beat yourself up about it.

Rule 2: “Just keep swimming”

Can we eliminate errors? Obviously not. Can we reduce them by taking some simple steps? Easy. Re-read that document. Use a spel-cheker (sic!) and/or gramr-cheker (sic!). Double-check your sources. Don’t assume things – this one is important. I have often noted cases where I “knew” something that turned out not to be true, so I always try to check my facts, even where I’m talking about things that “everyone knows”. As an example, when I “finish” a blog-post, I generally leave it for a few hours before re-reading it (at least twice) to tweak it prior to posting. That certainly won’t catch everything, but it really helps!

Rule 3: “Authority = Responsibility”

When you say something, be responsible for it. Make the effort to ensure that it’s as correct as you can make it, and fix it when it’s wrong. That’s how you build credibility and trust.

So, what sort of errors are we talking about here, anyhow?

Any, really. Errors range from the trivial to the catastrophic and from the “obvious” to the incredibly subtle. Let’s look at an interesting example:

The Mars Climate Orbiter (https://en.wikipedia.org/wiki/Mars_Climate_Orbiter) was launched on 11-Dec-1998, and went out of radio contact on 23-Sep-1999. The root cause was a software problem – not a coding error, as all of the software worked as designed, but an interface error: two components used by the navigation system worked in different units. One calculated the total impulse produced by the thrusters, while the other used that output to calculate the trajectory and update the predicted position of the spacecraft.

Both components were supposed to use newton seconds, but the thruster calculation used pound-force seconds instead. Simply put, the impulse figures were off by a factor of about 4.45, the predicted trajectory drifted away from the real one, and the spacecraft ended up far too close to Mars and was lost. (This is analogous to driving an old car with a speedometer marked in miles per hour on a highway with a speed limit of 100 kilometres per hour.)

This could certainly be considered a “stupid” mistake, but how can you catch this sort of thing, especially when you are dealing with many large teams working for different companies? In theory, you could identify the error in any of a number of ways, but unless you’re looking for just this sort of issue, it’s easy to overlook. (I’ll discuss the importance of formal testing at some point, I expect...)
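To make “a number of ways” a little more concrete: this has nothing to do with the actual flight software, but here is a minimal Python sketch (hypothetical names and numbers) of one such approach – making the unit part of the interface itself, so a mismatch fails loudly instead of silently skewing the trajectory.

# Hypothetical illustration only: carry the unit along with the value across
# the interface, so the consumer can check it instead of assuming it.

LBF_S_TO_N_S = 4.448222  # one pound-force second, expressed in newton-seconds


def thruster_impulse(burn_time_s: float, thrust_lbf: float) -> dict:
    """Producer side: reports total impulse, tagged with its unit."""
    return {"value": thrust_lbf * burn_time_s, "unit": "lbf*s"}


def update_trajectory(impulse: dict) -> float:
    """Consumer side: expects newton-seconds, and checks rather than assumes."""
    if impulse["unit"] == "N*s":
        return impulse["value"]
    if impulse["unit"] == "lbf*s":
        return impulse["value"] * LBF_S_TO_N_S  # convert explicitly
    raise ValueError(f"Unexpected impulse unit: {impulse['unit']}")


if __name__ == "__main__":
    impulse = thruster_impulse(burn_time_s=10.0, thrust_lbf=5.0)
    print(update_trajectory(impulse))  # about 222.4 N*s, not a silent 50.0

Proper unit-handling libraries and strong type systems can do this far more rigorously, and the real fix is ultimately about process (interface specifications, reviews, formal testing), but making your assumptions explicit is a cheap first line of defence.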

It was at this point I started down a rabbit-hole of lists of major military mistakes (https://www.factinate.com/instant/33-of-historys-most-unbelievable-screwups/amp/), such as Napoleon’s invasion of Russia or the Battle of Karánsebes (where two Austrian divisions attacked each other). And then there are lists of business mistakes (https://www.businesspundit.com/25-major-company-screwups/) like New Coke, and wonderful engineering mistakes (https://wonderfulengineering.com/31-engineering-mistakes-that-make-you-wonder-who-gave-them-engineering-degrees/), and don’t even start on the Darwin Awards (https://darwinawards.com/) or the endless memes...

The main thing is that an enormous number of problems can be avoided if people just think, just double-check, just ASK. Once we have that, we can start on the problem of the way many organizations respond to issues – if you encounter a culture in which speaking up is discouraged, that’s a big red flag.

We should always try to be positive and supportive, but some things you just HAVE to laugh at...


Cheers!


NOTE: ‘Fraid I missed a week. I’ve been trying to build up a backlog of posts for such occasions, but it’s been a very busy week, and most of my “homework” time has been spent on updates to my Kanban board. (Still on PHP for the moment, as I am still learning a lot and getting good use out of it.) RG


