Security Through Adversity?
When we recently moved to a new home, I had to get a lawnmower. Having experience with petrol-powered ones, I knew I wanted an electric mower. It has worked well and I enjoy not having to smell exhaust fumes. But the Makita designers have made an odd design choice.
The safety handle that needs to be held for the mower to keep running is incredibly stiff. When I use the mower, my hands and fingers quickly start hurting. It’s really uncomfortable, and I’m surprised at how strong people’s grips must be in Japan. I’m not the only one who feels this way, and the people I’ve spoken to have their own custom solutions for this. Now I have one too.
After some experimenting, I ended up buying a 2.5 € clamp that holds the handle closed. Now I can mow without finger strain. While the clamp is quick to remove, it’s obviously not as safe as holding the handle with my hands. I’m aware of the increased risk and consider it acceptable.
Yet I can’t help but wonder why the handle was made so difficult to hold. Maybe it’s to deter children from using the mower? But what it leads to is users like me circumventing it entirely. So this safety feature has ended up making the mower less safe.
This makes me think of a parallel in the IT world. In some systems, users’ passwords expire periodically. This is presumably done to enhance security, but the end result is the opposite: because of the added cognitive load, users choose passwords that are easier to remember, reuse old passwords with minor modifications, and sometimes write their passwords down. What started as a security feature has not worked, and in some cases it has made security worse.
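As a toy illustration of that failure mode, consider how predictable those "minor modifications" are. The sketch below is my own, not part of any real password system: a small heuristic that flags a "new" password that is just the old one with a bumped trailing number or an appended suffix, which is exactly the kind of rotation forced expiry tends to produce.

```python
import re

def is_trivial_rotation(old: str, new: str) -> bool:
    """Heuristic check: is the new password just a cosmetic
    tweak of the old one? (Illustrative only.)"""
    # Strip any trailing digits, e.g. "Winter03" -> "Winter"
    strip_digits = lambda s: re.sub(r"\d+$", "", s)
    # Same base word with a different trailing number,
    # e.g. "Winter03" -> "Winter04"
    if old != new and strip_digits(old) == strip_digits(new):
        return True
    # Old password reused verbatim as a prefix,
    # e.g. "Secret!" -> "Secret!1"
    return new.startswith(old)

print(is_trivial_rotation("Winter03", "Winter04"))  # True
print(is_trivial_rotation("Secret!", "Secret!1"))   # True
print(is_trivial_rotation("Winter03", "h7#pQ9zL"))  # False
```

An attacker who has one expired password can run the same kind of transformation in reverse, which is why a rotation policy often buys far less security than it appears to.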
The lesson here is to make it natural for users to do the right thing. If your safety or security feature is onerous, users will look for ways around it. At best you’re inconveniencing your users; at worst you’re turning the feature on its head and making the result less secure.