Worse is better

:: computer

In 1990, Richard Gabriel gave a talk from which Jamie Zawinski later extracted a section called ‘worse is better’, which he distributed widely. It’s strange but, perhaps, interesting how prescient this idea was.

The paper describes two approaches to design¹.

The Right Thing

  • Designs must be simple, both in implementation and interface. It is more important for the interface to be simple than the implementation.
  • Designs must be correct in all observable aspects. Incorrectness is simply not allowed.
  • Designs must be consistent. A design is allowed to be slightly less simple and less complete to avoid inconsistency. Consistency is as important as correctness.
  • Designs must be complete and cover as many important situations as is practical. All reasonably expected cases must be covered. Simplicity is not allowed to overly reduce completeness.

Worse Is Better

  • Designs must be simple, both in implementation and interface. It is more important for the implementation to be simple than the interface. Simplicity is the most important consideration in a design.
  • Designs must be correct in all observable aspects. It is slightly better to be simple than correct.
  • Designs must not be overly inconsistent. Consistency can be sacrificed for simplicity in some cases, but it is better to drop those parts of the design that deal with less common circumstances than to introduce either implementational complexity or inconsistency.
  • Designs must cover as many important situations as is practical. All reasonably expected cases should be covered. Completeness can be sacrificed in favor of any other quality. In fact, completeness must be sacrificed whenever implementation simplicity is jeopardized. Consistency can be sacrificed to achieve completeness if simplicity is retained; especially worthless is consistency of interface.

Today

Today I felt it necessary to complain about a particularly stupid bit of behaviour in a filesystem, & I wrote, without conscious thought:

[…] that something like this is even possible in 2018 means that, really, the sort of computing environment which seemed like it would happen in 1980 and still seemed possible into the late 1990s is just dead: worse is not just better, worse has taken better, killed it, buried it in a pit and erased any memory that it ever existed.

Of course no-one is listening (I expect none of the people I sent this to would even have recognised the term), just as no-one, or no-one who counted, listened to the original paper. But everything that is making modern computing systems so horrible — all the hardware bugs, all the systemic insecurity that is going to cost us very dearly if it hasn’t already, all of it — is because no-one listened, and worse won by default.

Today few people even remember that there was once an option to do things a better way. Soon, no-one will.

Oh, well.


  1. These descriptions are stolen almost directly from the original: any errors I have introduced by rewording things are my own.