Essential Complexity versus Accidental Complexity

This axiom by Neal Ford[1] on the 97 Things wiki:

It’s the duty of the architect to solve the problems inherent in essential complexity without introducing accidental complexity.

should be etched onto the monitor of every designer and architect.

The reference is to software architecture, but the axiom applies to designing and building systems in general. The concept is a special case of 'build the simplest system that solves the problem', and it is related to the hypothesis I proposed in Availability, Complexity and the Person-Factor[2]:

When person-resources are constrained, highest availability is achieved when the system is designed with the minimum complexity necessary to meet availability requirements.
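
Some back-of-the-envelope arithmetic makes the connection between complexity and availability concrete. If a request has to traverse a chain of components that fail independently, the availabilities multiply, so every added moving part eats into the availability budget. A quick illustration in Python, with made-up numbers:

    def serial_availability(*components):
        # Availability of a chain of independent components in series:
        # the individual availabilities multiply.
        result = 1.0
        for a in components:
            result *= a
        return result

    # Three components at 99.9% each:
    print(f"{serial_availability(0.999, 0.999, 0.999):.4f}")  # 0.9970

    # Ten components at 99.9% each:
    print(f"{serial_availability(*[0.999] * 10):.4f}")        # 0.9900

Ten three-nines components in series leave you with barely two nines end to end, before anyone even has to operate the thing.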

Over the years I've seen systems that badly violate the essential complexity rule. They tend to be systems that evolved over time without ever really being designed, or systems where non-technical business units hired consultants, contractors or vendors to deliver 'solutions' to their problems in an unplanned, ad hoc manner. Others are systems that I built (or evolved) myself, with only a vague idea of what I was building.

The worst example I've seen is a fairly simple application that essentially manages lists of objects and equivalencies (i.e., object 'A' is equivalent to object 'B') and allows various business units to set and modify the equivalencies. The application builds the lists of equivalencies, lets business units update them, and pushes them out to a web app. Because of the way the 'solutions' evolved over the years under a rotating cast of vendors and consultants, just running the application and its data integration requires COBOL/RDB on VMS; COBOL/Oracle/Java on Unix; SQL Server/ASP/.NET on Windows; Access and Java on Windows; and DCL scripts, shell scripts and DOS batch files. It's a good week when that process works.
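
For contrast, the essential complexity of that problem is tiny: maintaining equivalencies is classic union-find (disjoint set) territory. Here is a minimal sketch in Python; the names and structure are mine, not the actual application's, and it assumes equivalence is symmetric and transitive:

    class Equivalencies:
        # Disjoint-set (union-find) structure tracking equivalent objects.

        def __init__(self):
            self.parent = {}

        def add(self, obj):
            # A new object starts in its own equivalence class.
            self.parent.setdefault(obj, obj)

        def _find(self, obj):
            # Walk to the class representative, compressing the path as we go.
            root = obj
            while self.parent[root] != root:
                root = self.parent[root]
            while self.parent[obj] != root:
                self.parent[obj], obj = root, self.parent[obj]
            return root

        def equate(self, a, b):
            # Declare 'a' equivalent to 'b', merging their classes.
            self.add(a)
            self.add(b)
            self.parent[self._find(a)] = self._find(b)

        def equivalent(self, a, b):
            return self._find(a) == self._find(b)

    eq = Equivalencies()
    eq.equate("A", "B")             # object 'A' is equivalent to object 'B'
    eq.equate("B", "C")
    print(eq.equivalent("A", "C"))  # True, by transitivity

Everything beyond that sort of bookkeeping, plus persistence and a front end for the business units, is a candidate for accidental complexity.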

Other notable quotes from the same essay:

…vendors are the pushers of accidental complexity.

And

Developers are drawn to complexity like moths to flame, frequently with the same result.

The first quote relates directly to a proposal in front of us right now, for a product that will cost a couple hundred grand to purchase and implement, and that will likely solve a problem we need solved. The question on the table, though, is: can the problem be solved without introducing a new multi-platform product, dedicated servers to run it, and the associated person-effort to install, configure and manage it?

The second quote applies to anyone who deploys shiny new technology without a business driver or a long-term plan. One example I'm familiar with is a virtualization project that violated the essential complexity rule: an organization replaced a small number of simple, non-redundant servers with a twenty-VM, five-node VMware ESX cluster, complete with every VMware option and tool, including VMotion. The new (complex) environment was introduced without an overall design, without an availability requirement, and without any analysis of security, cost, complexity or maintainability.

Availability decreased significantly; cost increased dramatically. The moth met the flame.

Perhaps we can generalize the statement by swapping 'developers' for 'geeks':

Geeks are drawn to complexity like moths to flame, frequently with the same result.

A more detailed look at essential complexity versus accidental complexity in the context of software development appears in The Art of Unix Programming[3].


  1. Neal Ford, “Simplify essential complexity; diminish accidental complexity,” 97 Things.
  2. Michael Janke, “Last In - First Out: Availability, Complexity and the Person-Factor,” http://blog.lastinfirstout.net/2008/03/availability-complexity-and-person.html.
  3. Eric Steven Raymond, The Art of Unix Programming, 2003, pp. 339-343, http://www.catb.org/esr/writings/taoup/html/graphics/taoup.pdf.