Sunday, September 28, 2008

Essential Complexity versus Accidental Complexity

This axiom by Neal Ford[1] on the 97 Things wiki:

It’s the duty of the architect to solve the problems inherent in essential complexity without introducing accidental complexity.

should be etched onto the monitor of every designer/architect.

The reference is to software architecture, but the axiom applies to designing and building systems in general. The concept expressed in the axiom is really a special case of ‘build the simplest system that solves the problem’, and is related to the hypothesis I proposed in Availability, Complexity and the Person Factor[2]:

When person-resources are constrained, highest availability is achieved when the system is designed with the minimum complexity necessary to meet availability requirements.

Over the years I’ve seen systems that badly violate the essential complexity rule. They tend to be systems that evolved over time without ever really being designed, or systems where non-technical business units hired consultants, contractors or vendors to deliver ‘solutions’ to their problems in an unplanned, ad hoc manner. Others are systems that I built (or evolved) over time, with only a vague idea of what I was building.

The worst example that I’ve seen is a fairly simple application that essentially manages lists of objects and equivalencies (i.e., object ‘A’ is equivalent to object ‘B’) and allows various business units to set and modify the equivalencies. The application builds the lists of equivalencies, allows business units to update them and pushes them out to a web app. Because of the way the ‘solutions’ were evolved over the years by various vendors and consultants, just running the application and its data integration requires COBOL/RDB on VMS; COBOL/Oracle/Java on Unix; SQL Server/ASP/.NET on Windows; Access and Java on Windows; and DCL scripts, shell scripts and DOS batch files. It’s a good week when that process works.
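For contrast, the essential complexity of that problem is tiny. A minimal sketch (names and structure are my own illustration, not the actual application) of declaring and querying equivalencies, using a standard union-find structure:

```python
class Equivalences:
    """Tracks object equivalencies: declare 'A' equivalent to 'B', then query."""

    def __init__(self):
        self.parent = {}

    def find(self, x):
        # Walk to the representative of x's class, compressing the path as we go.
        # Objects never seen before start as their own representative.
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def declare_equivalent(self, a, b):
        # Merge the two equivalence classes.
        self.parent[self.find(a)] = self.find(b)

    def equivalent(self, a, b):
        return self.find(a) == self.find(b)


eq = Equivalences()
eq.declare_equivalent("A", "B")
eq.declare_equivalent("B", "C")
print(eq.equivalent("A", "C"))  # True, by transitivity
```

Everything beyond something of roughly this shape, plus storage and a front end, is accidental complexity that the parade of vendors and platforms layered on.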

Other notable quotes from the same wiki entry:

…vendors are the pushers of accidental complexity.


Developers are drawn to complexity like moths to flame, frequently with the same result.

The first quote relates directly to a proposal that we have in front of us now, for a product that will cost a couple hundred grand to purchase and implement, and likely will solve a problem that we need solved. The question on the table, though, is: can the problem be solved without introducing a new multi-platform product, dedicated servers to run the product, and the associated person-effort to install, configure and manage the product?

The second quote also applies to people who like deploying shiny new technology without a business driver or a long-term plan. One example I’m familiar with is a virtualization project that ended up violating the essential complexity rule. I know of an organization that deployed a 20-VM, five-node VMware ESX cluster complete with every VMware option and tool, including VMotion. The new (complex) environment replaced a small number of simple, non-redundant servers. The new system was introduced without an overall design, without an availability requirement and without analysis of security, cost, complexity or maintainability.

Availability decreased significantly, cost increased dramatically. The moth met the flame.

Perhaps we can generalize the statement:

Geeks are drawn to complexity like moths to flame, frequently with the same result.

A more detailed look at essential complexity versus accidental complexity in the context of software development appears in The Art of Unix Programming[3].

  1. Neal Ford, “Simplify essential complexity; diminish accidental complexity”, 97 Things.
  2. Michael Janke, “Last In - First Out: Availability, Complexity and the Person-Factor”.
  3. Eric Steven Raymond, The Art of Unix Programming (2003), pp. 339–343.


  1. Hi,
    I have a few questions.
    How do you know whether complexity is essential or accidental?

  2. Here's how I look at it:

    Essential complexity is the minimum complexity that is intentionally designed into a system to meet the requirements that drive the design of the system.

    Accidental complexity is that which is added to a system without a clear requirement for the additional complexity.

  3. Hi Michael,

    Thanks for the clarification.

    So if we introduce new functionality based on a requirement (essential complexity), are we introducing additional complexity?

  4. Yes, in that case you would be introducing additional complexity, but because it is part of a requirement, it is necessary complexity (essential).

    For example, if the availability requirement for a system is 99% availability during business hours, adding clustering and load balancing would be non-essential (accidental) complexity. The availability requirement can be met without the complexity of redundant systems.

    But then if in the future, the availability requirement were to change to 99.9% 24 x 7, clustering and other high availability technology would become essential complexity.

  5. This badly reminds me of a quote from Tristan Bernard, which roughly translates as:
    Marriage allows two people to join efforts to solve problems they would never have run into separately.

  6. Hmm!

    Now I am a little bit clearer.

    The bottom line is to put boundaries around the actual requirement, which is the essential complexity; as soon as we feel we need to provide this or that functionality beyond the actual requirement, we start creating additional complexity.

    That being said, we can say that the bigger the scope, or the more you provide over expectations, the more likely you are to create additional complexity.

    It's like the problem of choices: the more you provide, the more doors you may open for additional complexity. So before providing an additional feature or function, you have to carefully evaluate the problems it might invite in the future.

    Isn't it?