How to quantify complexity?

What do we mean when we say A is more complex than B?

Subjective complexity, what we mean when we say a system is complex, has several aspects. First, ignorance: we do not know all the laws or properties. For example, the solar system appeared much more complex before the discovery of gravity. Second, lack of resources: even if we know all the laws and properties, understanding a system with a large number of components may require extraordinary resources, e.g., the computational resources or mental power to calculate the trajectory of an asteroid. Third, lack of time, which is related to the lack of resources: often there is a time window within which one must reason about a complex system.

Herbert Simon (Architecture of Complexity): "Roughly, by a complex system I mean one made up of a large number of parts that interact in a nonsimple way. In such systems, the whole is more than the sum of the parts, not in an ultimate, metaphysical sense, but in the important pragmatic sense that, given the properties of the parts and the laws of their interaction, it is not a trivial matter to infer the properties of the whole."

Quantifying complexity by decomposition

Due to the lack of a formal, mathematical definition of complexity, we humans approximate it with heuristics. In particular, we do so by decomposing the system into interacting components. This implies that any quantification effort requires some understanding of the system itself.

By decomposing a system into components that interact with each other, and thereby producing a conceptual representation of the system, we can quantify its complexity based on that representation.

There is a paradox in this approach of quantification by decomposition: the better we understand a system, the better we can quantify its complexity. That is, for truly complex systems, we may not even have a clue how complex they are. This paradox has deep philosophical roots.

There is also an important caveat about quantification by decomposition. Decomposition implies ``enumeration'': we must be able to list all components, their types, and their interactions. This also means that the sets of components, types, and interactions are countable. How do we decompose a system whose components are uncountable?

Spatial/structural decomposition

Spatial decomposition is perhaps the most common and natural way to understand a system in terms of its components and their relationships. A graph representation can be constructed in which a node represents a component and an edge represents a relationship.

One can think of a spatial representation as a state description. That is, it describes a particular state the system is in.

Temporal decomposition

Temporal decomposition captures the time-varying behavior of the system.

Somehow, we humans like to represent even temporal decomposition with graphs that bear similarity to those representing spatial decomposition.

A computer system is a giant state machine. Its state is represented by the contents of its registers, memory, and storage; it works by transitioning from one state to another.
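This view can be made concrete with a toy sketch (the instruction set and register names below are invented for illustration): the machine's state is a mapping from registers to values, and each instruction is a transition to the next state.

```python
# A toy state machine: state = register contents, transition = one instruction.
state = {'r0': 0, 'r1': 0}

def step(state, instruction):
    """Apply one instruction to a state, producing the next state."""
    op, reg, val = instruction
    nxt = dict(state)  # states are immutable snapshots; copy, then modify
    if op == 'set':
        nxt[reg] = val
    elif op == 'add':
        nxt[reg] = state[reg] + val
    return nxt

# Running a program is just walking the chain of state transitions.
program = [('set', 'r0', 5), ('add', 'r0', 3), ('set', 'r1', 1)]
for instruction in program:
    state = step(state, instruction)
print(state)  # {'r0': 8, 'r1': 1}
```

The sequence of states the loop walks through is precisely a process description; any single snapshot of `state` is a state description.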

One can think of a temporal representation as a process description. That is, it describes a process through which the system changes its state.

Quantifying predictability directly

Information theory provides a powerful way to quantify complexity directly. A system and its components can be treated as random variables. The predictability of a random variable is measured by its entropy. Let X denote the random variable and {x_1, ..., x_n} the possible values it may assume, with probabilities {p_1, ..., p_n}. The entropy is H(X) = -Sum_i(p_i * log p_i). H(X) reaches its maximum when p_i = 1/n for all i, and its minimum (zero) when p_1 = 1 and p_i = 0 for i != 1. Entropy is subjective because it depends on what the subject already knows.
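The formula and its two extremes can be checked in a few lines of Python (a generic sketch, not tied to any particular system):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p_i * log2(p_i)), in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Maximum: a uniform distribution (fair four-sided die) is least predictable.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
# Minimum: a certain outcome carries no uncertainty at all.
print(entropy([1.0, 0.0, 0.0, 0.0]))       # 0.0
# Anything skewed falls in between.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```

Using log base 2 gives entropy in bits; the `p > 0` guard applies the usual convention that 0 * log 0 = 0.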

Given one random variable, the predictability of another is measured by the conditional entropy.
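A small sketch makes the two extremes of conditional entropy concrete; the joint distributions below are made up for illustration:

```python
import math

def conditional_entropy(joint):
    """H(Y|X) = -sum over x,y of p(x,y) * log2(p(x,y) / p(x)).

    joint[x][y] holds the joint probability p(X=x, Y=y).
    """
    h = 0.0
    for x, row in joint.items():
        px = sum(row.values())  # marginal p(X=x)
        for y, pxy in row.items():
            if pxy > 0:
                h -= pxy * math.log2(pxy / px)
    return h

# If Y is fully determined by X, knowing X leaves no uncertainty about Y.
determined = {'a': {'0': 0.5}, 'b': {'1': 0.5}}
print(conditional_entropy(determined))   # 0.0
# If Y is independent of X, knowing X does not help at all: H(Y|X) = H(Y).
independent = {'a': {'0': 0.25, '1': 0.25}, 'b': {'0': 0.25, '1': 0.25}}
print(conditional_entropy(independent))  # 1.0
```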

Objective uncertainty: the Heisenberg uncertainty principle

Quantification by description: Kolmogorov complexity

The Kolmogorov complexity of a system is the length of the shortest program that describes the system, given a language. For a text, it is the length of the shortest program that produces the text.
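Kolmogorov complexity itself is uncomputable, but compressed size is a crude, computable upper-bound proxy for description length. The sketch below (using the standard `zlib` module) shows that a highly redundant string compresses far better than a noisy one of the same length:

```python
import random
import zlib

random.seed(0)  # fixed seed so the "noise" is reproducible

regular = b'ab' * 500  # 1000 bytes with an obvious short description
noisy = bytes(random.randrange(256) for _ in range(1000))  # 1000 noisy bytes

regular_size = len(zlib.compress(regular))
noisy_size = len(zlib.compress(noisy))
print(regular_size < noisy_size)  # True: redundancy admits a short description
```

The redundant string's "program" is essentially ``repeat 'ab' 500 times,'' while the noisy string has no description much shorter than itself.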

Recall the opening question ``what do we mean when we say A is more complex than B?''.

Science as seeking the shortest description

The essence of science: ``the task of science is to make use of the world's redundancy to describe that world simply'' (Herbert Simon, Architecture of Complexity). ``By appropriate "recoding," the redundancy that is present but unobvious in the structure of a complex system can often be made patent. The most common recoding of descriptions of dynamic systems consists in replacing a description of the time path with a description of a differential law that generates that path. The simplicity, that is, resides in a constant relation between the state of the system at any given time and the state of the system a short time later.''
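Simon's recoding can be illustrated with a trivial sketch: an explicit time path of 100 numbers is regenerated exactly by a one-line law plus an initial state, which is a far shorter description.

```python
# The explicit time path: 100 numbers, 1, 3, 5, ..., 199.
path = list(range(1, 200, 2))

def generate(x0, law, steps):
    """Regenerate a time path from an initial state and a transition law."""
    xs = [x0]
    for _ in range(steps - 1):
        xs.append(law(xs[-1]))
    return xs

# The recoded description: x_{t+1} = x_t + 2, with x_0 = 1.
regenerated = generate(1, lambda x: x + 2, 100)
print(regenerated == path)  # True
```

The simplicity lives in the constant relation `x + 2` between each state and the next, exactly as Simon describes.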

Simplicity in the philosophy of science: an interesting survey of why simplicity is often preferred in scientific investigation and how to measure simplicity. The takeaway is: it is not that simple.