Don't be afraid of butterflies

2022-12-22

"Chaotic" doesn't mean "unpredictable" (just like being unlikely doesn't make something a black swan). The informal version of the mathematical definition simply requires a positive Lyapunov exponent; in other words, two arbitrarily close states will diverge exponentially over time, and because you can't really know the precise state where you are in most useful contexts, any prediction you make is going to get exponentially wrong over time.

So far so chaotic.

However, what makes engineering interesting (including cognitive engineering) is that, unlike in most mathematics, constants matter. "Things get exponentially wrong over time" sounds bad, and it is, but if the initial difference is small enough, the exponent is small enough, and you only need predictions over a short enough horizon, it's perfectly fine. The only real constraint imposed by exponential divergence is that you don't get second chances: the difference between "almost too late" and "way too late" can be very small.
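To make "constants matter" concrete, the usual back-of-the-envelope estimate (same notation as above, with δ₀ the initial uncertainty and Δ the largest prediction error you can tolerate) puts the usable horizon at

    \[ T \approx \frac{1}{\lambda}\,\ln\frac{\Delta}{\delta_0}. \]

Every tenfold reduction in δ₀ buys the same fixed amount of extra horizon, (ln 10)/λ, so small uncertainties and small exponents genuinely help; the catch is that the horizon only grows logarithmically in the precision you pay for, which is exactly why being slightly late is so expensive.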

You can think of it as a linearization trick for chaotic systems: whenever you have to deal with one, simply shift the characteristic time scale in which you perceive, decide, act, and benefit from the action to one short enough that divergences are manageable (just make sure you never miss). The word "simply" here carries a lot of weight — in many cases the only way to perceive-process-act quickly enough on a constant basis is to shift control towards fast software systems, even in places where you'd rather not, and it often takes a very different conceptual vocabulary and understanding of the world (high-frequency trading, for example, isn't just faster day trading). The advantage, of course, is that you can successfully exploit environments that are considered too unpredictable for software and mathematics, and are therefore left in the hands of human intuition at human time scales.
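Here is a minimal toy sketch of that time-scale shift, using the logistic map as a stand-in chaotic system (the map, the 1e-9 measurement error, and the 50-step horizon are illustrative assumptions, not a real trading or control setup): an open-loop forecast from a slightly wrong state becomes useless, while the same predictor re-anchored to a fresh measurement at every step stays accurate.

    # Toy illustration: the logistic map at r = 4.0 is chaotic, so a long
    # open-loop forecast from a slightly wrong state diverges, while a
    # forecast re-anchored to a fresh (equally noisy) measurement every
    # step never lets the divergence compound.

    def step(x, r=4.0):
        """One iteration of the logistic map x -> r * x * (1 - x)."""
        return r * x * (1 - x)

    def forecast(x0, horizon, r=4.0):
        """Open-loop forecast: iterate the map `horizon` times from x0."""
        x = x0
        for _ in range(horizon):
            x = step(x, r)
        return x

    MEASUREMENT_ERROR = 1e-9  # assumed tiny uncertainty about the true state

    # Long-horizon, open-loop prediction: the tiny initial error blows up.
    true_x = 0.6
    true_future = forecast(true_x, 50)
    predicted_future = forecast(true_x + MEASUREMENT_ERROR, 50)
    print(f"50-step open-loop error:          {abs(true_future - predicted_future):.3f}")

    # Short-horizon, re-anchored prediction: re-measure before every one-step
    # forecast, so the exponential divergence never gets time to compound.
    x = true_x
    worst_one_step_error = 0.0
    for _ in range(50):
        measured = x + MEASUREMENT_ERROR   # fresh, slightly wrong measurement
        predicted_next = step(measured)    # predict only one step ahead
        x = step(x)                        # the real system moves on
        worst_one_step_error = max(worst_one_step_error, abs(x - predicted_next))
    print(f"worst one-step re-anchored error: {worst_one_step_error:.2e}")

The divergence is still exponential in both cases; the second loop just never gives it time to compound, which is the whole trick, and also why missing a single cycle is not an option.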

And that's when things get exponentially wrong.
