A throwaway note on the rhizome of all evil

2022-08-05

Premature optimization is no longer the root of all evil - how many programmers dive deep to optimize their programs at all, let alone too early rather than when forced to? A far more common sin these days is premature scalability: building infrastructure as if hordes of users throwing terabytes of data our way were lurking behind every release. By the time we're done with the platform, the data gathering, and the event processing pipelines, we're left with little time and fewer resources to tackle whatever problem they were all meant to solve. Powerful pipelines moving data into questionable linear regressions that drive secondary product features are endemic.

But the point of a rhizome is that everything's connected: premature optimization and premature scalability are both aspects of the same fascination. It used to be that computers were culturally defined by a mystique of speed (metaphors about the speed of light, the obsessive counting of MHz). Nowadays it's a mystique of scale, not even in definite numbers but in that vague nirvana of a "Cloud-scale service." Programmers (and their managers, and their managers' investors) being as driven by dreams as any other group of humans, it was natural to fall to the temptation of optimization then, and of scalability engineering now, as each feels like the natural, "true," important aspect of computing.

In this spirit of dialectic contradiction, I'd like to offer a mystique of superintelligence: the equally false but less widespread idea that the natural, "true," important aspect of computing is superhuman performance at a task. If you fall for this mystique then you will, prematurely and unnecessarily, focus on making programs that, unable to run on anything more distributed than the cores of your laptop's CPU and taking an eternity of seconds to respond to any request, solve a task better than a human could.

Evil would come from this, too, in the form of software that exceeds requirements in some dimensions while falling short in others. But, being a newer form of error, it would have the advantage of benefiting from the spoils of our previous errors: we have so many tools for, and so much experience with, speed and scalability that it's far easier to make a smart program scalable and fast than to make a fast, scalable architecture much smarter than it is.

Speed, scale, or anything else: the oldest and best tradition in computing isn't any particular dimension of development, but rather the self-indulgent proclivity to do the newer and harder thing first. Today that's not measured by the volume of data, but by the Elo rating of what you built to play on the board of the world.