AIs as societal zero-day exploits

2022-12-20

The short-term stability of any system — a computer network, a market position, a political configuration — depends crucially on a game-theoretical calculus of difficulty: a position is considered secure if attacking it requires more resources (money, technology, people, time) than opponents have access to, or than it would make sense for them to deploy.

AIs are obviously a new move in this game. They change, sometimes dramatically, what can be done with a given level of resources. New tradeoffs between people, skill, time, and money become possible, and companies, governments, and other organizations are either applying or suffering these new tools, with the corresponding changes in relative power (including, very often, a deepening of existing inequalities).

But seeing AIs only as a new move in the existing game is a lethal underestimation of their usefulness.

Recall that political and economic competition might be partially modeled as a game in the mathematical sense, but if so, it's a game characterized not by the enormous complexity of playing it but by the enormous complexity of figuring out what the possible moves are. Essentially, using AI as a technology has less meta-strategic impact than using AI to understand the space of possible technologies. I'm using "technology" here in the broadest possible sense, and perhaps even that is too narrow. Better to put it this way: in a military metaphor, we use AIs to build weapons, not to figure out strategies, and the most important wars have been won by the right strategy (often "right" in the sociopolitical sense; you don't need to win any battle to win a war).

Or in yet another formulation: using AI to design products is more powerful than using AI as a product, and using AI to figure out what products you need to build is even more powerful than using it to design them.

A conventional narrative for the impact of AI pictures widespread automation and anthropomorphic agents. A more interesting one, which I wholeheartedly recommend to your consideration, points to a qualitative speedup and increased complexity in every form of process and design: not self-driving cars, but cars that couldn't be designed or produced without state-of-the-art AIs.

There's a third impact of AI that's harder to spot and even harder to sell, but (and therefore) immensely more powerful: a change not in the tools but in the way the game is understood. Victories come not from better technology but from a deeper, richer understanding of what's going on. Deploying AIs under the strategic guidance of dashboard-assisted humans is precisely the wrong way to think about both.

I've said this before, and it's getting clearer by the week: you will know a competitor is qualitatively outthinking you not when you can't replicate their products but when, even with the benefit of hindsight, you can't replicate the thought process that led to them.
