The AI CFO

2023-02-01

Finance, like its cousin Poetry, is the use of precise language to talk about what we don't, and can't, know. Every non-trivial financial situation is riddled with unknowns: How much of what I'm owed will I be paid, and when? What will I sell, made at what cost? What would it cost me to get a loan, and on what terms? How will the world change? How much will my bag of assets and liabilities be worth tomorrow, and what will I know, or still not know, tomorrow about next week?

Much, perhaps most, of financial regulation and practice consists of deliberate, sometimes performative acknowledgment of these deep uncertainties: rules to make it harder to bet in Fate's casino with other people's money, and rough guidelines for writing down specific numbers when neither honest shrugging nor confident precision would do. Even in the closed meeting rooms of most corporations, where strategy is woven and unwoven as quarters come and go, a handful of carefully chosen and boldly developed scenarios are made to stand for the whole multidimensional space of possible futures and their financial implications.

The point to keep in mind is that this way of thinking about finance is a legacy bug inherited from the limitations of paper, presentation slides, and human attention. We slice the complex joint probability distribution of unknowns (interest rates, sales, costs, wages, project timelines, and so on) into one or a few scenarios because that's what the law has been written around, what markets expect to read, and what managers have been trained on; but that's neither the natural way to think about finance nor, any longer, a necessary one.
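To make the contrast concrete, here's a minimal, purely illustrative sketch of what working with the joint distribution instead of a scenario slide might look like. Every number, variable name, and modeling choice in it is invented for the example; the point is only that simulating a hundred thousand correlated futures is now a few lines of commodity code, while a slide deck can hold three.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # simulated futures, instead of three slide-deck scenarios

# Hypothetical joint model of next year's unknowns. The correlation between
# rates and sales is the point: scenario planning usually drops it.
mean = np.array([0.04, 1_000.0])           # [interest rate, units sold]
cov = np.array([[0.0001, -0.5],
                [-0.5, 40_000.0]])         # rates up tends to mean sales down
rates, units = rng.multivariate_normal(mean, cov, size=N).T

unit_cost = rng.normal(8.0, 1.0, size=N)   # cost per unit is uncertain too
price = 12.0
debt = 5_000.0

profit = units * (price - unit_cost) - debt * rates

# Instead of "base / optimistic / pessimistic", report the distribution itself.
print(f"median profit: {np.median(profit):,.0f}")
print(f"5th to 95th percentile: {np.percentile(profit, 5):,.0f} "
      f"to {np.percentile(profit, 95):,.0f}")
print(f"P(loss): {np.mean(profit < 0):.1%}")
```

The punchline isn't the arithmetic; it's that a question like "what's the probability we lose money?" stops being rhetorical once you keep the whole distribution around instead of a few points sampled from it.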

To be precise:

That's an expensive Cognitive CapEx upgrade, even undertaken at prototype scale (a word of advice: it's always better to fully upgrade a very small organization than to do vague things to larger ones; the impact comes from the network effect). It's natural to ask whether the ROI is worth it, for the company in general and for the CFO as an individual in particular.

The answer, of course, is that the competitiveness of unaided individual skill is depreciating by the week, and that the easier software becomes to talk with (often in a literal sense), the cheaper it becomes to replace what you might call human-level financial skill, and therefore the more critical it will be to be able to take advantage of the superhuman-level kind.

I don't know what a highly paid translator will look like five years from now, or what they will be doing; I do know they will be doing it with AIs. Finance won't demand a less significant transformation, but it will likely offer larger rewards.
