The expertise architecture of journalism, and how to use AI to kill it faster

(Or save it, if you want.)

A good way to think about the knowledge economy (a very problematic term, but that’s another post) is to look at the architecture of expertise: who or what knows how to do what, where they are, who controls them, etc. Details matter: humans and software aren’t fungible, form follows finance, and so on, but before we can map changes and possibilities we need to understand the underlying structure.

The expertise architecture of journalism, at least in its ideal Platonic form, looks a bit like this:

  • Informants, whistleblowers, and assorted “contacts” have access to confidential or as-yet-unknown information.
  • Experts have deep knowledge about narrow but relevant fields, from the archetypal background forensic consultant to a newspaper’s on-demand science commentators.
  • Investigative journalists receive initial hints and, through research and analysis, assemble expertise and information into new knowledge: something is happening that you (the reader) didn’t know about, or that is what you thought was happening but this is what’s really happening.
  • Writers and editors (in some cases the writer is the journalist; everything I’m saying here also includes photographers, etc) turn this knowledge into a story; what people will read, watch, or listen to.
  • A set of experts in everything from printing to web infrastructure make this content available to people.
  • People read, listen to, or watch reporting or, to use a very loaded and disturbing term, “consume content.”

This is still the assumed pattern, and it’s one that has shown, over the last century or two, important positive social externalities: if good journalism had no impact, there’d be no censorship.

But when you look at it as a pure financial play, the standard journalistic expertise architecture has serious scalability problems: journalistic analysis is a very slow, expensive, and inefficient way to “generate content.” You can create a listicle in a tiny fraction of the time and cost of a reasonable report on any issue, or even of a simple gathering of quotes and some generic comments, and it’s not clear that it’ll have a worse financial return.

So, in the infinite wisdom of the efficient market, the expertise architecture has moved towards something simpler and cheaper.

Not everywhere and for everything — we still have expert investigative journalists and deeply researched and important stories — but the tendency leans towards ever-lighter newsrooms.

And as nothing says “scalable business model” like social media (not entirely true, but that’s, yes, another post), contemporary journalism as a business tries to look as much like a social media operation as it can.

Here “media” is simply “where you put your ads and trackers.”

You can see the problem right away. Not everybody can write well, not even remotely, but everybody can post, and it’s not as if social media favors the long form anyway, so:

  • If experts and informants can post directly to Twitter, Medium, Substack, etc…
  • … and your journalistic work is mostly aggregation…
  • … sometimes of things first posted on social media…
  • … why go through it?

This isn’t to say, again, that journalists aren’t adding value, or that a lot of what ends up on social media would have gotten there without a journalist doing hard, specialized, and sometimes dangerous work. But this is hard, specialized, sometimes dangerous work that is, increasingly, badly paid and supported: the less specialized value the journalistic analysis part of the process adds to content, the worse it competes with amateur, disingenuous, or direct publication, and the fewer resources end up allocated to it. This is a vicious circle of cost-cutting-driven self-destruction that isn’t unique to this industry, but is rather a result of mistaking the accounting map for the activity territory (yep, another post).

Then generative AI came along, and media owners all over the world rejoiced and innovated, plugging it straight into the content-generation step.

It didn’t work. It’s probably worth being annoyingly explicit about why:

Content generation was never the competitive advantage and value of journalism: therefore, throwing technology at that part of the process could not, did not, and will not help journalism.

Let’s put on our optimism hats — I do have one — and turn the proposition around:

The specific expertise of a certain form of analysis is the competitive advantage of journalism: therefore, the more you strengthen that, the less vulnerable you are to race-to-the-bottom content generation and traffic maximization economics.

To be clear, this isn’t, and there isn’t, a recipe to turn journalism into the sort of infinitely scalable business model that attracts 100x-or-bust VC capital. It’s always going to be slow and expensive, and it’s never going to have explosive growth possibilities. But granted that, it can be sustainable, even under current pressures.

To see how to use technology to help this, let’s go back to the original expertise architecture sketched above.

Once we understand what makes journalism both useful and viable, it’s clear where we need to use AI: to improve journalistic analysis itself. Everywhere else we’d be fixing the wrong problem, or at best not helping the main one. Luckily, AI is just as good at helping build deeper and more scalable analytical expertise as it is at helping generate content! Even more so, to be honest, although this is less well known because building and testing expert models about economics, politics, etc., is less impressive, or less viral, than putting in a prompt and getting a photo.

This doesn’t mean that journalists have to become data analysts (data analysis is just one aspect of AI building, just as it’s just one aspect of journalism) but rather that, when you zoom out:

  • A large part of expertise nowadays consists of leveraging your specific know-how, plus other people’s data and AIs, plus other people’s AI-building expertise, to develop your own expert AIs; ask software engineers, astrophysicists, or video professionals.
  • Journalism is a form of expertise.
  • So.

What is the specific expertise of journalists? If you’re reading this chances are you know this better than I do, but at the very least it involves an understanding of hidden relationships — the new law and the recent political donation, the economic trend and the ecological cost — and sound criteria of relevance and proof. Just like epidemiologists and political scientists, journalists know things about the world, and ways to find things about the world, that other disciplines don’t. Much of this, in fact, consists of knowing which experts to call upon about what and how to put the pieces together.

This is something you can build software to help you do better and faster. Not just at individual newsrooms, but as an industry: insofar as we can put expertise right now not just inside human minds and texts but also in software, “asking an expert” might also be “interfacing with an expert’s systems.” Why not have industry-wide

  • “who said what when” databases?
  • networks of known or suspected assets of people of public interest?
  • expert comment or “grading” of new scientific reports (sketched below)?
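
To make the first of these concrete, here is a deliberately minimal sketch, in Python, of what a shared “who said what when” resource might look like from a newsroom’s side. Everything in it (the Statement record, the who_said_what query, the example entries) is a hypothetical illustration, not an existing system or API; a real version would be a database maintained across newsrooms rather than an in-memory list.

```python
# A deliberately tiny sketch of a shared "who said what when" resource.
# Everything here (field names, the query function, the example entries) is a
# hypothetical illustration, not an existing system or API.
from dataclasses import dataclass
from datetime import date


@dataclass
class Statement:
    speaker: str     # public figure or institution
    quote: str       # what was said, as recorded
    said_on: date    # when it was said
    source_url: str  # where the record comes from (ideally a primary source)
    topic: str       # coarse tag so queries work across outlets


# In a real shared resource this would be a database maintained across
# newsrooms; a plain list keeps the sketch self-contained and runnable.
ARCHIVE = [
    Statement("Minister X", "We will not raise the tax.", date(2023, 3, 1),
              "https://example.org/transcript-1", "tax policy"),
    Statement("Minister X", "The tax increase is unavoidable.", date(2024, 6, 9),
              "https://example.org/transcript-2", "tax policy"),
]


def who_said_what(speaker: str, topic: str) -> list[Statement]:
    """Return everything a speaker is on record saying about a topic, oldest first."""
    hits = [s for s in ARCHIVE if s.speaker == speaker and s.topic == topic]
    return sorted(hits, key=lambda s: s.said_on)


if __name__ == "__main__":
    for s in who_said_what("Minister X", "tax policy"):
        print(f"{s.said_on}: {s.quote} ({s.source_url})")
```

The value of something like this isn’t the code, which is trivial; it’s the shared maintenance of the records across outlets, so that the expensive part (verifying who said what, when, and where) is done once and queried many times.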

Not all of these are sources of competitive advantage between media companies. For example, most media reporting on new developments in science and technology is downright bad: you can tell that the writer didn’t have the training or, most likely, the time to even do a cursory reading of the paper (or perhaps the editorial mandate was for traffic over accuracy, but that’s a different sort of problem). It would be straightforward to set up an industry-wide common resource, or perhaps link with or create NGOs and other civil society organizations, to do this much better and with little or no added cost.

And these are the silly, conservative ideas! Paleontology is much more, and much more sophisticated, than catalogs of fossils, and political science has more complex models than voting records. I can’t describe what AI versions of specific journalistic analytical expertise will look like because they haven’t been built yet, or at least haven’t been widely reported on yet (ironically).

Under the hood, the mathematics might look somewhat like the approach in this paper, which describes a simple way to augment a classic optimization algorithm with expert input to improve lithium-ion battery design. Chemistry isn’t journalism, of course, but the future of expertise is necessarily going to be (or rather, the cutting edge of expertise already is) mixed teams of AIs and humans. Under the twin pressures of epistemic chaos in social networks and punishing financial strategies, the only way forward for journalism as a socially relevant body of expertise is through increased sophistication, not lowered costs.
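
To give a flavor of what “augmenting a classic optimization algorithm with expert input” can mean in general (and only in general: this is not the cited paper’s method, just a toy of the same shape), here is a small self-contained Python sketch in which a plain random search spends its expensive evaluations only on candidates that an expert-supplied prior considers plausible. The objective, the plausibility function, and all the numbers are made up for illustration.

```python
# A generic illustration of "classic optimizer plus expert input": random search
# where an expert-supplied plausibility score decides which candidate gets the
# expensive evaluation. This is NOT the method of the paper mentioned above;
# every function and number here is made up to show the general shape.
import random


def objective(x: float, y: float) -> float:
    # Stand-in for an expensive evaluation (a lab experiment, a long analysis...).
    return -((x - 2.0) ** 2 + (y + 1.0) ** 2)


def expert_plausibility(x: float, y: float) -> float:
    # Encodes hypothetical domain knowledge: the expert believes good solutions
    # have x > 0 and |y| < 3. Returns a weight used to rank candidates.
    return (1.0 if x > 0 else 0.1) * (1.0 if abs(y) < 3 else 0.2)


def expert_guided_search(n_iter: int = 200, seed: int = 0):
    rng = random.Random(seed)
    best, best_val = None, float("-inf")
    for _ in range(n_iter):
        # Propose more candidates than we can afford to evaluate...
        proposals = [(rng.uniform(-10, 10), rng.uniform(-10, 10)) for _ in range(10)]
        # ...and let the expert's prior pick the one worth the expensive call
        # (the tiny noise term only breaks ties between equally plausible candidates).
        x, y = max(proposals, key=lambda p: expert_plausibility(*p) + rng.random() * 1e-3)
        val = objective(x, y)
        if val > best_val:
            best, best_val = (x, y), val
    return best, best_val


if __name__ == "__main__":
    # The best point found lands near (2, -1), inside the region the expert favored.
    print(expert_guided_search())
```

The point isn’t this particular algorithm; it’s that the expert’s knowledge enters the loop as a first-class component of the method, not as a comment added after the fact.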

This form of federated expertise architecture is, I think, a possible future not just for journalism but also for civil society, and journalism can help catalyze it: moving from the current setup, where experts and informants post directly and journalistic work is mostly aggregation, to an augmented one.

This is more complex because it’s an augmentation, not a replacement. It adds more players — AIs — in the right place.

It’s not journalism as gatekeeper of expertise (that’s no longer viable) but rather journalism as a provider of specific expertise. If the industry can recognize the uniqueness of its know-how and, instead of divesting from it to invest in new technology elsewhere, dedicate technological resources to deepening it, then it might not just save itself and its irreplaceable role in society, but also spearhead, and not for the first time, wider positive changes.

If I could describe exactly what that would look like, it wouldn’t be innovation. But the tools are there, the problem is clear, and the stakes are as high as they can be.