There are only two emotions in Facebook, and we only use one at a time

2017-10-29

We are capable of infinite emotional nuance, but Facebook doesn't seem to be the place for it. The data and psychology of how we react emotionally online are fascinating, but the social implications, although not specific to social networks, are rather worrisome.

A good way to explore our emotional reactions to news on Facebook is Patrick Martinchek's data set of four million posts from mainstream media outlets between 2012 and 2016. I focused on news posts from 2016, most (93%) of which had received one or more of the emotional reactions in Facebook's algorithmic vocabulary: angry, love, sad, thankful, wow, and, of course, like.
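
As a rough sketch of that first filtering step (in Python with pandas; the file name and column names below are illustrative assumptions, not the data set's actual schema):

```python
import pandas as pd

# Facebook's reaction vocabulary; the column names are assumed for illustration.
REACTIONS = ["angry", "love", "sad", "thankful", "wow", "like"]

posts = pd.read_csv("facebook_news_posts.csv", parse_dates=["created_time"])

# Keep the 2016 posts that received at least one reaction of any kind.
posts_2016 = posts[posts["created_time"].dt.year == 2016]
reacted = posts_2016[posts_2016[REACTIONS].sum(axis=1) > 0]

print(f"{len(reacted) / len(posts_2016):.0%} of 2016 posts got at least one reaction")
```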

In theory, an article could evoke any combination of emotions — make some people sad, others thankful, others a bit angry, and draw a simple "wow" from yet others — but it turns out that our collective emotional range is more limited. Applying a method called Principal Component Analysis to the data, we find that most of the emotional reactions to an article can be predicted as a combination of just two "hidden knobs".
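
A minimal sketch of that analysis with scikit-learn, reusing the filtered posts from the snippet above (normalizing each post's counts into proportions is my own choice, so the components describe the mix of emotions rather than a post's overall popularity; the original analysis may have preprocessed differently):

```python
from sklearn.decomposition import PCA

# Turn raw counts into per-post proportions (an assumption about preprocessing),
# so the components capture the mix of emotions, not how popular a post was.
proportions = reacted[REACTIONS].div(reacted[REACTIONS].sum(axis=1), axis=0)

pca = PCA(n_components=2)
scores = pca.fit_transform(proportions)

print("explained variance ratios:", pca.explained_variance_ratio_)
for i, component in enumerate(pca.components_, start=1):
    print(f"component {i} loadings:", dict(zip(REACTIONS, component.round(2))))
```

If the two-knob picture holds, one component should load mostly on love and the other mostly on angry and sad, with like, thankful, and wow contributing little to either.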

And that's it. Thankfulness, likes, even that feeling of "wow," are distributed pretty much at random through our reactions to news. What makes one article different from another in our eyes (or, more poetically, to our hearts) is something that makes us love it, and something else that makes us, with roughly equal strength, feel angry or sad about it.

Despite their names, it's not logically necessary for the "strength" of the love dimension to be low when the angry/sad one is high, or vice versa. Remember that these dimensions measure the frequency of different emotional responses; it's easy to imagine a news item that half of its readers will love while the other half finds infuriating or saddening. Such a post would score high on both dimensions at once.

Remarkably, that's not the case:

The graph shows how many news posts, relatively speaking, show each combination of strength in the (horizontal) love and (vertical) angry/sad dimensions. Aside from a small group of posts with zero strength in either dimension, and another, smaller group of more anomalous posts, most posts lie along a straight line between the poles of love and angry/sad: the stronger a post's love dimension, the weaker its angry/sad dimension, and vice versa.
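
A rough sketch of how such a graph could be redrawn from the scores computed above (which component maps to which axis, the binning, and the log color scale are all guesses on my part):

```python
import matplotlib.pyplot as plt
from matplotlib.colors import LogNorm

# Assume the first component is the "love" dimension and the second the
# "angry/sad" one; with PCA the assignment (and sign) should be checked
# against the loadings printed earlier.
plt.hist2d(scores[:, 0], scores[:, 1], bins=100, norm=LogNorm())
plt.colorbar(label="number of posts")
plt.xlabel("love dimension")
plt.ylabel("angry/sad dimension")
plt.show()
```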

Different people have different, often opposite, reactions to the same events. Why, then, is our emotional reaction to news about them so homogeneous? The answer is likely audience segmentation: each news post is seen by a fairly homogeneous readership (the media source's target audience), so that readership's reaction to the article will also be homogeneous.

In other words, one possible indicator that people with different preferences and values do read different media (and/or are shown different media posts by Facebook) is that the reactions to each post, whether love or its statistical opposite, are more homogeneous than they'd otherwise be. If everybody at a sports game is either cheering or booing at the same time, you can tell that only one group of fans is watching.

It's common, but somewhat disingenuous, to blame recommendation algorithms for this. Whenever there have been two TV stations in an area or two newspapers in a city, each has tended to attract its own audience, shaping itself to that audience's interests as much as it influences them. The fault, such as it is, lies not in our code, but in ourselves.

Two things make algorithmic online media in general, and social networks in particular, different. First, while they are resistant to certain classic forms of manipulation and pressure (e.g. censorship via a phone call to the TV network's owner; the exception being places like China, where censorship mechanisms are explicitly built into both the technology and the regulations), they are vulnerable to new ones (content farms, bots, etc.).

Second — and this is at the root of the current political kerfuffle around social networks — they need not be. Algorithmic recommendation is increasingly flexible and powerful; while it's unrealistic to demand things like "no extremist content online, ever," the dynamics of what gets recommended and why can be, and are, continuously modified and tweaked. There's a flexibility to how Facebook, Twitter, or Google work and could work that newspapers don't have, simply because networked computers are infinitely more programmable than printing presses and pages of paper.

This puts them in a bind that would deserve sympathy if they weren't among the most valuable and influential companies in the world, and utterly devoid of any instinct for public service until their bottom line is threatened: whatever they do or don't do risks backlash, and there's no legal, political, or social agreement as to what they should do. It's easy to say that they should censor extremist content and provide balanced information about controversial issues — in a way, we're asking them to fix bugs not in their algorithms but in our own instincts and habits — but there are profound divisions in many societies about what counts as extremism and what counts as controversial. To focus on the US: when first-rate universities sometimes treat white supremacism as a legitimate political position, and government officials in charge of environmental policy consider the current global scientific consensus on climatology a very much undecided matter, there's no politically safe algorithmic way to de-bias content... and no politically safe way to just wash your hands of the problem.

Social networks aren't powerful just because of how many people they reach, or how much, how fast, and how far they can amplify what gets said on them. They are unprecedentedly powerful because they have almost infinite flexibility in what they can show to whom, and how, and new capabilities can always unsettle the balance of power. Everywhere, from China to the US to the most remote corners of the developing world, we're in the sometimes violent process of working out what this new balance will look like.

"Algorithms" might be the new factor here, but it's human politics what's really at stake.