Epiphenomena

Enlightening post from Jason Pargin

The story is interesting in its own right. YouTube observes responses from users, both to videos listed on their screens and to the videos they actually watch, runs some Machine Learning models¹ over that feedback, and selects what to list to them next to keep them watching and engaging. (This is widely understood.)

(In a tiny, tiny fraction of high-profile cases, it then applies human moderation to advance the company’s interests, its political and social biases, and so on. That’s not what I’m writing about today)

As is known, this feedback loop can lead people in some highly unexpected directions. Recreational lock-picking, really? There are also some less mysterious tendencies — any activity is more watchable if it’s being done by attractive young women. But the particular instance Pargin finds — of an innocuous third-world fishing video getting ten times the views if it mildly hints at a tiny bit of indecency that isn’t even really there — would have been very difficult to predict. Note that it’s not as simple as “ten times as many people want to see the videos with the not-quite-upskirt thumbnail”. Because of the feedback, more people get the suggestion to watch that video, and many of them would have watched the other ones just as happily, but didn’t get the opportunity. The behaviour of a smaller number of unambitious creeps is driving the behaviour of a (probably) larger number of ordinary viewers.
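The amplification effect can be made concrete with a toy simulation. All the numbers here are my own assumptions for illustration — a 5% minority of "creeps" who always click the suggestive thumbnail, indifferent viewers who click whatever they are shown 10% of the time, and a crude epsilon-greedy recommender standing in for YouTube's actual (and far more complex) system:

```python
import random

random.seed(42)

# Toy sketch, not YouTube's real system. Assumptions:
# - 5% of viewers ("creeps") click video B (the borderline thumbnail) only
#   when shown B, and never click A.
# - The other 95% are indifferent: they click whatever they are shown
#   with probability 0.10.
# - The recommender is epsilon-greedy: 90% of the time it shows the video
#   with the higher observed click-through rate, 10% of the time it explores.

impressions = {"A": 1, "B": 1}  # start at 1 to avoid division by zero
clicks = {"A": 1, "B": 1}
views = {"A": 0, "B": 0}

def ctr(video):
    return clicks[video] / impressions[video]

for _ in range(100_000):
    if random.random() < 0.10:
        shown = random.choice(["A", "B"])      # explore at random
    else:
        shown = max(["A", "B"], key=ctr)       # exploit the current leader
    impressions[shown] += 1
    if random.random() < 0.05:                 # a creep
        clicked = (shown == "B")
    else:                                      # an indifferent viewer
        clicked = random.random() < 0.10
    if clicked:
        clicks[shown] += 1
        views[shown] += 1

print(views)
```

In this model B's click-through rate is only modestly higher than A's (0.145 versus 0.10 per impression), but because the recommender funnels most impressions to the leader, B ends up with many times A's total views — and most of those views come from indifferent viewers who would have watched A just as happily.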

Pargin makes the wider point that this same system of user feedback and ML-generated recommendation is shaping the content across all digital media. Whatever you have to do to get the views, those are the rules, even though nobody chose them and nobody knows what all of them are. If you are in the business, you just have to do your best to learn them.

I want to make a wider point still. We can understand, roughly, how this particular mode of media comes to produce some kinds of content and not others. That does not mean that without this particular mode of media, you get “normal, natural” kinds of content. You just get different incentives on producers, and consequently different content.

It’s not just media, either. Different structures of organisation and information flow produce different incentives for participants, and consequently different behaviour. Financing a business by selling equity into a highly liquid public market produces certain specific behaviours in management. Running a sport where teams can prevent players moving between them produces certain behaviours in the players. Organisations may be designed to incentivise certain desired behaviours, but many others will arise spontaneously because the system as a whole unexpectedly rewards them.

This is what Moldbug means when he says “The ideology of the oligarchy is an epiphenomenon of its organic structure.” We do not have woke ideology because a deep centuries-long woke conspiracy has taken over. We do not have it because someone sat down and worked out that a particular structural relationship between civil service, universities, and television would tend to promote ideological shifts of particular kinds. We have it because a structural relationship was created between civil service, universities, and newspapers, and it turns out that that structural relationship just happens to result in this kind of insanity. You can trace through all the details — the career path of academics, the social environment of civil servants. You can spot historical parallels — this bit Chris Arnade found on pre-revolutionary French intellectuals. Moldbug attributes this epiphenomenon primarily to the separation of power from responsibility. I’m sure he’s right, but it’s a bit like Jason Pargin saying “yes, the internet really is that horny”. The particular ways in which irresponsibility or horniness express themselves in systems are still somewhat unexpected.

Related:

  1. which are not algorithms