
Last Sunday, Arjen Lubach tried to get someone from Facebook on his television programme, “Zondag met Lubach” (“Sunday with Lubach”). He wanted them to take responsibility for the choices the platform makes regarding publishing and censoring news stories. The comparison Lubach draws between Facebook and newspapers, however, does not hold up. But Facebook’s own response rests on a judgement error as well.

Why is Arjen Lubach wrong?

Arjen also asked the editors in chief of several large newspapers to defend their selection of news stories. Almost all of them were willing to do so.

They were not only willing, but also able, because they defend the choices and the work of their editorial staff. That staff is responsible for the content of its outlet; it decides what “news” is.

Facebook has a different setup. There is no massive team of editors publishing news: all content is created by the readers, not by the platform itself. On Facebook, “news” isn’t defined by a centralised editorial team, but by which posts are clicked, linked, and liked more than others. The determination of news value is, more or less, a democratic process.

Presumably, this is what Facebook’s spokesperson meant when she said they didn’t have an editor in chief to take responsibility. According to her, the news is generated by an algorithm.

An organisation can (and must) take responsibility for its choices, as long as it makes those choices itself. For newspapers and other outlets, those choices are the editorial team’s job, and so the editor in chief can take responsibility. The choices on Facebook are made by the users. This means that Facebook, as a platform, can’t be held accountable for the quality of the news.

This reasoning puts them in a very convenient position: almost anything can happen on their platform, and they can’t be blamed for it.

But that’s a little too easy for Facebook.

Why is Facebook wrong too?

Even though they publish almost nothing themselves, they certainly influence what gets read and what doesn’t. They do this in at least two ways:

The algorithm

Facebook has an algorithm that determines how many clicks, links, and likes it takes to put a certain post in the spotlight – to make it news. This algorithm is, of course, Facebook’s responsibility. Even if it weren’t an algorithm based on fixed criteria, but on something like artificial intelligence, they would still be accountable for it. They are responsible not only for the fact that it works, but also for the way it works.
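To make this concrete: Facebook’s actual ranking criteria are not public, but a rule of this kind could, in its most naive form, look like a simple engagement score. The weights and threshold below are purely illustrative assumptions, not Facebook’s real ones:

```python
# Hypothetical sketch only -- Facebook's real criteria and weights are not public.
# The idea: weight clicks, links (shares), and likes into one score, and treat
# the highest-scoring posts as "news".

def engagement_score(clicks, links, likes):
    # Assumed weights, chosen arbitrarily for illustration.
    return 1.0 * clicks + 3.0 * links + 2.0 * likes

posts = [
    {"title": "Post A", "clicks": 120, "links": 4, "likes": 30},
    {"title": "Post B", "clicks": 40, "links": 20, "likes": 80},
]

# Surface posts by descending engagement score.
ranked = sorted(
    posts,
    key=lambda p: engagement_score(p["clicks"], p["links"], p["likes"]),
    reverse=True,
)

for p in ranked:
    print(p["title"], engagement_score(p["clicks"], p["links"], p["likes"]))
```

The point of the sketch is that every choice in it – which signals count, and how heavily – is a choice someone made, which is exactly why the algorithm’s owner can be held responsible for the way it works.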


The community guidelines

Because Facebook has community guidelines that dictate what can and can’t be published, Facebook influences its content. Making rules in the first place is something they can be held accountable for.

Enforcing these rules is another thing that requires some scrutiny. That can either come from another algorithm, or from a team of human beings – in both cases, it is Facebook’s responsibility.

A different role for the editorial team

An editorial team needs to be able to take responsibility for its actions – that is, for the choices it makes itself. For newspapers and other outlets, those are the things they produce, publish, and broadcast; for Facebook, they are the filters it applies to give more or less attention to certain publications, or even to delete them altogether.

In short: Lubach’s comparison doesn’t hold up, and neither does Facebook’s response.