Facebook: Calls to Violence, Yes, but Just No Nipples


The controversy over the hate and violence fantasies in social networks shows that digital society must change.

It was an illuminating experiment, a clever provocation. On Sept. 8, journalist Joachim Dreykluft was enraged once again. He is annoyed with Facebook because of all the hate commentary in the face of the refugee crisis, the gassing and murder fantasies that remained despite complaints and protests.

Dreykluft writes an online commentary criticizing Facebook as “asocial and hypocritical,” and links the article with a thumbnail photo showing the easily recognizable nipples of a woman. He knows he is violating the so-called community standards, because on this universal platform, recognizable nipples are considered grounds for banning a post as inappropriate.

And in fact, the picture was removed, his Facebook account was locked, and he received the message, “You recently posted something that violates Facebook guidelines.” It apparently doesn’t really violate the guidelines when one wishes death for the refugees, wants to transport them to gas chambers in Auschwitz, fantasizes about shooting them with one’s own hand, or likes the picture of little Aylan Kurdi found drowned on the beach of the Turkish city of Bodrum.

The Question of Responsibility

Comments like these remain, to the dismay of many, even after complaints that must be followed up for legal reasons. Granted, after last week's debates and the intervention of politicians, Facebook announced the formation of a task force, and users were assured that every report was being taken very seriously. And yet the question remains, despite all good intentions: How can it be that such threats of violence are left standing in spite of repeated complaints, while a bare breast, or a picture of fully exposed buttocks (a guideline enforced no less rigorously), is deleted and punished with temporary banishment from Facebook?

The first answer is that it could become uncomfortable and expensive for Facebook (which posted an annual profit of almost $3 billion in 2014) to enlarge its invisible army of digital trash sorters, often outsourced to other countries. The second answer comes down to the differently rigorous understandings of freedom of expression in Germany and in the U.S. when it comes to which propaganda is permitted and which prudery prevails.

The third answer leads directly to Facebook's own contradictory universe of argument: On the one hand, the company pursues pictures of nipples and buttocks with considerable energy; on the other hand, it strictly rejects any editorial responsibility. No, it does not want to intervene. Greg Marra, responsible for Facebook's news feed algorithm and therefore the most powerful newsmaker in the world, told The New York Times, “We don’t want to have editorial judgment over the content that’s in your feed.”

One can sneer at such “technocratic posturing” (Evgeny Morozov) as a somewhat bewildering evasion of responsibility. But hidden behind the recently experienced “clash of the codes,” the current wrestling over normative certainty in a pool of colliding values, lies the core question of digital modernity: Who is responsible for the quality of publishing behavior?

In another time, it appeared rather clear. There were powerful gatekeepers in the form of journalists, institutionally identifiable journalistic centers of power with enormous interpretational sovereignty that could decide what got published or what did not.

Three Worlds of Information

Today, everyone who has access to the Internet and consumes media lives in three interwoven worlds of information; they are the product of selection decisions of varying transparency and varying degrees of self-determination.

Naturally, the world of mass media and classical journalism is, rumors of its demise notwithstanding, still very present, even if it is losing influence to declining advertising revenues and changing user habits. The selection decisions that govern here are at least known in principle, because so-called news values have been researched, discussed and criticized for years.

On the other hand, there is the world of completely individual selection and publication decisions that is open to anyone in the digital age. One must, and can, decide for oneself which sources one trusts, what one posts, shares and publicizes, and which bubble of reality one possibly Googles oneself into.

Tunnel of Self-Affirmation

The third information world arises from the non-transparent selection decisions and publication standards employed by search engines and social networks. Often, algorithms govern here; secret recipes for the construction of reality that advance some news items and let others disappear.

As online activist Eli Pariser has shown, these mechanisms can lure individuals into a tunnel of self-affirmation, a filter bubble of self-interest that they may eventually mistake for an accurate portrayal of the outside world.

Seen this way, it is no longer merely a matter of Facebook's ignorance; the debate that has now emerged must turn to the three powers behind publishing: the individual, classical journalism, and the digital monopolists.

The goal would be the expansion of the journalistic zone of responsibility, the gradual change of digital society into an “editorial society” (Cordt Schnibben) that, alongside private enjoyment, orients itself to the guiding principles of educational information brokering.

Schools and Universities as Laboratories

That would mean teaching a contemporary, normative understanding of how publishing comes about in schools and universities. These institutions can see themselves as a laboratory: a protected sphere, yet one shaped by the current media reality, where the mechanisms of publishing can be studied and put to the test.

Orienting ourselves toward an editorial society, however, also means that the ridiculous excuses of those who are reluctant to delete overt hate and violence-inciting propaganda, even as the complaints pile up, are no longer politically or legally tolerable. And it means, above all, that no one who posts and publishes today can act as if the quality of what is published were not his concern.
