By Christian Burger
Hate speech seems to be ubiquitous in social media and comment sections. Online conversations are vulnerable: one or two people drop a few dismissive comments and the debate derails instantly. Even if 95% of the contributions are civilized, the troublemakers draw attention to themselves. Readers and commenters focus on the inappropriate utterances, and suddenly everyone has the impression of a failed online discussion.
Actually, what everyone sees in these situations is just a fraction of the conversation. It is mostly this destructive part that is visible – like the tip of an iceberg. All the other comments, all the constructive contributions, vanish below the surface.

Many media organizations hire staff whose jobs are frequently titled “social media manager” or “community manager”. Community management can be defined as “all methods and activities involved in drafting, building up, leading, managing, practicing, guiding and optimizing virtual communities and their representations outside of the virtual world. Operational tasks and activities refer to the direct interactions with community members” (Source: Veröffentlichung der offiziellen Definition Community Management, BVCM, 2010, translation by Christian Burger).
Digital garbage collection
Due to a lack of resources and know-how, community management was – and in many media organizations still is – reduced to a few repetitive and stressful operational tasks: community managers are charged with keeping comment sections clean of harmful contributions. Thus, the main activity is to search for such content, to focus on aggressive and hateful voices and silence them – by deleting or hiding comments and by banning users. In the best cases, silencing also comes with a moralizing undertone and an explanation of why certain behavior is not tolerated.
Unfortunately, the strategy of silencing has two major disadvantages. On the one hand, it devotes a lot of energy to the uncivilized participants of online conversations. These individuals often seek – and succeed in attracting – the attention of the community managers; one deleted comment can already fulfill their desire for recognition. On the other hand, there are notable technical limitations to banning users: in most cases it is quite easy to register a new account and rejoin the conversation hidden behind a new nickname.
Devoting attention to valuable community members
However, community management doesn’t have to limit itself to this kind of digital garbage collection. It can cope with unsatisfactory online conversations in more sophisticated ways. Operational tasks can include many activities targeted at constructive users: with clever moderation it is possible to turn the iceberg upside down – to make the 95% of prudent, creative and witty comments visible and let the hatred sink. If community managers devote most of their attention to the valuable members of a community and their contributions, online civility is likely to prevail.
Moderation tasks, in a setting where constructive behavior leads to higher levels of acknowledgement, certainly include direct reactions. For example, community managers could simply reply to a relevant comment and let the user know that their opinion or statement is of interest to the media organization. Sometimes they could also ask for further information or pose specific questions. If appropriate, moderators can let users know that their information has been passed on to editorial staff. In any case, if comments are used for further investigation or as an impulse for new stories, community managers should inform the respective users.
Positive reinforcement can also be realized by selecting and highlighting valuable contributions. Such comments could be made available to readers as separate lists or filters called “editors’ picks” or “community managers’ picks”. Thoughtfully selected, excellent statements by users could precede the comment section. If readers are confronted with the best of the best at first sight, they are likely to enjoy reading such debates much more, and if they decide to participate actively, they might show a higher level of civility themselves.
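The “picks” mechanism described above can be sketched as a simple reordering of the comment list so that manager-selected contributions lead the thread. This is a minimal, hypothetical illustration – the `Comment` structure and `is_pick` flag are assumptions for the sketch, not any specific platform’s API:

```python
# Hypothetical sketch of an "editors' picks" ordering for a comment thread.
# Comment and its is_pick flag are assumed here, not a real platform API.
from dataclasses import dataclass

@dataclass
class Comment:
    author: str
    text: str
    is_pick: bool = False  # set by a community manager

def order_comments(comments):
    """Return comments with picks first, preserving the original
    posting order within each group (a stable partition)."""
    picks = [c for c in comments if c.is_pick]
    rest = [c for c in comments if not c.is_pick]
    return picks + rest

# Usage: the pick leads the thread regardless of when it was posted.
thread = [
    Comment("A", "First!"),
    Comment("B", "A thoughtful analysis of the article", is_pick=True),
    Comment("C", "Another remark"),
]
print([c.author for c in order_comments(thread)])  # ['B', 'A', 'C']
```

Keeping the partition stable matters: within the picks and within the remaining comments, readers still see the conversation in its original order, so context between replies is preserved.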
Online hate speech is not as ubiquitous as it may seem. However, destructive comments surface easily and lead to frequent disruptions of online conversations. Community management can do a lot more about it than simply deleting inappropriate utterances. The most promising strategy is to devote a considerable amount of attention to valuable community members and reward them for their contributions by means of moderation.