A dislike button may soon arrive on TikTok. More precisely, it would appear in the comments section, letting users signal disagreement with remarks posted under a video. Don't expect a visible counter, however, and no one will be able to see who disliked what. The feature is primarily a way to flag spam and abusive language so it can be moderated more quickly.
The dislike button poses a thorny problem for social networks. On the one hand, it lets users express an opinion just as the more traditional like does, while also warning others about potentially poor-quality content. On the other, it can fuel harassment campaigns and harm the mental health of content creators. These concerns are why YouTube decided last November to hide public dislike counts on its platform.
Although TikTok usually draws inspiration from its spiritual big brother, this time it has chosen to go in the opposite direction. In a recent blog post, the social network announced that a handful of users can already downvote comments. This dislike button, however, is nothing like the ones we already know. First, as noted above, it appears only in the comments section; it cannot be used on the video itself.
Secondly, no counter will be displayed to anyone, whether the author of the video or viewers. Only users who have disliked a comment will know they did so. In other words, the button serves purely as a signal for the moderation teams: the more dislikes a comment receives, the more likely it is to be deleted. TikTok clarifies, however, that it will not replace the traditional reporting method.
“This community feedback will add to the set of factors we already use to ensure that the comments section is always relevant and a place for authentic engagement,” explains the Chinese social network. The new method joins the platform's existing efforts to make its comments space healthier. The feature is being tested around the world.
TikTok adds: “To make it easier for our community to find and use the built-in safety tools we offer, we're experimenting with reminders that will guide creators to our comment filtering and bulk block and delete options. The reminders will appear to creators whose videos appear to be receiving a high proportion of negative comments. We will continue to remove comments that violate our Community Guidelines, and creators can continue to report comments or accounts individually or in bulk for us to review.”