YouTube, the streaming giant, has had enough of the toxic comments on its videos. To curb the negativity, it has decided to warn users about the content and appropriateness of their comments. A pop-up will appear containing the following message: “Keep comments respectful. If you’re not sure whether your comment is appropriate, review our Community Guidelines.”
The warning will pop up whenever a user posts something offensive, inappropriate, or vulgar. The new filter will scrutinize the comment being posted and issue a warning if necessary. The user will then have the choice of editing the comment to make it appropriate, or posting it as is. Users will also have the option to report the warning if a comment was wrongly flagged.
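The flow described above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not YouTube's actual system: the term list, function names, and matching logic here are hypothetical placeholders, and a real classifier would use a trained model rather than a word list.

```python
# Hypothetical sketch of a pre-post comment warning flow.
# FLAGGED_TERMS is a placeholder; a production system would use an ML classifier.
FLAGGED_TERMS = {"idiot", "trash", "garbage"}

def looks_offensive(comment: str) -> bool:
    """Naive check: flag the comment if any word matches a flagged term."""
    words = {w.strip(".,!?").lower() for w in comment.split()}
    return not FLAGGED_TERMS.isdisjoint(words)

def submit_comment(comment: str, post_anyway: bool = False) -> str:
    """Mimic the described flow: warn first, but never block the post."""
    if looks_offensive(comment) and not post_anyway:
        return "warned"  # pop-up shown; user may edit, post as is, or report
    return "posted"      # comment goes through unchanged
```

Note that `submit_comment` never rejects a comment outright: as with the feature itself, the warning is advisory and the user can still post as is by ignoring it.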
The mechanism will get better with time, the company says. The algorithm will help the company stop the spread of hate and identify the various ways in which creators on the platform are harassed. The feature is presently available on the YouTube app for Android devices and will become available for iOS devices soon.
The algorithm will not bar the user from posting the comment; it will only flag it, which is where it may falter. It aims to make YouTube a safe space for all, especially women and people of color.
Of course, this new algorithm might not be completely effective in making YouTube a safe space for everyone. There are concerns about its efficacy; nevertheless, it is a step in the right direction.
This move comes after other social media giants like Twitter and Facebook took steps to curb hate on their platforms. Twitter, for example, has started flagging manipulated media, while Facebook has taken similar steps to fight the anti-vax movement and other hateful content. As time goes on, the need for such algorithms to fight the spread of negativity will only increase.