To create a less toxic environment, YouTube will launch a feature that alerts users before they post potentially offensive comments. According to the platform, the intention is "to give people an option to reflect before sending the message."
The notification will appear only in certain cases and will be based on a system of content that has been repeatedly reported. However, it will not prevent people from posting the comment even after the alert.
At first, the notification will be available in the app on Android devices, and the analysis will cover only English-language text. After initial testing, the tool is expected to be released in other languages.
In addition to the new feature, YouTube Studio's content filtering system is being enhanced. Soon, creators will be able to use the manager to find inappropriate comments that the platform has automatically flagged.
This way, creators can moderate those messages and remove them from the comments section so that others do not read the inappropriate material: a simple way to maintain a healthier environment for everyone.
YouTube implemented its current hate speech policy in 2019.
An intense fight against hatred
Offensive comments are one of the biggest problems facing YouTube and its creators, and the fight against hate speech from users is far from over.
According to data provided by the platform, it now removes 46 times more hate speech comments per day than it did at the beginning of 2019. In addition, some 54,000 channels that spread inappropriate content were shut down in the last quarter.
This was the largest number of channels closed for hate speech in a single quarter. Remarkably, it is three times the number removed when YouTube began rigorously enforcing its new behavioral policies in early 2019.