On Tuesday (6), the video platform YouTube presented a new metric that measures the effectiveness of its filters for suspending and removing videos that violate the company's guidelines. The metric is called the Violative View Rate (VVR).
In a nutshell, the number indicates what percentage of views on YouTube go to videos considered inappropriate – a category that ranges from misinformation and hate speech to copyright violations. It will be published in each quarterly transparency report released by the platform.
Currently, the value sits between 0.16% and 0.18% – meaning that for every 10,000 views, 16 to 18 are of videos that violate YouTube's policies. Although that may still sound high, the result is 70% lower than in the same quarter of 2017.
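The "per 10,000 views" figure follows directly from the percentage. A minimal sketch of that conversion (the function name is hypothetical, purely for illustration – this is not a YouTube API):

```python
def violative_views_per_10k(vvr_percent: float) -> float:
    """Convert a Violative View Rate percentage into the expected
    number of violative views per 10,000 total views."""
    return vvr_percent / 100 * 10_000

# The reported range of 0.16%-0.18% corresponds to roughly 16-18
# violative views per 10,000 total views.
low = violative_views_per_10k(0.16)
high = violative_views_per_10k(0.18)
print(f"{low:.0f} to {high:.0f} violative views per 10,000")
```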
What do I get out of it?
The VVR is estimated by YouTube's data scientists and will help the company determine which types of video are most harmful and which areas need improvement. It will be used alongside existing metrics, such as the time taken to remove an inappropriate video and the number of reports received.
The calculation also serves to show that investing in filtering and identification technologies pays off, since the percentage tends to fall as machine learning and artificial intelligence are applied across the platform.
YouTube has been conducting these assessments internally since 2017 to “measure the company’s responsibility work” – that is, how accountable it was for hosting irregular content. In recent years, the platform has been accused of failing to combat misinformation on various topics, such as flat Earth theories, climate change denial, and conspiracy theories related to the Covid-19 pandemic.