Google to hire thousands of moderators after outcry over YouTube abuse videos
The article is about YouTube facing problems with its search and recommendation features surfacing child abuse videos and other violent and offensive content on the platform.
- YouTube’s owner, Google, announced on Monday that it will expand its total workforce to more than 10,000 people responsible for reviewing content that could violate its policies.
- “Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualised decisions on content,” said YouTube’s CEO, Susan Wojcicki.
- YouTube has also repeatedly sparked outrage for its role in perpetuating misinformation and harassing videos in the wake of mass shootings and other national tragedies.
YouTube last month faced scrutiny and reports from its users for allowing violent and abusive videos past the Kids filter, which is supposed to block any content that is not appropriate for young users. According to The New York Times, parents found that YouTube was allowing children to view videos featuring familiar characters in violent or lewd scenarios, along with nursery rhymes mixed with disturbing imagery.
Now, according to Wojcicki, YouTube is using machine learning technology to help human moderators find and shut down hundreds of accounts and hundreds of thousands of comments, in an effort to stop this behaviour.
This article shows that even large organisations such as YouTube can let this kind of content slip past them, which is a concern, but it also shows that, for the moral good, they have chosen to use technology to stop this content from growing into something bigger. However, the problem still stands that YouTube's algorithms allowed this content through, and the company was unable to recognise the problem until audiences themselves started noticing.