were reported to the National Center for Missing and Exploited Children.
The tool also uses machine learning to proactively search for suspected child sexual exploitation material that hasn't already been reviewed by law enforcement and flag it for human review. If the human content reviewer agrees that the image has been flagged correctly, the reviewer can report the image, delete it and prevent anyone else from uploading it.
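Roughly, the workflow described amounts to a two-stage pipeline: exact matching against hashes of previously confirmed material, followed by a classifier that routes new suspected images to a human review queue. The sketch below is a minimal illustration under those assumptions only; every name in it (triage, classifier_score, threshold) is hypothetical and does not reflect the tool's actual API.

```python
# Hypothetical sketch of the two-stage flagging flow described above:
# known images are matched by hash, and unreviewed images are scored by a
# classifier and queued for human review before any action is taken.
import hashlib
from dataclasses import dataclass, field

@dataclass
class ReviewQueue:
    """Holds images awaiting a human reviewer's decision."""
    pending: list = field(default_factory=list)

    def enqueue(self, image_id: str, score: float) -> None:
        self.pending.append((image_id, score))

def sha256_of(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

def triage(image_id: str, image_bytes: bytes,
           known_hashes: set,
           classifier_score,          # callable: bytes -> float in [0, 1]
           queue: ReviewQueue,
           threshold: float = 0.9) -> str:
    """Return an action label for a newly uploaded image."""
    digest = sha256_of(image_bytes)
    if digest in known_hashes:
        # Previously confirmed material: block immediately and report.
        return "block_and_report"
    score = classifier_score(image_bytes)
    if score >= threshold:
        # Suspected but not yet reviewed: a human must confirm before action.
        queue.enqueue(image_id, score)
        return "flag_for_human_review"
    return "allow"
```

In practice, production systems use perceptual hashes rather than exact cryptographic hashes so that resized or re-encoded copies still match, but the human-in-the-loop step shown here mirrors the review process the article describes.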
“People are really afraid to talk about it, but this is a problem that all tech companies are facing,” said Sarah Schaaf, co-founder of Imgur. “We all know this is an issue worth combatting but, when you are a smaller or mid-sized company, it’s tough. It requires a large financial investment and experts who know what to do.”