The world's leading AI companies pledge to protect the safety of children online

  • 📰 engadget


Pranav is a senior editor at Engadget responsible for news coverage during West Coast hours.

Leading artificial intelligence companies, including OpenAI, Microsoft, Google and Meta, have jointly pledged to prevent their AI tools from being used to exploit children or to generate child sexual abuse material (CSAM). The initiative was led by Thorn, a child-safety group, and All Tech Is Human, a non-profit focused on responsible tech.

One of the recommendations, for instance, asks companies to choose the data sets used to train AI models carefully and to avoid not only those containing instances of CSAM but also those containing adult sexual content, because of generative AI’s propensity to combine the two concepts. Thorn is also asking social media platforms and search engines to remove links to websites and apps that let people “nudify” images of children, thus creating new AI-generated child sexual abuse material online.
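
To picture what that recommendation amounts to in practice, here is a minimal, hypothetical Python sketch of pre-training dataset curation. The `is_flagged` callback stands in for a real detector (for example, hash matching against known-CSAM databases); it is an assumption for illustration, not any company's actual pipeline.

```python
from typing import Callable, Iterable

def curate_training_set(
    samples: Iterable[bytes],
    is_flagged: Callable[[bytes], bool],
) -> list[bytes]:
    """Keep only the samples the safety check clears, so flagged
    imagery (CSAM or adult sexual content) never reaches training."""
    return [sample for sample in samples if not is_flagged(sample)]
```

The point is less the filter itself than where it sits: the screening happens before training, so the model never learns to associate the two categories of imagery in the first place.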

“This project was intended to make abundantly clear that you don’t need to throw up your hands,” said Thorn’s vice president of data science, Rebecca Portnoff. “We want to be able to change the course of this technology to where the existing harms of this technology get cut off at the knees.” Some companies, Portnoff said, had already agreed to separate images, video and audio involving children from data sets containing adult content, to prevent their models from combining the two. Others add watermarks to identify AI-generated content, but the method isn’t foolproof: watermarks and metadata can be easily removed.
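
To illustrate why metadata-based labeling in particular is fragile, here is a minimal sketch using the Pillow imaging library: re-encoding an image's pixel data into a fresh file silently discards EXIF tags, text chunks and similar provenance markers. The file paths are placeholders.

```python
from PIL import Image

def strip_metadata(src_path: str, dst_path: str) -> None:
    """Re-save an image keeping only its pixels; EXIF and other
    provenance tags attached to the original file are dropped."""
    with Image.open(src_path) as img:
        clean = Image.new(img.mode, img.size)  # fresh image, no metadata
        clean.putdata(list(img.getdata()))     # copy pixel values only
        clean.save(dst_path)

strip_metadata("labeled.png", "unlabeled.png")  # placeholder paths
```

More robust watermarking schemes try to embed the signal in the pixels themselves, but even those can often be degraded by cropping, resizing or re-compression.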

 
