This new tool lets artists 'poison' their artwork to deter AI companies from using it to train their models—here’s how it works

  • 📰 CNBC


If an artist doesn't want their pieces being used to train AI image generators, a new tool called Nightshade would let them 'poison' their artwork.

Artists who want to share their artwork often face a tough choice: keep it offline, or post it on social media and risk having it scraped to train AI models without their consent. But a new tool may soon help artists deter AI companies from using their artwork without permission.

If enough of these "poisoned" images are scraped from the web and used to train an AI image generator, the AI model itself may no longer be able to produce accurate images. In one test, researchers fed a generative AI model 50 "poisoned" images of dogs, then asked it to generate new pictures of dogs. The generated images featured animals with too many limbs or cartoonish faces that only somewhat resembled a dog, per MIT Technology Review.
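Conceptually, "poisoning" means altering an image's pixels in a way that is hard for a person to notice but consistent enough to mislead a model that trains on many such images. The toy sketch below illustrates only that general shape of the idea, not Nightshade's actual, far more sophisticated technique; the `poison_pixels` function and its simple alternating perturbation pattern are invented for illustration.

```python
def poison_pixels(image, strength=2):
    """Apply a small, deterministic perturbation to an 8-bit grayscale image.

    `image` is a list of rows of pixel values (0-255). Alternating pixels are
    nudged up or down by `strength`, clamped to the valid range, so the image
    looks essentially unchanged to a human viewer.
    """
    poisoned = []
    for y, row in enumerate(image):
        new_row = []
        for x, value in enumerate(row):
            # Checkerboard pattern: +strength on even squares, -strength on odd.
            delta = strength if (x + y) % 2 == 0 else -strength
            new_row.append(max(0, min(255, value + delta)))
        poisoned.append(new_row)
    return poisoned

original = [[100, 120], [140, 160]]
poisoned = poison_pixels(original)
# Each pixel moves by at most `strength` (here, 2 out of 255) — imperceptible
# to a person, but a model trained on many such images would repeatedly see
# the same hidden pattern and could learn the wrong associations.
```

A real poisoning attack targets specific concepts (e.g., making "dog" images push the model toward a different concept) and optimizes the perturbation against the model's feature space, which is well beyond this fixed pattern.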

But it's not magic helping these generative AI models create realistic-looking images of a pink giraffe or an underwater castle — it's training data, and lots of it. Much of the data used to train generative AI systems is scraped from the web. Although it's legal in the U.S. for companies to collect data from publicly accessible websites and use it for various purposes, that gets complicated when it comes to works of art, since artists typically own the copyright for their pieces and sometimes don't want their art being used to train an AI model.

 
