The new voluntary agreements include commitments to external, third-party testing prior to releasing an AI product and the development of watermarking systems to inform the public when a piece of audio or video material was generated using AI systems. The Biden administration said these voluntary commitments, which essentially amount to self-policing by tech firms, mark just the first of several steps needed to properly manage AI risk.
For security, the companies are committing to invest in additional cybersecurity and insider-threat safeguards, and they agree to support third parties in discovering and reporting vulnerabilities in their systems. Perhaps most interestingly, the tech firms say they will all develop technical mechanisms, such as watermarks, to ensure users know when content is AI-generated. A White House official speaking on a press call said these commitments were intended to push back against the threat of deepfakes and build public trust. Similarly, the AI makers have agreed to prioritize research into the risks of bias, discrimination, and privacy violations their products can pose.
The White House official, speaking with Gizmodo and other reporters, said the commitments were intended to bring all seven companies together under the same set of agreements. And while the agreements do not officially affect the hundreds of smaller companies working on AI systems, the White House hopes the baseline set here will encourage others in the industry to follow a similar path.