whose work I have covered here before. Here, though, Ovadya argues that red-teaming alone isn't sufficient. It's not enough to know what material the model spits out, he writes; we also need to know what effect the model's release might have on society at large.
If adopted by companies like OpenAI and Google, whether voluntarily or at the insistence of a new federal agency, violet teaming could better prepare us for how more powerful models will affect the world around us. Either way, the next several months will let us observe the real-world effects of GPT-4 and its rivals, and help us understand how and where we should act. But the knowledge that no larger models would be released during that time would, I think, give comfort to those who fear AI could cause serious harm.