Red-teaming alone isn't sufficient, though, argues Ovadya, whose work I have covered here before. It's not enough to know what material the model spits out, he writes; we also need to know what effect the model's release might have on society at large.
If adopted by companies like OpenAI and Google, whether voluntarily or at the insistence of a new federal agency, violet teaming could better prepare us for how more powerful models will affect the world around us. Either way, the next several months will let us observe the real-world effects of GPT-4 and its rivals, and help us understand how and where we should act. But the knowledge that no larger models would be released during that time would, I think, give comfort to those who believe AI could be as harmful as some fear.