According to Wiz, the mistake was made when Microsoft AI researchers were attempting to publish a "bucket of open-source training material" and "AI models for image recognition" to the developer platform.
The researchers misconfigured the files' accompanying SAS (shared access signature) token, the signed URL parameter that establishes what a recipient can access in Azure storage. Basically, instead of granting GitHub users access to the downloadable AI material specifically, the butchered token allowed general access to the entire storage account. And we're not just talking read-only permissions: the mistake actually granted "full control" access, meaning that anyone who wanted to tinker with the many terabytes of data in the account, the AI training material and AI models included, would have been able to do so.
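For context, here's a minimal sketch of what a properly scoped token looks like when generated with the Azure Storage Python SDK. The account, container, and key names are hypothetical, and this is an illustration of the general mechanism rather than what Microsoft's researchers actually ran: the token embeds its own permissions and expiry, so a read-only, single-container token like this is the narrow kind of access presumably intended, as opposed to a full-control token covering the whole account.

```python
from datetime import datetime, timedelta
from azure.storage.blob import generate_container_sas, ContainerSasPermissions

# Hypothetical names, for illustration only.
ACCOUNT_NAME = "exampleaccount"
CONTAINER_NAME = "open-models"
ACCOUNT_KEY = "<storage-account-key>"

# A narrowly scoped token: read/list only, limited to one container,
# and set to expire instead of staying valid indefinitely.
sas_token = generate_container_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER_NAME,
    account_key=ACCOUNT_KEY,
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.utcnow() + timedelta(days=7),
)

# The shareable URL simply appends the token as a query string.
download_url = (
    f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER_NAME}?{sas_token}"
)
print(download_url)
```

Because everything the link can do is baked into the token itself, a token scoped to the wrong level, per Wiz's description, silently hands out far more than the sharer intended.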
An "attacker could have injected malicious code into all the AI models in this storage account," Wiz's researchers write, "and every user who trusts Microsoft’s GitHub repository would've been infected by it.", meaning that this sensitive material has basically been open-season for several years.
Source: The Verge