Microsoft’s AI research team accidentally exposed 38 terabytes of private data through a Shared Access Signature (SAS) link it published in a GitHub repository, according to a report from Wiz Research, which also outlined how CISOs can reduce the chances of a similar exposure happening to them.
Dubbed “robust-models-transfer,” the repository was meant to provide open-source code and AI models for image recognition, and readers were given an Azure Storage URL from which to download the models.
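For readers unfamiliar with the mechanism: a SAS token is a signed query string appended to an Azure Storage URL, and it grants the bearer whatever permissions and lifetime were baked into the token when it was generated. The sketch below shows what a narrowly scoped token looks like, using the azure-storage-blob Python SDK; the account, container, and blob names and the key are hypothetical placeholders, not values from the incident.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Placeholder values for illustration only; not taken from the incident.
ACCOUNT_NAME = "exampleresearch"
ACCOUNT_KEY = "<storage-account-key>"
CONTAINER = "models"
BLOB = "robust-imagenet-model.pt"

# Scope the token to a single blob, read-only, expiring in 24 hours.
# This is the opposite of an account-wide, full-permission token with
# a far-future expiry, which exposes everything in the account.
sas_token = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER,
    blob_name=BLOB,
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=24),
)

# The shareable download link: base blob URL plus the signed query string.
download_url = (
    f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}/{BLOB}?{sas_token}"
)
print(download_url)
```

A read-only, single-blob token with a short expiry limits the blast radius if the URL leaks; the risk Wiz highlights is that a broadly scoped SAS link can silently grant far more access than the person sharing it intended.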