Cloud security firm Wiz recently reported that Microsoft’s AI research division inadvertently exposed a staggering 38 terabytes of sensitive data due to a misconfiguration involving Shared Access Signature (SAS) tokens.
The incident, which began in July 2020 and remained undetected for almost three years, originated from Microsoft’s attempt to share open-source code and AI models for image recognition via a GitHub repository.
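For context, a SAS token is a signed URL parameter that grants time-limited access to Azure Storage resources; the risk arises when a token is generated with account-wide permissions and a far-off expiry date, as was reportedly the case here. The snippet below is a minimal illustrative sketch using the Azure Python SDK (azure-storage-blob) that contrasts an overly broad account-level token with a tightly scoped container-level one. The account name, container name, key, and expiry values are placeholders for illustration, not values from the incident.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccountSasPermissions,
    ContainerSasPermissions,
    ResourceTypes,
    generate_account_sas,
    generate_container_sas,
)

# Placeholder credentials -- not real values from the incident.
ACCOUNT_NAME = "examplestorageaccount"
ACCOUNT_KEY = "<storage-account-key>"

# Overly permissive: an account-level SAS with read/write/delete/list rights
# over every container and blob, valid for decades. Anyone holding the URL
# can browse and modify the entire storage account.
risky_sas = generate_account_sas(
    account_name=ACCOUNT_NAME,
    account_key=ACCOUNT_KEY,
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(read=True, write=True, delete=True, list=True),
    expiry=datetime(2051, 1, 1, tzinfo=timezone.utc),  # decades-long lifetime
)

# Safer alternative: a container-level SAS scoped to read/list only,
# expiring after a week.
scoped_sas = generate_container_sas(
    account_name=ACCOUNT_NAME,
    container_name="public-models",
    account_key=ACCOUNT_KEY,
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=7),
)

print(f"https://{ACCOUNT_NAME}.blob.core.windows.net/?{risky_sas}")
print(f"https://{ACCOUNT_NAME}.blob.core.windows.net/public-models?{scoped_sas}")
```

Because a SAS URL carries its own authorization, anyone who finds it, for example in a public GitHub repository, can use it until it expires, which is why scope and lifetime matter so much.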