Tech Newsday

Microsoft AI researchers accidentally leak 38TB of data

Microsoft AI researchers accidentally leaked 38TB of sensitive data, including backups of personal information belonging to Microsoft employees. These included passwords for Microsoft services, secret keys, and an archive of over 30,000 internal Microsoft Teams messages from 359 Microsoft employees.

The data leak was discovered by cloud security firm Wiz, whose security researchers found that a Microsoft employee inadvertently shared the URL for a misconfigured Azure Blob storage bucket containing the leaked information.

Microsoft attributed the data exposure to an excessively permissive Shared Access Signature (SAS) token, which granted full control over the shared files. This Azure feature enables data sharing in a manner that Wiz researchers described as difficult to monitor and revoke.
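Part of what makes account-level SAS tokens hard to audit is that the grant is encoded entirely in the URL's query string: the `sp` field lists the permissions and `se` the expiry. A minimal sketch of inspecting those fields (the URL, container, and signature below are hypothetical, not the actual leaked link):

```python
from urllib.parse import urlsplit, parse_qs

FULL_CONTROL = set("racwdl")  # read, add, create, write, delete, list

def audit_sas_url(url: str) -> dict:
    """Extract the permission set ("sp") and expiry ("se") from a SAS URL."""
    params = parse_qs(urlsplit(url).query)
    perms = set(params.get("sp", [""])[0])
    return {
        "permissions": perms,
        "expiry": params.get("se", ["?"])[0],
        "full_control": FULL_CONTROL <= perms,
    }

# Hypothetical example URL for illustration only.
url = ("https://example.blob.core.windows.net/models/backup.zip"
       "?sp=racwdl&se=2051-10-01T00:00:00Z&sv=2022-11-02&sig=REDACTED")

report = audit_sas_url(url)
print(report["full_control"], report["expiry"])
# -> True 2051-10-01T00:00:00Z
```

A token like the one sketched here, with the full `racwdl` permission set and a far-future expiry, gives anyone holding the URL read and write access until the token is revoked or the account key is rotated.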

Microsoft said that no customer data was exposed and that no other internal services were put at risk by the incident. Wiz reported the issue to Microsoft on June 22, 2023, and Microsoft revoked the SAS token on June 24, 2023, blocking all external access to the Azure storage account and mitigating the issue.

The sources for this piece include articles in The Verge and BleepingComputer.
