A misconfigured link exposed 38TB of Microsoft's confidential data, including backups of two employees' workstations, to public access and opened the door to injecting malicious code into Microsoft's AI models.
The tech giant confirmed that a URL contained an "overly permissive" token, which gave access to private employee data. Microsoft's AI researchers accidentally shared the link while publishing open-source AI material on GitHub.
Microsoft Corp (NASDAQ: MSFT) recently addressed a security incident in which an employee inadvertently shared a URL containing an overly permissive Shared Access Signature (SAS) token in a public GitHub repository.
Wiz Research discovered the exposure in Microsoft's AI GitHub repository; the leaked data included over 30,000 internal Microsoft Teams messages, all traceable to a single misconfigured SAS token.
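The root cause was a token scoped far too broadly and valid for far too long. A minimal sketch of the principle a least-privilege token should follow, using only Python's standard library, is shown below. This is an illustrative, hypothetical token format, not Azure's actual SAS signing scheme: the resource scope, permission letters, and expiry are bound into an HMAC signature so none of them can be widened after issuance.

```python
import hashlib
import hmac
import time

def make_scoped_token(account_key: bytes, resource: str,
                      permissions: str, ttl_seconds: int) -> str:
    """Build a simplified SAS-style token that binds scope, permissions,
    and expiry under an HMAC. Illustrative only; Azure's real SAS
    string-to-sign and parameter names differ."""
    expiry = int(time.time()) + ttl_seconds
    payload = f"{resource}|{permissions}|{expiry}"
    sig = hmac.new(account_key, payload.encode(), hashlib.sha256).hexdigest()
    return f"sr={resource}&sp={permissions}&se={expiry}&sig={sig}"

def verify_token(account_key: bytes, token: str) -> bool:
    """Reject tokens whose scope, permissions, or expiry were tampered
    with, and tokens that have already expired."""
    fields = dict(part.split("=", 1) for part in token.split("&"))
    payload = f"{fields['sr']}|{fields['sp']}|{fields['se']}"
    expected = hmac.new(account_key, payload.encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, fields["sig"]):
        return False  # signature mismatch: fields were altered
    return int(fields["se"]) > time.time()  # expired tokens fail

# Issue a read-only ("r") token for one container, valid for one hour --
# the opposite of the incident's full-account, long-lived grant.
key = b"example-account-key"  # hypothetical placeholder key
token = make_scoped_token(key, "container/ai-models", "r", 3600)
```

The contrast with the incident is the point of the sketch: the exposed token reportedly granted broad permissions over the whole storage account with a distant expiry, whereas a token like the one above is confined to a single resource, a single permission, and a short lifetime, so a leaked URL loses value quickly.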