- cross-posted to:
- sysadmin@lemmy.world
Microsoft leaks 38TB of private data via unsecured Azure storage: The Microsoft AI research division accidentally leaked dozens of terabytes of sensitive data starting in July 2020 while contributing open-source AI learning models to a public GitHub repository.
This will definitely make customers less trustful of Microsoft when dealing with their privacy-focused AI projects. Here's hoping that open-source LLMs become more advanced and optimized.
I am not sure. This was mostly a case of human error: the URLs and storage accounts weren't properly secured. The lack of centralised control over SAS tokens that the article highlights was a contributing factor, but the root cause was still human error.
If I leave my front door unlocked and someone walks in and robs my house, who is to blame? Me, for not locking the door? Or the house builder, for not providing a sensor so I can remotely check whether the door is locked?
Azure has a huge problem with SAS tokens. The mechanism is so bad that it invites situations like this.
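To see why the mechanism invites this: a SAS token is just a signed query string appended to a URL, so anyone holding the URL is a bearer of that access until the expiry baked into the token, and nothing stops you from signing a token with full-account permissions that expires decades out. Here's a minimal sketch of the idea in Python using only the standard library; the string-to-sign format, parameter names, and key below are simplified stand-ins, not Azure's real SAS format.

```python
# Illustrative sketch of SAS-style bearer tokens: the token is an HMAC
# signature over (resource, permissions, expiry), so possession of the
# URL alone grants access until expiry. Simplified; not the real Azure
# string-to-sign format.
import base64
import hashlib
import hmac
from datetime import datetime, timedelta, timezone

ACCOUNT_KEY = b"demo-account-key"  # hypothetical key for illustration only


def sign_sas(resource: str, permissions: str, expiry: datetime) -> str:
    """Return a SAS-style query string granting `permissions` on `resource`."""
    string_to_sign = f"{resource}\n{permissions}\n{expiry.isoformat()}"
    sig = base64.urlsafe_b64encode(
        hmac.new(ACCOUNT_KEY, string_to_sign.encode(), hashlib.sha256).digest()
    ).decode()
    return f"sp={permissions}&se={expiry.isoformat()}&sig={sig}"


def is_valid(resource: str, token: str, now: datetime) -> bool:
    """Server-side check: recompute the signature and enforce expiry."""
    params = dict(p.split("=", 1) for p in token.split("&"))
    expiry = datetime.fromisoformat(params["se"])
    expected = sign_sas(resource, params["sp"], expiry)
    return hmac.compare_digest(expected, token) and now < expiry


now = datetime(2023, 9, 18, tzinfo=timezone.utc)

# An over-broad token like the leaked one: full permissions on the whole
# account, expiry set decades in the future. Still valid in 2023.
leaked = sign_sas("account/*", "racwdl",
                  datetime(2051, 10, 6, tzinfo=timezone.utc))
print(is_valid("account/*", leaked, now))  # True

# A scoped token: read-only on one container, short-lived.
scoped = sign_sas("container/models", "r", now + timedelta(hours=1))
print(is_valid("container/models", scoped, now + timedelta(hours=2)))  # False
```

The point: the server can only check the signature and the expiry the issuer chose. There is no central registry of issued tokens, so an over-broad token can't be audited or individually revoked; rotating the whole account key is the only kill switch.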
If you live in an apartment and the landlord doesn't replace the front door locks when they break, that's a better analogy.