Microsoft leaks 38TB of private data via unsecured Azure storage::The Microsoft AI research division accidentally leaked dozens of terabytes of sensitive data starting in July 2020 while contributing open-source AI learning models to a public GitHub repository.

  • pavnilschanda@lemmy.world · 1 year ago

    This will definitely make customers less trusting of Microsoft when it comes to their privacy-focused AI projects. Here’s hoping that open-source LLMs become more advanced and optimized.

    • Tatters@feddit.uk · 1 year ago

      I am not sure. This was mostly a case of human error: the URLs/storage accounts were not properly secured. The lack of centralised control over SAS tokens that the article highlights was a contributing factor, but the root cause was still human error.

      If I leave my front door unlocked and someone walks in and robs my house, who is to blame? Me, for not locking the door? Or the house builder, for not providing a sensor so I can remotely check whether the door is locked?
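      For context on why a misconfigured SAS token is so dangerous: it is essentially a signed, expiring URL, and anyone who obtains the URL gets the access it encodes with no further authentication. Below is a minimal sketch of the idea using a generic HMAC-signed URL; the names, payload format, and query parameters are illustrative, not Azure's actual SAS wire format.

      ```python
      import hashlib
      import hmac
      import time
      from urllib.parse import urlencode

      # Hypothetical account secret; in Azure this role is played by the storage
      # account key that signs real SAS tokens.
      ACCOUNT_KEY = b"example-secret-account-key"

      def make_sas_url(base_url: str, path: str, permissions: str, expiry: int) -> str:
          """Build a URL granting `permissions` on `path` until Unix time `expiry`."""
          payload = f"{path}\n{permissions}\n{expiry}".encode()
          sig = hmac.new(ACCOUNT_KEY, payload, hashlib.sha256).hexdigest()
          query = urlencode({"sp": permissions, "se": expiry, "sig": sig})
          return f"{base_url}{path}?{query}"

      def validate(path: str, permissions: str, expiry: int, sig: str, now: int) -> bool:
          """Server-side check: the signature must match and the token must not be expired."""
          payload = f"{path}\n{permissions}\n{expiry}".encode()
          expected = hmac.new(ACCOUNT_KEY, payload, hashlib.sha256).hexdigest()
          return hmac.compare_digest(expected, sig) and now < expiry

      # A token scoped to read-only access for one hour:
      expiry = int(time.time()) + 3600
      url = make_sas_url("https://example.blob.core.windows.net",
                         "/container/file.txt", "r", expiry)
      ```

      The human-error failure mode is visible in the parameters: sign a token with broad permissions and a far-future expiry (the leaked token was reportedly valid for decades) and the URL becomes nearly as powerful as the account key itself, with nothing to revoke centrally.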