This video shows that Reddit refused to delete all comments and posts of its users when they close their account via a CCPA / GDPR request.

    • DrNeurohax@kbin.social
      2 years ago

If you’re using the main repo for PDS then you probably have the version that doesn’t pause for 5 seconds between API calls (Reddit’s limit). The first fork version has the pause and works correctly, though slowly. Just be aware that there’s a bug in PDS that stops adding to the exported file if it hits an error (if you have 100 comments and get an error on comment #15, it will continue to edit/delete, but the exported file will only have 14 comments).
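      For anyone curious what the fixed behavior looks like, here’s a rough sketch (not PDS’s actual code — `api.edit`, `api.remove`, and the rest are illustrative names): the comment is appended to the export *before* the API call, so an error can no longer truncate the exported file, and the loop sleeps between calls to respect the rate limit.

      ```javascript
      // Sketch of a rate-limited shredder loop. Export-before-API ordering
      // means a failure on comment #15 still leaves 15 entries in the file.
      const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

      async function shredComments(comments, api, exported, rateLimitMs = 5000) {
        for (const comment of comments) {
          exported.push(comment); // export first, so a failure below can't lose it
          try {
            await api.edit(comment.id, "[overwritten]");
            await api.remove(comment.id);
          } catch (err) {
            // log and keep going instead of silently dropping the rest
            console.error(`failed on ${comment.id}: ${err.message}`);
          }
          await sleep(rateLimitMs); // stay under Reddit's ~1 call per 5 s limit
        }
        return exported.length;
      }
      ```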

    • May@kbin.social
      2 years ago

      It seems that Reddit is delaying following through with most requests until after 1 July, when API requests (such as those that shreddit uses) will be blocked.

      I was sooo worried about this and thinking something like that would be done, back when I saw someone warn in the save 3rd party apps sub that you should request your data. Still, I tried making a request because I thought maybe Reddit hadn’t caught on yet, or maybe since it was before the blackout there was still a chance, but to this day I never got the data. :(

      Probably I’ll just leave the comments and posts. I did not post a lot.

      • DrNeurohax@kbin.social
        2 years ago

        There’s also a semi-automatic delete userscript that doesn’t use the API, called so-long-reddit-thanks-for-all-the-fish.

        You go to your comments page, click a button, and it performs the actions within the browser. Without any further interaction, you’ll see the screen scroll to the bottom, click edit on the last comment, enter the text in the script (default is a link to the script, but you can change that to anything), click save, and move on to the next comment (pretty sure it can delete, too). For best results, use a neverending Reddit script and keep scrolling until there are no more pages loaded. Also, re-sort the comments by each option (top, newest, etc.) to check for any stragglers.
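        The loop above boils down to something like this (a very rough sketch — all names are illustrative; the real userscript drives old Reddit’s DOM with element queries and `.click()` calls rather than these methods):

        ```javascript
        // Per-comment overwrite/delete loop, as the userscript performs it
        // in the browser. Each "comment" here stands in for a comment row.
        function overwriteAll(comments, replacementText) {
          let processed = 0;
          for (const comment of comments) {
            comment.openEditor();             // click "edit" on the comment row
            comment.setText(replacementText); // enter the replacement body text
            comment.save();                   // click "save"
            comment.remove();                 // it can delete, too
            processed += 1;
          }
          return processed; // compare to your comment count to spot stragglers
        }
        ```

        Returning a count is handy for exactly the stragglers problem mentioned above: if the number processed is lower than your visible comment count, run it again or re-sort and retry.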

        You can still use your browser, though I recommend keeping the task in its own window (in case your browser or an addon unloads pages you haven’t accessed in x minutes). If you do something that makes the browser lag a little, it can cause the script to miss a comment, so you might need to run it twice. I used this on one account and it worked flawlessly for several thousand comments, skipping ~10 or so.

        • abff08f4813c@kbin.social
          2 years ago

          Yeah, you are right. It’d be tough to directly modify PDS, as that’s JavaScript running in a browser, and there are strict restrictions on what JS can do with the filesystem in that case.

          But maybe someone can create a browser extension that does the same job. Extensions have fewer restrictions, so maybe it could be driven by a file.

          Or maybe someone will come up with some kind of shell script that reads the archive and pastes the URLs of each of your posts and comments, one by one, into your browser’s JavaScript console, letting PDS take care of the rest (visiting each one and simulating clicks on the edit and delete buttons).
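          The “read the archive” half of that idea could look something like this sketch (assuming the export’s comments.csv has a `permalink` column, which is my understanding of Reddit’s data-export format; the naive comma split is only safe because Reddit permalinks contain no commas or quotes, so we never touch the body column):

          ```javascript
          // Extract the permalink from each row of the export's comments.csv,
          // producing a URL list that could be fed to PDS one at a time.
          function extractPermalinks(csvText, column = "permalink") {
            const [header, ...rows] = csvText.trim().split("\n");
            const idx = header.split(",").indexOf(column);
            if (idx === -1) throw new Error(`no "${column}" column in header`);
            return rows.map((row) => row.split(",")[idx]).filter(Boolean);
          }
          ```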

          The other issue is that, from what I understand, PDS currently depends on old.reddit.com. If that ever gets dropped, PDS will break until it’s updated to work with new Reddit.