Just saw on Titus Tech Talk that torrents are last decade, and newsgroups is where it’s at for this stuff. Of course he didn’t elaborate, so I need some help here.

What is he talking about, and what are these groups that I can enter, er, avoid?

  • Darkassassin07@lemmy.ca · 10 months ago

    From a user’s perspective: nowadays all of that is buried in/handled by the usenet client you use.

    Downloading from usenet is very similar to torrenting in that you receive an index file (.nzb) that is effectively equivalent to a torrent file. You pass that to your usenet client, and it handles downloading each of the parts, called articles, then stitches them together into the actual shared files (even recovering missing/corrupted data via added parity data).
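    For the curious: an .nzb is just a small XML file. A minimal sketch (in Python, with made-up message-IDs and sizes) of what a client reads out of one — the newsgroups a file was posted to, plus the ordered list of article message-IDs to fetch:

```python
import xml.etree.ElementTree as ET

# A tiny NZB document in the standard newzbin format: each <file>
# lists its newsgroups and the message-IDs of its segments (the
# "articles" the client fetches from a provider). Values are invented.
NZB = """<?xml version="1.0" encoding="utf-8"?>
<nzb xmlns="http://www.newzbin.com/DTD/2003/nzb">
  <file poster="poster@example.com" date="1700000000" subject="example.mkv (1/3)">
    <groups>
      <group>alt.binaries.example</group>
    </groups>
    <segments>
      <segment bytes="739067" number="1">part1@example.com</segment>
      <segment bytes="739067" number="2">part2@example.com</segment>
      <segment bytes="510000" number="3">part3@example.com</segment>
    </segments>
  </file>
</nzb>"""

NS = {"nzb": "http://www.newzbin.com/DTD/2003/nzb"}
root = ET.fromstring(NZB)

for f in root.findall("nzb:file", NS):
    groups = [g.text for g in f.findall("nzb:groups/nzb:group", NS)]
    # Sort segments by their declared number so the decoded articles
    # can be concatenated in order.
    segments = sorted(
        f.findall("nzb:segments/nzb:segment", NS),
        key=lambda s: int(s.get("number")),
    )
    total = sum(int(s.get("bytes")) for s in segments)
    print(f.get("subject"), groups, len(segments), total)
```

    A real client would then fetch each message-ID from your provider, decode the articles, and run parity repair (par2) over the result.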

    The big difference is you’re downloading each of these articles from whichever usenet providers you’ve configured; instead of from random individual peers discovered through public/private trackers.

    Usenet providers usually offer more consistent and faster speeds, typically saturating my disk write speed, whereas torrent peers are often slow or unreliable in comparison. Also, since it’s a standard TLS connection between you and a private service, and you don’t have to re-upload the data you download, you’re not exposed to copyright claimants and don’t need a VPN.

      • Fitzsimmons · 10 months ago

        Been like a decade since I touched usenet but I do recall that requests were pretty common. Especially since the content expires. With a 5 year old torrent there’s a decent chance you’ll find a couple of seeders even on a public tracker and get it eventually, but with usenet that stuff does eventually rot away and you’ll have to request a reup.

        • astanix@lemmy.world · 10 months ago

          I mean, usenet servers are running with insane retention… Omicron hosts have 5648-day retention, and other backbones are over 4500.

          Unless it gets taken down, it doesn’t go away anymore… providers just keep retaining. I suppose that will end eventually… maybe some day the cost of storage will prohibit archiving 15 year old binary usenet posts.
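          To put those retention figures in perspective, a quick bit of arithmetic (using the 5648-day number quoted above):

```python
# Convert the quoted retention window from days to years.
retention_days = 5648  # the Omicron figure mentioned above
years = retention_days / 365.25  # average year length incl. leap years
print(f"{years:.1f} years")  # → 15.5 years
```

          Which lines up with the "15 year old binary usenet posts" still being available.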

      • Darkassassin07@lemmy.ca · 10 months ago

        That’s down to the indexer you decide to use. The one I use (NZBGeek) does have a requests section where you can enter an IMDb id, a TVDB id, or just a general description plus any other necessary/desired details like quality, and requests are filled by volunteers.

        TBH, not something I’d actually looked into until now. I’m gonna go drop a request or two in there right now. There’s not much I’m missing, but the things I am, I haven’t been able to find regardless of source.
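        Side note on those IMDb/TVDB ids: most indexers (NZBGeek included, as far as I know) expose a Newznab-style API, so you can search by id programmatically too. A rough sketch — the base URL and API key here are placeholders, and I’m assuming the standard Newznab `t=movie`/`t=tvsearch` parameters:

```python
from urllib.parse import urlencode

# Placeholders: substitute your indexer's API endpoint and your own key.
BASE = "https://api.example-indexer.com/api"
APIKEY = "your-api-key"

def movie_search(imdb_id: str) -> str:
    """Build a movie search URL; imdb_id is the numeric part of an IMDb id."""
    return BASE + "?" + urlencode({"t": "movie", "imdbid": imdb_id, "apikey": APIKEY})

def tv_search(tvdb_id: str, season: int, episode: int) -> str:
    """Build an episode search URL keyed on a TVDB series id."""
    return BASE + "?" + urlencode({
        "t": "tvsearch", "tvdbid": tvdb_id,
        "season": season, "ep": episode,
        "apikey": APIKEY,
    })

print(movie_search("0133093"))
# → https://api.example-indexer.com/api?t=movie&imdbid=0133093&apikey=your-api-key
```

        The response is an RSS-style XML feed whose items link to .nzb files, which is exactly what tools like the *arr suite feed to your usenet client behind the scenes.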