WorkingLemmy@lemmy.world to Open Source@lemmy.ml · 2 days ago

FOSS infrastructure is under attack by AI companies (thelibre.news)

Cross-posted to: technology@beehaw.org, linux@programming.dev, technology@lemmy.world, opensource@jlai.lu
jagged_circle@feddit.nl · 5 points · 2 days ago
It's absolutely sustainable. Just cache it. Done.
Strawberry · 4 points · 1 day ago
The bots scrape costly endpoints like the entire edit histories of every page on a wiki. You can't always just cache every possible generated page at the same time.
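A back-of-envelope count illustrates the comment above. On a MediaWiki-style site, every page exposes each old revision plus a diff between any pair of revisions, so the URL space explodes combinatorially (the page and revision counts here are hypothetical, chosen only for illustration):

```python
# Hypothetical numbers for a mid-sized wiki.
pages = 100_000          # assumed article count
revs_per_page = 50       # assumed average revisions per page

# One URL per historical revision of every page.
revision_urls = pages * revs_per_page

# One URL per pairwise diff between revisions of a page: C(R, 2) per page.
diff_urls = pages * (revs_per_page * (revs_per_page - 1) // 2)

print(revision_urls)   # 5,000,000 old-revision pages
print(diff_urls)       # 122,500,000 pairwise diff pages
```

Each of those pages is rendered on demand from the database, so pre-caching all of them is a very different proposition from caching a site's current articles.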
jagged_circle@feddit.nl · 1 point · edited 7 hours ago
Of course you can. This is why people use CDNs.
Put the entire site on a CDN with a cache of 24 hours for unauthenticated users.
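The suggestion above amounts to varying the `Cache-Control` header on authentication state: a long shared-cache TTL for anonymous visitors, no caching for logged-in sessions. A minimal sketch (the cookie name `session_id` is a hypothetical placeholder, not any particular platform's API):

```python
def cache_headers(request_cookies: dict) -> dict:
    """Pick Cache-Control headers based on whether the visitor is logged in."""
    if "session_id" in request_cookies:
        # Authenticated: the response may be personalized, so keep the
        # CDN and browser from storing it.
        return {"Cache-Control": "private, no-store"}
    # Anonymous: let the CDN serve one shared copy for 24 hours
    # (s-maxage governs shared caches like a CDN edge).
    return {"Cache-Control": "public, max-age=86400, s-maxage=86400"}

print(cache_headers({}))                     # anonymous visitor
print(cache_headers({"session_id": "abc"}))  # logged-in user
```

Whether this helps depends on the objection in the previous comment: a 24-hour TTL only pays off for URLs that are actually requested more than once a day, which the long tail of revision and diff pages may never be.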
LiveLM@lemmy.zip · 41 points · 2 days ago
I'm sure that if it was that simple people would be doing it already…