• Strawberry · 1 day ago

      The bots scrape costly endpoints like the entire edit histories of every page on a wiki. You can’t always just cache every possible generated page at the same time.

      • jagged_circle@feddit.nl · 7 hours ago

        Of course you can. This is why people use CDNs.

        Put the entire site behind a CDN with a 24-hour cache for unauthenticated users; a rough sketch of that rule follows below.
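        For illustration only, a minimal sketch of that rule as a Workers-style edge fetch handler in TypeScript. The cookie name "session", the cache name, and the 24-hour TTL are assumptions for the example, not anything specific to the site being discussed:

        ```typescript
        // Sketch: serve anonymous visitors from the edge cache for 24 hours,
        // and send anyone with a login cookie straight to the origin.
        // Assumes a Workers-style runtime exposing the standard Cache API.

        const ONE_DAY_SECONDS = 24 * 60 * 60;

        export default {
          async fetch(request: Request): Promise<Response> {
            // "session" is a placeholder cookie name for "logged in".
            const isLoggedIn = (request.headers.get("Cookie") ?? "").includes("session=");

            // Authenticated users and non-GET requests bypass the cache entirely.
            if (isLoggedIn || request.method !== "GET") {
              return fetch(request);
            }

            // Anonymous GET: try the edge cache first.
            const cache = await caches.open("anon-pages");
            const cached = await cache.match(request);
            if (cached) return cached;

            // Miss: fetch from the origin, then store a copy for 24 hours.
            const origin = await fetch(request);
            const response = new Response(origin.body, origin);
            response.headers.set("Cache-Control", `public, max-age=${ONE_DAY_SECONDS}`);
            await cache.put(request, response.clone());
            return response;
          },
        };
        ```

        This is the common "bypass cache on cookie" pattern: anonymous traffic is served stale-but-cheap pages from the edge, while logged-in users always see fresh pages from the origin. How exactly you express it depends on the CDN.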

    • LiveLM@lemmy.zip · 2 days ago

      I’m sure that if it were that simple, people would already be doing it…