• Strawberry

The bots scrape costly endpoints, like the entire edit history of every page on a wiki. You can't realistically cache every possible generated page at once.
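
    A rough back-of-envelope sketch of why that caching doesn't scale, using made-up numbers (the page and revision counts below are assumptions, not figures from the comment), for a MediaWiki-style site where every pair of revisions has its own diff URL:

    ```python
    # Illustrative numbers only: a mid-sized wiki where bots can request
    # paginated history views and a diff between any two revisions of a page.
    pages = 100_000          # assumed number of wiki pages
    avg_revisions = 200      # assumed average revisions per page

    # Paginated history views (assuming 50 revisions listed per page).
    history_views = pages * (avg_revisions // 50)

    # One distinct URL per revision pair on each page.
    diff_views = pages * avg_revisions * (avg_revisions - 1) // 2

    print(f"history views: {history_views:,}")   # 400,000
    print(f"possible diff views: {diff_views:,}")  # 1,990,000,000
    ```

    Roughly two billion distinct diff URLs under these assumptions, each of which the server has to render on demand when a scraper asks for it, which is why pre-generating or caching them all isn't an option.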