I realize my options are limited, but what about any robots.txt style steps? Thanks for any suggestions.

  • ShellMonkey@piefed.socdojo.com
    2 months ago

    A robots.txt listing a bunch of scraper user agents is a start, but there’s no guarantee they honor it. A firewall with some dynamic lists of known scanners and scrapers is a more forceful method.
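
    As a rough sketch, a robots.txt blocking some commonly reported scraper user agents might look like this (the agents named here are just examples; check a maintained list for current ones):

    ```
    # Disallow known AI/scraper crawlers site-wide.
    # Only cooperative bots will respect these rules.
    User-agent: GPTBot
    User-agent: CCBot
    User-agent: Bytespider
    Disallow: /

    # Allow everyone else.
    User-agent: *
    Disallow:
    ```

    For the firewall side, tools like fail2ban or an ipset fed from a published blocklist can drop traffic from known scanner IPs regardless of what user agent they claim.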