Fake4000@lemmy.world to Technology@lemmy.world · English · 1 year ago
Reddit started doing what they always wanted to do: sell user content to AI. (www.reuters.com)
203 comments
JohnEdwa@sopuli.xyz · English · 1 year ago
Robots.txt has always been ignored by some bots; it's just a guideline, originally meant to prevent excessive bandwidth usage by search-indexing bots, and compliance is entirely voluntary. The Archive.org bot, for example, has completely ignored it since 2017.
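To illustrate the "entirely voluntary" point: robots.txt only works if the crawler itself chooses to parse the file and check each URL against it. A minimal sketch using Python's standard-library `urllib.robotparser` (the robots.txt content and bot name here are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a site might serve.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved bot asks before fetching each URL...
print(parser.can_fetch("MyBot", "https://example.com/private/page"))  # False
print(parser.can_fetch("MyBot", "https://example.com/public/page"))   # True

# ...but nothing enforces this: a bot that never calls can_fetch()
# simply downloads whatever it wants, robots.txt or not.
```

Note that the entire mechanism lives on the client side: the server just serves a plain text file, and any bot is free to skip the check.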