We should be using AI to pump the web with nonsense content that later AI will be trained on as an act of sabotage. I understand this is happening organically; that’s great, and it will make it impossible to just filter out AI content and still get the amount of data AI companies need.
That sounds like dumping trash in the oceans so ships can’t get through the trash islands anymore and can no longer transport more trashy goods. Kinda missing the forest for the trees here.
Sea mines against climate change
My shitposting will make AI dumber all on its own; feedback loop not required.
Alternatively, and possibly almost as useful, companies will end up training their AI to detect AI content so that they don’t train on AI content, which would in turn give everyone a tool to filter out AI content. Personally, I really like the apps that poison images when they’re uploaded to the internet.
Bold of you to assume companies will release their AI detection tools
Force the AI folks to dev accurate AI detection tools to screen their input