@[email protected] to [email protected] • English • 27 days ago
No one’s ready for this: Our basic assumptions about photos capturing reality are about to go up in smoke. (www.theverge.com)
cross-posted to: [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected]
minus-square@[email protected]linkfedilinkEnglish4•edit-227 days agoIt’s fundamentally not possible. At some point fakes will be pixel perfect indistinguishable.
minus-square@[email protected]linkfedilinkEnglish1•27 days agoI’ve just tried to upload the picture of the girl with fake drugs on the floor in a AI detection tool and it told me it was 0,2% likely to have AI generated content. This does not look good.