The once-prophesied future where cheap, AI-generated trash content drowns out the hard work of real humans is already here, and is already taking over Facebook.
This is going to get soooo much more treacherous as this becomes ubiquitous and harder to detect. Apply the same pattern, but instead of wood carvings, it’s an election, or sexual misconduct trial, or war.
Our ability to make sense of things that we don’t witness personally is already in bad shape, and it’s about to get significantly worse. We aren’t even sure how bad it is right now.
Imagine an image like Tank Man, or the running Vietnamese girl with napalm burns on her skin, but AI generated at the right moment.
It could change the course of nations.
As lies always could.
There were no WMDs in Iraq.
Sure, but now you can make a video of Saddam giving a tour of a nuclear enrichment facility.
Wearing a tutu. With Dora the explorer.
Most people don’t remember this, or weren’t alive at the time, but the whole Colin Powell event at the UN was intended to stop the weapons inspectors.
France (remember the Freedom Fries?) wanted to allow the weapons inspectors to keep looking until they could find true evidence of WMDs. The US freaked out because France said it wasn’t going to support an invasion of Iraq, at least not yet, because the inspectors hadn’t found anything. That meant the Security Council wasn’t going to approve the resolution, which meant the invasion would be an unauthorized action, and arguably illegal. In fact, UN Secretary General Kofi Annan said it was illegal.
https://en.wikipedia.org/wiki/United_Nations_Security_Council_and_the_Iraq_War

Following the passage of Resolution 1441, on 8 November 2002, weapons inspectors of the United Nations Monitoring, Verification and Inspection Commission returned to Iraq for the first time since being withdrawn by the United Nations. Whether Iraq actually had weapons of mass destruction or not was being investigated by Hans Blix, head of the commission, and Mohamed ElBaradei, head of the International Atomic Energy Agency. Inspectors remained in the country until they withdrew after being notified of the imminent invasion by the United States, Britain, and two other countries.

https://en.wikipedia.org/wiki/Colin_Powell's_presentation_to_the_United_Nations_Security_Council

On February 5, 2003, the Secretary of State of the United States Colin Powell gave a PowerPoint presentation to the United Nations Security Council. He explained the rationale for the Iraq War, which would start on March 19, 2003 with the invasion of Iraq.
The whole point of Colin Powell burning all the credibility he’d built up over his entire career was to say “we don’t care that the UN weapons inspectors haven’t found anything, trust me, the WMDs are there, so we’re invading”. Whether or not he (or anybody else) truly thought there were WMDs is a bit of a non-issue. What matters is that they were a useful pretext for the invasion. Initially, the US probably hoped that the weapons inspectors were going to find some, and that that would make it easy to justify the invasion. The fact that none had been found was a real problem.
In the end, we don’t know if it was a lie that the US expected to find WMDs in Iraq. Most of the evidence suggests that they actually thought there were WMDs there. But, the evidence also suggests that they were planning to invade regardless of whether or not there were WMDs.
Great summary 👏 I definitely have some cached thoughts about that era, but didn’t remember it that clearly. That WP page with the actual PowerPoint slides is wild.
Sure, but now you’ll be able to sway all those people who were on the fence about believing the lie until they see the “evidence”.
It’s already happening to some extent (I think still a small extent). I’m reminded of this Ryan Long video making fun of people who follow wars on Twitter. I can say the people who he’s making fun of are definitely real: I’ve met some of them. Their idea of figuring out a war or figuring out which side to support basically comes down to finding pictures of dead babies.
At 1:02 he specifically mentions people using AI for these images, which has definitely been cropping up here and there in Twitter discussions around Israel-Palestine.
And it almost certainly will. Perhaps has already.
And the flip side is also a problem: now legitimate evidence can be dismissed as “AI generated”.
Relevant:
https://twitter.com/bristowbailey/status/1625165718340640769?lang=en
Well that’s a new one lol, hadn’t thought of that. That’s another level of planning.
Exactly-- They’re two sides of the same coin. Being convinced by something that isn’t real is one type of error, but refusing to be convinced by something that is real is just as much of an error.
Some people are going to fall for just about everything. Others are going to be so apprehensive about falling for something that they never believe anything. I’m genuinely not sure which is worse.
We already saw that with nothing more than two words. Trump started the “fake news” craze, and now 33% of Americans dismiss anything that contradicts their views as fake news, without giving it any thought or evaluation. If a catch phrase is that powerful, imagine how much more powerful video and photography will be. Even in 2019 there was a deepfake floating around of Biden with a Gene Simmons tongue, licking his lips, and I personally know several people who thought it was real.
Great example. Yeah, I’ve had to educate family members about deepfakes because they didn’t even know that they were possible. This was on the back of some statement like “the only way to know for sure is to see video.” Uh… Sorry fam, I have some bad news…
Analog is the way to go now
It’s already happening. Adobe is selling them, but even if they weren’t, it’s not hard to do.
I think the worst of it is going to be places like Facebook, where people already fall for terrible and obvious Photoshop images. They won’t notice if there are mistakes, and as AI gets better there will be fewer mistakes to notice (DALL-E used to be awful at hands; it’s not so bad now). But even smart folks will fall for these.