AI may be a buzzword on Wall Street, but on the West Coast it's at the center of Hollywood's biggest labor dispute in more than 50 years. Among those warning about the technology's potential to cause harm is British actor and author Stephen Fry, who told an audience at the CogX Festival in London on Thursday about his personal experience of having his identity digitally cloned without his permission.
"I'm a proud member of [actors' union SAG-AFTRA], as you know we've been on strike for three months now. And one of the burning issues is AI," he said.
Actors' union SAG-AFTRA, which has around 160,000 members, went on strike last month over pay, working conditions, and concerns related to the use of AI in the film industry. It joined the Writers Guild of America, a union representing thousands of Hollywood writers, which went on strike in early May, marking the industry's biggest shutdown in more than six decades.
A key sticking point for actors on strike is the possibility that studios could use AI to digitally replicate their image without fairly compensating them for the use of their likeness.
Speaking at a news conference as the strike was announced, union president Fran Drescher said AI "poses an existential threat" to creative industries, and said actors needed protection from having "their identity and talent exploited without consent and pay."
During his speech at the CogX Festival on Thursday, Fry played the audience a clip of an AI system mimicking his voice to narrate a historical documentary.
"I said not one word of that. It was a machine. Yes, it shocked me," he said. "They used my reading of the seven volumes of the Harry Potter books, and from that dataset an AI of my voice was created and it made that new narration."
Fry, who has appeared in movies including Gosford Park, V for Vendetta, and The Hitchhiker's Guide to the Galaxy, is the narrator of the British Harry Potter audiobooks, while actor Jim Dale narrated the American version of the series.
"What you heard was not the result of a mash-up. This is from a flexible artificial voice, where the words are modulated to fit the meaning of each sentence," Fry told the audience.
"It could therefore have me read anything from a call to storm parliament to hard porn, all without my knowledge and without my permission. And this, what you just heard, was done without my knowledge. So I heard about this, I sent it to my agents on both sides of the Atlantic, and they went ballistic. They had no idea such a thing was possible."
Fry added that when he discovered his voice was being used in projects without his consent, he saw it as just the beginning of an emerging threat to creative talent, warning his angry agents: "You ain't seen nothing yet." "This is audio," he said he told them. "It won't be long until full deepfake videos are just as convincing."
As AI technology has advanced, doctored footage of celebrities and world leaders, known as deepfakes, has been circulating with increasing frequency, prompting warnings from experts about artificial intelligence risks. Fry warned on Thursday that those technologies only had further to go.
"We have to think about [AI] like the first automobile: impressive but not the finished article," he said, noting that when cars were invented no one could have envisioned how widespread they are today.
"Tech is not a noun, it is a verb, it is always moving," he said. "What we have now is not what will be. When it comes to AI models, what we have now will advance at a faster rate than any technology we have ever seen. One thing we can all agree on: it's a f***ing weird time to be alive."
Not the first
Fry isn't the only famous actor to publicly voice concerns about AI and its place in the film industry.
At a U.K. rally held in support of the SAG-AFTRA strike over the summer, Emmy-winning Succession star Brian Cox shared an anecdote about a friend in the industry who had been told "in no uncertain terms" that a studio would keep his image and do what they liked with it.
"That is a completely unacceptable position," Cox said. "And that is the position that we should be really fighting against, because that is the worst aspect. The wages are one thing, but the worst aspect is the whole idea of AI and what AI can do to us."
Oscar winner Matthew McConaughey told Salesforce CEO Marc Benioff during a panel event at this year's Dreamforce conference that he had concerns about the rise of AI in Hollywood.
"We have a real chance, if we are irresponsible, of cannibalizing ourselves and creating this digital god that we'll bow to, and we'll all of a sudden become tools of this tool," he said.
Meanwhile, Star Trek and Mission Impossible star Simon Pegg has called AI "worrying" for actors.
"We're looking at being replaced in some ways," he said at the rally in London in July. "We have to be compensated and we have to have some say in how [our image is] used. I don't want to turn up in an advert for something I disagree with… I want to be able to hang on to my image, and voice, and know where it's going."
A spokesperson for the Alliance of Motion Picture and Television Producers (AMPTP), the entertainment industryâs official collective bargaining representative, was not available for comment when contacted by Fortune.
The much, much, much more concerning aspect of voice cloning technology is that it will be used to scam people on a massive scale.
Imagine you get a call at 4am from a loved one who tells you that they are in an emergency situation and had to borrow a phone to call you. They beg you to Venmo some money to a stranger's account so that they can get their car fixed/get a plane ticket/pay someone back for giving them a lift/etc.
You recognize your loved one's voice. They can respond to your questions (because chatbot AI). They know details about your life (because social media). It's the middle of the night. You're scared and not thinking clearly.
This technology all exists TODAY. In 10 or 20 years it'll be so terrifyingly sophisticated that even the most wary people will be vulnerable to it.
Easy solution: don't have any loved ones. Checkmate, scam artists.
"Hey it's ur son I've been arrested in Mexico"
"Well good then."
My uncle was recently arrested in another state. We had a similar reaction.
Someone tried to scam my grandpa with that. He told "me" to enjoy rotting in jail, then called me up to ask how jail was.
"It's about time they caught you! Oh wait, they don't want the reward money, do they? Ah fuck it, they can't unarrest you, tell them to get fucked!"
Or if you do, make sure none of 'em are dumb enough to rely on "cash apps" like Venmo. Even Zelle, through our bank, is suspicious as shit.
That's why I do like Gilbert Gottfried and do two voices: one in public and one for friends and family.
It gets confusing when we dine outside.
Wtf does that guy sound like at home? Posh mid-Atlantic accent or some shit? I'm so curious now.
Nothing, he's long dead.
He passed away last year. I personally wouldnât call that long dead.
There's a Howard Stern clip of Gilbert's "normal" voice on his voicemail.
They can't get me if I live in a hole. Not a nasty, dirty, wet hole, filled with the ends of worms and an oozy smell, nor yet a dry, bare, sandy hole with nothing in it to sit down on or to eat: but a hobbit-hole, and that means comfort.
The solution that EVERYBODY needs to learn for something like that is to hang up and call them back using the contact you have in your phone. They can afford 10 seconds while you do that if they're calling you for money. And if it isn't them calling for money, well, sorry for waking you up, Frank, but an AI was posing as you asking for cash.
That, and a family or per-person verification word or protocol or something.
"Clumsy…"
"Draconiquist!"
Thank you, your suggestion has been added to the training data.
/s
"Oh my god, agent_flouder, I was just in a car accident and they need the bank info to process the co-insurance so I can get the organ transplant that is expiring in minutes! I've lost a lot of blood, have a concussion, and have forgotten our code word. PLEASE don't do this right now or this might be the last time you hear from me…"
In the capitalist hellscape that is the US, that isn't that far-fetched, and with emotions running high, I doubt many would refuse. On the other hand, I can see a news article that reads, "Man lets daughter die by refusing hospital critical information needed for transplant."
Lawl, they don't do insurance shit for emergencies like needing an organ or blood immediately; they deal with that shit after the operation.
Yeah, but this is America. And do you think the average person knows that, will remember it when their loved one calls them crying, and will have the temerity to actually refuse when there's a time constraint?
Good point.
Unless they are calling from the hospital, police station, borrowed a cell phone after a car accident, etc.
Then you call THAT number. Station's non-emergency number, hospital, etc.
yet another reason to never answer the phone
Ah, that'll be the equivalent scam for our age that spam emails are for the age before.
Easy peasy. Tell them okay. Then hang up and call that loved one using your record of his/her number. Confirm.
In the referenced scenario they had to borrow a phone to call you.
Presumably their phone is out of battery, broken, stolen, or they're in another country without service.
So that means the "real" person definitely won't answer their phone, right? Not all that useful for trying to confirm someone is who they say they are.
Yeah, it has the flaw that at that hour the real person might not answer, though (if they shut down the phone or mute it or whatever). But yeah, that is the common approach.
It's not perfect, but it's something.
Deviant had this fascinating/awful video about this kind of situation: https://www.youtube.com/watch?v=6ihrGNGesfI
Here is a (really) top infosec expert saying that when someone you know is in jail, you absolutely have to turn off all your call filtering and spam filtering, because who knows what shitty system the facility they've been moved to today is using to route calls.
Well, since it's a scam, it means that the phone is really not broken or out of commission.
Not readily believing anything you hear from something that is unusual or out of the ordinary will save you in the future.
Call just to check. If the phone is unreachable, call someone close to that person: a wife, a son/daughter. Just think: if they have a wife or kids, why call you in the first place?
Sometimes itâs good to be not overly trusting.
deleted by creator
Or even better, conference them in to that call. Then get them to debate each other about which one is real.
Unfortunately, that is already happening… https://www.independent.co.uk/tech/ai-voice-clone-scam-kidnapping-b2319083.html
Scam them out of what?
90% of the world will be unemployed and fighting for whatever scraps of food are grown between the constant flooding and fires we had 100 years to prevent but it wasn't profitable to.
We will have to learn to live entirely without factual information as every form of communication becomes hopelessly compromised by corporations, governments and billionaire extremists.
You won't even be able to trust the people you meet in meatspace. You think Fox News addicts are fucked up? Wait until every piece of entertainment is propaganda that's been personalised just for you.
And when it inevitably turns to war and you're put in charge of the big red button, will you even care if the order to press it has been deepfaked by a death cult?
Unemployment and petty scams are small fry. With this technology, we can end the world.
I've had friends fall for email scams. We're from Canada, and I remember one being that another friend of ours was stuck in Wales and needed a bank transfer. I knew it was a scam, but a few of my friends were worried. I said that I live in the UK now and can take a train to Wales if you REALLY want. They were still panicking and saying they should do it just in case. I'm like, you can't be serious!
So yeah, it can already be bad, and with ChatGPT they can pass the Turing test. All of our friends will probably test each other on our memories. "Tell me the name of your ex-gf, and which year and how you broke up again?"
This type of scam has existed and, unfortunately, worked without voice cloning and social media in Eastern Europe for years.
deleted by creator
Easy enough to teach people to just hang up and call their friend.
deleted by creator
Most people don't know what their loved ones sound like on the phone. This is already a scam, and you should never believe someone calling you like that. You can ask them something only you two know, or just tell them to call the police and that you'll meet them at the police department or hospital or whatever. Never give out credit card info etc. over the phone. Nobody would ever do that in a legit situation.
This has been happening for at least a year
Physical two-factor authentication. Have a code word for your kids to tell you, or for you to tell them.