The rapid advance of artificial intelligence brings numerous advantages, but it also has a dark side: human greed!
The latest trend on the crime market is to use AI-powered voice cloning to deceive unsuspecting victims over the phone and extract money or confidential information from them.
The technology’s potential for abuse is increasingly outpacing the public’s ability to keep up with the tricks of bad actors, as several recent fraud cases from Canada show. But it is not only criminals who use the technology for their own benefit: companies are threatening entire professions by synthesising voice actors’ voices with AI, often without adequate compensation.
Grandchild trick 2.0: Fraudsters use artificial intelligence to imitate the voices of relatives and deceive unsuspecting victims.
In Canada, there have recently been cases of this grandchild trick 2.0, in which fraudsters use artificial intelligence to imitate the voices of relatives and trick unsuspecting victims out of their money. One woman was convinced by a phone call that her supposed grandson urgently needed money. The voice was so persuasive that she withdrew the maximum amount from her bank; fortunately, a bank employee noticed the attempted fraud and prevented anything worse. But not all victims are so lucky. Another Canadian couple, convinced by an AI-generated voice that their son had been involved in an accident with US diplomats, transferred a substantial sum without suspecting fraud. AI’s near-perfect imitation of voices makes it very difficult for unsuspecting victims to recognise such AI-assisted scams and protect themselves against them.
How does the AI get the voices?
According to Hany Farid, a professor at UC Berkeley who specialises in digital forensics, cloning a voice was still very complex a year or two ago. Now, 30 seconds of audio from social media platforms such as Facebook or TikTok is enough to imitate a voice almost perfectly.
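To illustrate how low the barrier has become, here is a minimal sketch using the open-source Coqui TTS library and its XTTS v2 model, which can clone a voice from a reference clip of only a few seconds. The file names are placeholders, and the exact API may differ between library versions; this shows the general workflow, not a definitive implementation.

```python
# Minimal sketch of few-shot voice cloning with the open-source Coqui TTS
# library (XTTS v2). File paths are placeholders; the API may vary by version.
from TTS.api import TTS

# Download and load the multilingual XTTS v2 model (large download on first run).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A short reference clip of the target speaker is all the model needs:
# a few seconds of clean speech, e.g. taken from a public social media video.
tts.tts_to_file(
    text="This sentence was synthesised from a short voice sample.",
    speaker_wav="reference_clip.wav",  # placeholder: the reference voice sample
    language="en",
    file_path="cloned_voice.wav",      # output audio in the cloned voice
)
```

That a convincing imitation now takes a consumer laptop and a publicly documented library, rather than a specialist lab, is precisely what makes the scams described above so scalable.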
But companies, too, are not afraid to use AI voice cloning technology for questionable, profit-driven practices.
One example is the startup ElevenLabs, which offers an AI voice synthesis service for as little as $5 a month. The results are so convincing that one journalist even used it to break into his own bank account.
But that’s not all: AI has even spawned a new genre of memes imitating President Joe Biden. ElevenLabs’ voice cloning has only been available since 2022, which shows both the enormous potential AI offers for voice imitation and how quickly the technology is developing.
The impact of artificial intelligence is not limited to fraudsters; it also has consequences for the world of work.
The voice acting industry in particular is suffering from the use of AI. According to a report by Vice, there is a worrying trend in the US industry: actors are signing contracts that allow clients to synthesise their voices with AI and use them for as long as they want, without paying additional compensation. Some actors are even forced to accept such clauses in order to be hired at all.
The AI clauses in these contracts are often buried in the fine print, so many voice actors may have signed them unwittingly, only to find their voices synthesised in AI apps and on websites without their consent.
This affects the entire industry and shows that the use of artificial intelligence can have unexpected effects on the world of work. Many well-known voice actors have spoken out and expressed their outrage at the use of AI to imitate their voices. It is important that this industry, and other professions affected by AI, retain control over their work and ensure that it is fairly compensated and respected.
We need to recognise how much the technologies of our time serve not only our good intentions but also our darkest impulses. There is an urgent need for ethical debate and a global consensus on digital ethics to protect us from the tricks of bad actors and from commercial exploitation. One of our authors, Thomas-Gabriel Rüdiger, has examined this issue and offers clear recommendations.