TLDR:
– Voice cloning through artificial intelligence (AI) is being used for scams.
– A report revealed that 47% of surveyed Indians have either been a victim or knew someone who had fallen prey to an AI-generated voice scam.
– India is a major target for AI voice clone scams, topping the list of countries with the most victims of such scams.
– Scammers rely on creating a sense of urgency to exploit people’s vulnerability.
– The global market for AI voice cloning applications is estimated to reach almost $5 billion by 2032, growing at a CAGR estimated between 15% and 40%.
A few years ago, voice cloning through Artificial Intelligence (AI) was just a phenomenon of mild amusement. AI-generated songs mimicking famous artistes like Drake and Ariana Grande were floating around online. However, fears around the technology were realised when AI voice cloning scams burgeoned. A report titled ‘The Artificial Imposter’, published in May last year, revealed that 47% of surveyed Indians have either been a victim of an AI-generated voice scam or knew someone who had fallen prey to one.
The numbers are almost twice the global average of 25%. In fact, India topped the list with the most victims of AI voice scams.
While these tools aren’t perfect, scammers rely on creating a sense of urgency to glide over the flaws. The report also found that 86% of Indians share their voice data online or via voice notes at least once a week, which makes these tools especially potent.
Once a scammer finds an audio clip of an individual, all it takes is uploading the clip to an online program that can replicate the voice accurately, barring some intonations. A host of these applications exist online, with popular ones including Murf, Resemble and Speechify.
The speed and easy access of these tools have set alarm bells ringing. In November last year, the U.S. Federal Trade Commission (FTC) launched a Voice Cloning Challenge, asking the public to send in ideas to detect, evaluate and monitor cloned voices. The FTC is also considering the adoption of a recently proposed Impersonation Rule that would help deter deceptive voice cloning.
The global market for AI voice cloning applications stood at $1.2 billion in 2022 and is estimated to reach almost $5 billion by 2032, growing at a CAGR estimated between 15% and 40%.
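As a quick sanity check on those figures, growth from $1.2 billion to roughly $5 billion over the ten years from 2022 to 2032 implies a compound annual growth rate near the low end of that range. A minimal Python sketch of the calculation (the dollar figures are the estimates cited above; the standard CAGR formula is assumed):

```python
# Implied CAGR from the market estimates cited above:
# $1.2B in 2022 growing to ~$5B by 2032 (10 years).
start, end, years = 1.2, 5.0, 10

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 15.3% per year
```

This puts the implied rate just above 15% per year, consistent with the lower bound of the reported range; the higher estimates presumably assume a larger 2032 market.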