A Florida woman was recently defrauded of $15,000 in an AI voice cloning scam. She received a phone call in which a cloned replica of her daughter's voice hysterically claimed to have hit a pregnant woman while texting and driving, and said she needed money for bail. A scammer then posed as the daughter's attorney and instructed the mother to withdraw cash and hand it over to a courier.
AI voice cloning technology needs only a short audio sample, often harvested from a target's social media posts, to convincingly mimic a person's voice. That realism lets scammers stage fake emergencies that exploit the emotional bonds between family members.
Law enforcement officials warn that these scams are becoming increasingly sophisticated. They advise verifying any emergency call independently, for example by hanging up and calling the relative back on a known number, and treating urgent requests for money with suspicion. Raising public awareness and enacting stricter AI regulations are crucial to combating these scams.