2023-05-29
After numerous reports of malicious voice-cloning activity, the Albuquerque Journal reports that family safe words may offer a way to thwart AI voice-cloning scams. Scammers have increasingly used AI voice cloning to target vulnerable individuals, and until now there has been no effective way to combat these scams.

AI voice cloning uses recorded audio of a person's voice to create a synthetic copy of that voice. The clone can then be used in a scam to fool someone into believing they are speaking with a real family member. Family safe words address this problem: each family member memorizes a unique safe word. If a caller claims to be a family member, the person on the other end of the line asks for that safe word to verify the caller's identity. The safe word should be reserved for this purpose and not used in other communications, since it only protects against voice-cloning scams if it stays secret. While not 100% foolproof, the method adds an extra layer of security for families worried about being targeted by these kinds of scams.

Protect Your Family from AI Voice-Cloning Scams: How Family Safe Words Can Help - Albuquerque Journal
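The verification step described above amounts to a simple challenge-response check. A minimal sketch in Python, where every name and safe word is hypothetical and purely illustrative (in practice a family would memorize the words, not store them in software):

```python
import hmac

# Hypothetical per-family-member safe words -- illustrative values only.
# In real life these would be memorized, never written down or stored.
SAFE_WORDS = {
    "mom": "bluejay-thimble",
    "son": "gravel-lantern",
}

def verify_caller(claimed_identity: str, spoken_word: str) -> bool:
    """Return True only if the spoken word matches the safe word
    agreed on with the claimed family member."""
    expected = SAFE_WORDS.get(claimed_identity)
    if expected is None:
        return False  # unknown identity: fail closed
    # Constant-time comparison, the same habit used for password checks.
    return hmac.compare_digest(expected, spoken_word)

# A cloned voice can mimic tone and cadence, but it cannot know the word:
print(verify_caller("son", "gravel-lantern"))   # True
print(verify_caller("son", "it's me, grandma"))  # False
```

The key property is that the check depends on shared secret knowledge rather than on how the voice sounds, which is exactly what a voice clone cannot reproduce.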