
AI voice cloning scams target loved ones and threaten your finances


Published on 2025-04-29 10:21:00 - WTVF

  • Imagine getting a call from a loved one: terrified, desperate, begging for help. But what if that voice wasn't real? Scammers have found a way to use AI to steal your money.

AI voice cloning scams increasingly target individuals by mimicking the voices of loved ones to deceive them into sending money. In one recent incident, a mother received a call from someone who sounded exactly like her daughter, claiming to be in distress and needing funds. These scams exploit the trust and emotional bonds between family members, using easily accessible AI tools to clone a voice from short audio clips found on social media. Consumer Reports warns that the scams are becoming more sophisticated and harder to detect, and urges families to establish a secret code word to verify a caller's identity during suspicious calls. It also advises against sharing personal information or sending money without confirming the caller's identity through other means, and recommends reporting any such incidents to the Federal Trade Commission.

Read the Full WTVF Article at:
[ https://www.newschannel5.com/money/consumer/consumer-reports/ai-voice-cloning-scams-target-loved-ones-and-threaten-your-finances ]