AI Scams Are Getting Smarter—Are You? A Brief Guide to Outsmarting Deepfakes

  • Martin Fridson, CFA
  • 3 hours ago
  • 3 min read

Artificial Intelligence promises huge advances in economic efficiency, but it has a dark side. We are not talking here about future visions of a world controlled by sentient robots. The hazard is already present in the form of audio deepfakes that can inflict ruinous financial harm.


You may have heard of the grandparent scam. The target of the fraud receives an email purporting to be from a grandchild who has been kidnapped, arrested, or injured and desperately needs money. The intended victim may not be taken in, knowing that emails can easily be faked. But what if the intended victim instead receives a phone call in which the grandchild’s recognizable voice carries on a conversation detailing the predicament that requires immediate cash?


The Wall Street Journal recently published an account of a woman receiving a call on her cellphone and hearing her daughter tell her “something awful” had happened and that she needed help. A man then came on the call and explained that the woman’s daughter had witnessed his drug deal, screamed at the sight of guns, and scared away his buyers, leaving him with a lot of unsold cocaine. He said that if the woman ever wanted to see her daughter again, she would have to send him $1,000 in Mexico via Western Union. The woman complied and sent another $1,000 after the man made further threats. Only later did she discover that it was a hoax and that her daughter had never been on the phone call.


Recent advances in generative artificial intelligence (AI), which mimics people’s images and voices, make this sort of deepfake fraud possible. Just in the past few months the necessary computing costs have dropped dramatically. According to Ben Colman, co-founder and CEO of Reality Defender, a deepfake detection company, setting up a voice-cloning scam formerly required cloud computing, but now it can be accomplished on an ordinary laptop or cellphone.


Little Protection by AI Programs


Disturbingly, most of the leading AI voice-cloning programs have no effective barriers to prevent criminals from impersonating people without their consent. A Consumer Reports investigation examined the safeguards of the six leading publicly available AI voice-cloning tools; five of the six proved easy to bypass. Government regulation in this area is light.


As a result, anyone who registers for an account can perpetrate a con by uploading audio of an individual’s speech taken from a source such as a TikTok or YouTube video. The fraudster gets the AI service to imitate the person’s voice. A synthetic audio file is created and the swindle is underway.


Precautions You Can Take


Your risk from AI voice-cloning software extends beyond getting hoodwinked by somebody else’s faked voice. Swindlers could, for example, use a copy of your voice to instruct financial institutions to transfer funds into their own accounts. A key defense is to limit the availability of audio recordings of your voice. For instance, if you use voicemail to take phone messages, opt for the carrier’s prerecorded greeting instead of recording your own. Avoid putting your recorded voice on social media sites.


Clones of your voice may be used in conjunction with other stolen personal information to perpetrate a fraud. It is therefore imperative to keep details such as your full name, address, and financial information off public platforms. Other standard security measures are also advisable:


  • Use lower resolution or watermarks on photos and videos that you share, as high-quality ones can be used to create deepfakes.

  • Enable two-factor authentication (2FA).

  • Avoid clicking on links or attachments from unknown sources.

  • Install a well-attested antivirus program.

  • Stay informed about new types of deepfake scams.

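The 2FA point deserves a word on mechanics. Most authenticator apps generate time-based one-time passwords (TOTP, defined in RFC 6238), so a fraudster who clones your voice still cannot produce the rotating six-digit code your bank requests. As a minimal sketch of how those codes are derived, using only Python’s standard library (the secret in the example is the RFC’s published test value, not a real key):

```python
import hashlib
import hmac
import struct
import time


def totp(secret: bytes, timestamp=None, step=30, digits=6) -> str:
    """Derive a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    if timestamp is None:
        timestamp = int(time.time())
    # The moving factor is the number of elapsed time steps.
    counter = timestamp // step
    msg = struct.pack(">Q", counter)  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation: low nibble of the last byte picks an offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


# RFC 6238 test secret; at t=59 seconds the 6-digit code is 287082.
print(totp(b"12345678901234567890", timestamp=59))
```

Because the code depends on a shared secret and the current time window, it expires in seconds and cannot be extracted from any amount of recorded speech.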

AI technology is advancing rapidly, providing immense new benefits but also increasing the risk of fraud. Staying alert to the latest scams, and taking a few simple precautions, remains your best defense.


© 2020 by Lehmann Livian Fridson Advisors, LLC