Banker's Compliance Consulting Blog

Deepfake Media Fraud Schemes

Written by Kevin Edwards | Dec 9, 2024 7:55:54 PM

Artificial intelligence (AI) is a hot topic right now, and while it is very useful in certain applications, it can also be an area of great risk. Financial institutions must be aware of the potential implications not only of how they are using AI but also of how their customers and even criminals are using it. FinCEN recently issued an Alert (FIN-2024-Alert004) on “Fraud Schemes Involving Deepfake Media Targeting Financial Institutions.”

Generative artificial intelligence is capable of producing highly realistic videos, pictures, audio, and text, which can make it appear that someone said or did something they did not. “Deepfake” refers to this type of highly realistic synthetic or manipulated media. Deepfake media can be very effective in carrying out fraud schemes, which often involve creating fake (or altering legitimate) identifying documents (e.g., driver’s licenses, passport cards, etc.) intended to exploit an institution’s identity verification process. Once an account is opened with a fraudulent identity, it can be used to further other fraud, such as online scams, check fraud, credit card fraud, loan fraud, unemployment fraud, and “authorized push payment fraud” (i.e., tricking someone into authorizing a payment).
