A New External Fraud Threat

Imagine being invited to a virtual meeting during which your boss and their boss instruct you to initiate one or more payments to specified new payees. Even if this is a departure from typical operations, hearing your management and seeing them on screen may be all it takes to make their request a priority. Unfortunately, you could be falling into a trap; the entire meeting could be fake.

In the evolving world of artificial intelligence (AI), you can no longer rely on voices, videos, or images to verify that a request is legitimate. One Hong Kong company learned this lesson the hard way, becoming the victim of a sophisticated scam. Keep reading to learn more, including what you should do today to help protect your organization.

What Happened

In the case of the Hong Kong company, fraudsters used deepfake technology to impersonate the CFO during a video conference. At the time, the targeted employee did not know anything was amiss and ultimately sent more than $25M out the door. Mary Schaeffer of AP Now released a 9-minute podcast episode on this story, sharing the red flags and offering five tips.

About Deepfakes

The simplest description I have found comes from a CBS News article: Deepfakes are videos and images that have been digitally manipulated to depict people saying and doing things that never happened.

Chances are you have seen examples of deepfakes, even if you did not realize it. They can be creative, such as Kendrick Lamar using deepfakes in his music videos, but they can also be detrimental, as we have seen. When I contacted Mary Schaeffer about her podcast episode, she admitted, “I almost threw up when I first read the story, realizing what the implications were for the payor community.” So, what should you do?

What to Do

As the AP Now podcast indicated, spread the word about this new fraud tactic. Deepfakes do not threaten only accounts payable or other accounting and finance operations. Your cardholders could become targets, too, though the dollar amounts involved would presumably be far less than $25M.

Specific Action Items

  • Update the content in your card program training so that deepfakes are addressed on a recurring basis. It is not enough to send one email alert and expect employees to remember. Make this part of your annual refresher training.

    • Describe potential red flags for employees to watch for.

    • Instruct them on what to do if they question the legitimacy of a request.

  • Test employees’ knowledge. As part of your auditing efforts, try to simulate a deepfake virtual meeting; for example, have a senior leader make an unusual request to see how employees respond.

  • Ask your card issuer about the liability associated with a cardholder falling for a scam like this. Will the usual card protections prevent a monetary loss for your organization?

Upcoming Webinar

Speaking of training, I will be delivering a related webinar for AP Now in March. I encourage all card program administrators and managers to register and attend.

Final Thought

Deepfakes can affect us personally, too. While individuals certainly are not as lucrative targets as organizations, fraudsters can (and do) still use these tactics for their own gain. Make sure the vulnerable people (e.g., seniors, young people) in your life are aware.

About the Author

Blog post author Lynn Larson, CPCP, launched Recharged Education in 2014. She has more than 20 years of commercial card experience, and her mission is to make industry education readily accessible to all.