Deepfake CFO Scams in Finance: A Simple Verification Playbook

The age of deepfake fraud is here — and it targets finance first

Deepfake scams do not need to "hack" your systems to cause real damage. They just need to hack trust.

In the Arup case (reported by Hong Kong police and covered widely), a finance employee joined what looked like a normal video call with the company's CFO and colleagues. The request sounded urgent and confidential. The faces looked familiar. The employee transferred funds, and the company lost about $25 million USD.

This is the uncomfortable truth about deepfakes: when the video looks real, our brains stop asking the right questions.

What happened (in plain language)

Across the reporting, the pattern is consistent:

  • The employee received a suspicious email claiming to be from the UK-based CFO, requesting a confidential transaction.
  • The employee was initially skeptical and dismissed it as possible phishing.
  • The employee was then invited to a video conference.
  • On the call, multiple "familiar" faces appeared to be present — but they were deepfake imitations.
  • The employee complied with the request and transferred funds.
  • Only later did they confirm internally that the meeting and instructions were fake.

The key detail: this was not "reckless" behavior. It was a cautious person who got overridden by a convincing "social proof" moment.

Why this works: the "many familiar faces" trap

Most finance controls assume one of two things is true:

  1. A request coming from a senior leader is probably legitimate.
  2. A video call is strong proof of identity.

Deepfakes break both assumptions.

Attackers do not need to perfectly imitate one person if they can overwhelm you with credibility signals: multiple faces, multiple voices, a realistic meeting vibe, and a tight timeline. The goal is to make you feel like you are the only one slowing things down.

The real fix is not "spot the deepfake"

Yes, there are sometimes visual tells (odd lighting, lip-sync issues). But relying on humans to catch those details is not a long-term defense, especially in finance, crisis management, and incident response, where people are busy and the pressure is real.

A stronger approach is to change the rule from:

  • "If it looks like my CFO, it is my CFO."

to:

  • "If it is my CFO, they can pass our verification step."

That is exactly where VerifyHuman fits.

Where VerifyHuman helps in this exact use case

VerifyHuman is designed around a simple idea: trust is something you build once, then you can verify quickly when it matters.

That matters for finance teams because the highest-risk moments are predictable: approvals, bank detail changes, urgent payments, and anything "confidential".

1) Build trust once

VerifyHuman is not meant to magically identify a total stranger. Instead, it supports a trust-building process where two parties establish a trusted connection ahead of time. Once that relationship is established, future checks become fast and meaningful. In the Enterprise version, the organization establishes this trust beforehand.

2) Verify on video

If someone claims to be a leader on a video call, VerifyHuman adds a quick step that is hard to fake in real time: a short, time-bound verification check (for example, using a QR flow). If the person cannot complete it, you do not proceed, even if the face looks right.

3) Verify on audio-only

Deepfake scams do not always require video. Voice impersonation is rising fast, and scammers often use excuses like "my camera is broken". VerifyHuman supports audio-only verification using a simple one-time code shared between trusted parties. In the Enterprise version, the organization provisions each individual's one-time code.
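Again hedging: how VerifyHuman stores and checks these codes is not documented here. As an illustration only, with hypothetical identities and codes, the audio-only check reduces to comparing what the caller says against a pre-shared value, using a constant-time comparison so the check does not leak partial matches:

```python
import hmac

# Hypothetical pre-shared codes, provisioned by the organization ahead of time.
PRESHARED_CODES = {
    "cfo@example.com": "harbor-kite-47",
}

def verify_spoken_code(claimed_identity: str, spoken_code: str) -> bool:
    """Return True only if the caller's spoken code matches the pre-shared one.

    hmac.compare_digest runs in constant time, so an attacker cannot
    learn how many characters were correct from timing differences.
    """
    expected = PRESHARED_CODES.get(claimed_identity)
    if expected is None:
        return False
    return hmac.compare_digest(expected, spoken_code)
```

A voice clone can imitate a timbre, but it cannot know a secret that was agreed out of band, which is why this works even when the caller's voice sounds exactly right.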

A finance-friendly playbook

Here is a lightweight process that finance teams can adopt without turning every payment into a bureaucracy.

Step 1: Pre-establish trust

Do this once, calmly, when there is no pressure:

  • CFO ↔ Finance team
  • CEO ↔ Finance lead

Step 2: Add a "verification gate"

Trigger the gate when any of these are true:

  • Urgent or secret payment request
  • New bank details / changed bank details
  • Unusual amount or unusual destination
  • Pressure to bypass normal approvals

Step 3: Choose the right verification method

  • Video call: use the video verification step
  • Voice-only: use the one-time code verification step

Step 4: If verification fails, stop and switch channels

If they cannot verify, do not argue. Do not negotiate.

  • Pause the payment
  • Escalate internally
  • Confirm via an independent channel you already trust

The takeaway

Hong Kong authorities have linked deepfake-assisted fraud to broader identity abuse. It is a reminder that "seeing a face" is no longer the same as "knowing a person".

Want a simple "deepfake-safe" approval flow for your team?

Deepfake scams are not just a tech problem — they are a workflow problem. The goal is to make the safe action the easy action.

Learn more in our FAQ