
Deepfake CFO dupes Hong Kong business into £20m payment


In one of the most elaborate cyber heists ever revealed, criminals used multiple fake videos and artificial intelligence-generated voices to trick a finance worker into making payments totalling £20m.

5th Feb 2024

Fraudsters used deepfake video technology to trick a finance worker at a multinational company into believing they were speaking with the chief financial officer – and ultimately into paying out more than HK$200m (£20m).

South China Morning Post reported that last month a finance clerk at the Hong Kong branch of the unnamed multinational was invited to a video conference via a message purporting to be from the company’s chief financial officer (CFO).

Speaking as part of a campaign to raise awareness of new scam tactics, Hong Kong police senior superintendent Baron Chan Shun-ching told journalists that the message initially raised the employee’s suspicions, as it mentioned secret transactions that needed to be carried out. However, once on the video conference the employee was joined by what looked and sounded like their colleagues, and was reassured enough to continue the call.

Unfortunately for the finance worker, they were the only real person on the video. The CFO and colleagues’ videos were reportedly created using genuine, publicly available video conferences from the past, with audio dubbed in.

“I believe the fraudster downloaded videos in advance and then used deepfake technology to add fake voices to use in the video conference,” said Chan.

Deepfakes use artificial intelligence (AI) tools to create highly convincing fake videos or audio recordings and enable one person’s likeness to be swapped for another.

The deepfake videos were pre-recorded and did not involve interaction with the victim, said Chan. He added that the criminals also used WhatsApp and email messages to communicate with the finance worker to add legitimacy to the con.

The finance worker eventually agreed to send 15 transactions worth a total of HK$200m (£20.2m) to five local bank accounts as instructed. 

The scam was only picked up when the employee spoke to the company’s head office six days later.

New deception tactics

“We want to alert the public to these new deception tactics. In the past, we would assume these scams would only involve two people in one-on-one situations, but we can see from this case that fraudsters are able to use AI technology in online meetings, so people must be vigilant even in meetings with lots of participants,” Chan said.

He added that the criminals’ strategy of not engaging directly with the victim beyond requesting a self-introduction made the scam more convincing.

CNN reports that the case is one of a number that the Hong Kong police are investigating where scammers have used deepfake technology to modify publicly available video and other footage to trick people or businesses out of money. At a press briefing last week, they revealed that they had made six arrests relating to similar cases – although no arrests have yet been made in the fake CFO case.

In the UK last year, finance expert Martin Lewis warned people not to fall victim to a deepfake video that appeared to show him endorsing a fake investment scheme supposedly backed by Elon Musk, calling the technology “absolutely terrifying”.

Is seeing really believing anymore?

“The old adage ‘seeing is believing’ is deeply ingrained in us, but our visual trust is quickly becoming our weakness,” Stephen Edington, chief product and technology officer at Dext, told AccountingWEB. “While we might doubt words, seeing an event in video form tends to have a stronger impact due to our reliance on visual cues.

“However, technology that can fabricate videos is now readily available at a relatively low cost, challenging our instincts to trust, especially if the content appears to come from a credible source or aligns with our biases,” he continued. 

“Pretty soon, confirming your humanity and identity is going to pose significant challenges. In my opinion, biometrics and cryptography are the only viable options to help establish digital trust. Future online identities and communications will be verified through biometric keys. This level of security will become crucial, and tech giants along with governments will play a key role in implementation. Unsigned digital content may soon be disregarded, while verified interactions become the norm. In the interim we are in for an interesting ride, believe it when you see it – in person.”
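Edington’s point about verified versus unsigned digital content can be illustrated with a minimal sketch. Real deployments would use public-key signatures bound to a verified identity, as he envisages; this hypothetical example instead uses Python’s standard-library HMAC with a shared secret, purely to show the principle that tampered or unsigned content fails verification:

```python
import hashlib
import hmac

# Hypothetical shared secret, for illustration only;
# real systems would bind per-identity public keys to the signer.
SHARED_KEY = b"example-shared-secret"

def sign(message, key=SHARED_KEY):
    """Attach an HMAC-SHA256 tag proving the message came from a key holder."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(message, tag, key=SHARED_KEY):
    """Constant-time check: any change to the message invalidates the tag."""
    return hmac.compare_digest(sign(message, key), tag)

tag = sign(b"Approve transfer of HK$200m")
verify(b"Approve transfer of HK$200m", tag)   # True: content matches the tag
verify(b"Approve transfer of HK$500m", tag)   # False: tampered content rejected
```

Under such a scheme, a payment instruction arriving without a valid tag would simply be disregarded, whatever the accompanying video looked like.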

Deepfake protection

Francis West, founder of cybersecurity firm Security Everywhere, told AccountingWEB that protecting businesses, especially those in the financial sector, from deepfake scams is becoming increasingly important. He offered a number of tips accountants and finance professionals may wish to consider to mitigate the risks associated with deepfake scams.

  1. Employee training and awareness
  • Conduct regular training sessions on recognising deepfake content.
  • Educate employees about the potential risks and consequences of falling for deepfake scams.
  • Encourage a culture of scepticism when receiving video or audio requests, especially those related to financial transactions.
  2. Two-factor authentication (2FA)
  • Implement 2FA for all financial transactions and access to sensitive systems.
  • Require the use of strong, unique passwords in combination with 2FA.
  3. Verification protocols
  • Establish a verification process for high-value transactions or requests received via video or voice.
  • Ensure that video or voice requests are confirmed through a separate channel, such as a phone call or face-to-face communication.
  4. Document and record
  • Keep detailed records of financial transactions and communications.
  • Archive video or audio interactions for future reference.
  5. Encryption and secure communication
  • Use secure communication channels and encryption for sensitive financial information.
  • Verify the authenticity of communication channels before sharing sensitive data.
  6. Identity verification
  • Implement strict identity verification procedures for clients or customers requesting financial transactions.
  • Use multi-step verification processes for significant transactions.
  7. Stay informed
  • Keep abreast of the latest developments in deepfake technology and cybersecurity.
  • Adapt security measures accordingly to stay ahead of potential threats.
  8. Third-party security audits
  • Regularly assess the cybersecurity measures of third-party vendors or service providers.
  • Ensure they have robust safeguards against deepfake-related fraud.
  9. Emergency response plan
  • Develop a clear and well-documented plan for responding to suspected deepfake scams.
  • Assign roles and responsibilities for addressing security incidents promptly.
  10. Collaboration with cybersecurity experts
  • Consult with cybersecurity experts to evaluate and enhance your organisation’s defences against deepfake threats.
  • Consider employing deep learning-based AI tools to detect deepfake content.
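The 2FA recommendation above can be sketched in code. As an illustrative example (not something West prescribes), the time-based one-time passwords generated by most authenticator apps follow RFC 6238: an HMAC-SHA1 over a 30-second time counter, dynamically truncated to six digits, so that an approval code is only valid for a brief window:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, timestamp=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, dynamically truncated."""
    key = base64.b32decode(secret_b32, casefold=True)
    now = time.time() if timestamp is None else timestamp
    counter = struct.pack(">Q", int(now // step))   # 8-byte big-endian counter
    digest = hmac.new(key, counter, "sha1").digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# The RFC 6238 test secret ("12345678901234567890" in base32) at time 59s
# yields the documented code ending 287082.
totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", timestamp=59)
```

A payment system checking such a code before releasing funds would have forced the Hong Kong clerk’s "colleagues" to produce something a pre-recorded video cannot.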

If you are a UK reader and think you may have been a victim of fraud or cyber crime, report it to Action Fraud or, if you live in Scotland, to Police Scotland by calling 101.

Replies (16)


By FactChecker
05th Feb 2024 23:54

"If you are a UK reader and think you may have been a victim of fraud or cyber crime, report it to Action Fraud at" ... but don't expect anything to happen beyond you spending more time filling in forms.

I could explain more but would have to reveal more in public than I'm prepared to do. Suffice to say that they only regard you as a 'victim' if you have suffered direct personal loss of money .. and are not prepared even to log a crime if you (the person trying to report it) are not a 'victim'.

[Imagine that you are on the ball and notice that criminals have stolen (some of) your identity details and managed to scam money out of a bank or two with whom you have no prior connection ... so you get everything cancelled without suffering personal loss, which means Action Fraud won't even log your report (that you've assiduously completed to help them)!]

None of which detracts from the ghastly enjoyment of reading the article ... but does make me wonder about the suggested preventative measures - particularly when we reach the desperation level of "employing deep learning-based AI tools to detect deepfake content"!

Thanks (9)
Replying to FactChecker:
By JustAnotherUser
06th Feb 2024 08:18

"If you are a UK reader and think you may have been a victim of fraud or cyber crime, report it to Action Fraud at" ... but don't expect anything to happen beyond you spending more time filling in forms.

The singular best use case for the action fraud process is that until you have called them and got a reference number, companies will hold you liable in the event of fraud.

In the event of fraud resulting in debt collection in your name, call Action Fraud, get a reference number and then call the company chasing the debt, give them the reference number and simply ask "are you still holding me liable?" Once they have a reference number they will remove you from the process.

This is the singular best piece of advice to give once you are the victim of fraud involving debt in your name.

"Consider employing deep learning-based AI tools to detect deepfake content." - not something your average company should do, no, but for anyone with a large enough public presence, where face and name alone can sway judgement, this is a real problem and the issue will only grow ... what's the saying? 'A lie can travel halfway around the world while the truth is still putting on its shoes'

Thanks (6)
Replying to JustAnotherUser:
By FactChecker
06th Feb 2024 12:53

I'll take your word regarding "the singular best piece of advice to give once you are the victim of fraud" ... but can only repeat that this was not my experience (last year).

* First sign of a 'problem' = massive pile of post (my name, my address) that contained 'welcome packs' to new accounts with banks that I've never used (and including cards, PINs, overdraft confirmations and so on)!
* Agitated (and lengthy) phone-calls from yours truly to each of those banks' fraud lines who without exception ... apologised / refused to divulge what data of mine might have been compromised (citing GDPR!) / confirmed that the accounts were closed with immediate effect / confirmed that I would not be held liable for any fraud that might have been committed (which again they wouldn't divulge) / promised to put those two confirmations in writing that day / asked me to 'log' the fraud with Action Fraud.
* Next few days brought a diminishing flow of letters from the banks (one even thanked 'me' for providing my voice to their recognition software and told me to use it to borrow more whenever I wanted!) ... thankfully alongside the promised letters excusing me from any involvement or liability.

But my point was that phoning Action Fraud resulted in a lengthy chat with a charming 'old school' agent who took copious notes and pointed me at the online form - which I duly completed. And the only thing I ever received from them subsequently? An email thanking me for my time but explaining that they were 'unable' to record it because I hadn't personally suffered a financial loss!

Thanks (7)
By Justin Bryant
06th Feb 2024 09:01

This is basically the fault of the banks in allowing the criminal recipient accounts in the 1st place without proper checks (a bit like CH). They seem more concerned about closing non-woke accounts of the likes of NF etc.

Thanks (3)
Replying to Justin Bryant:
Tom Herbert
By Tom Herbert
06th Feb 2024 09:29

That's what you're taking away from this story Justin?

Thanks (7)
Replying to TomHerbert:
By Justin Bryant
06th Feb 2024 10:19

Yes, as that's the bigger story i.e. any money transfer fraud relies on banks not doing their AML job properly basically. Banks should be massively fined for allowing these dodgy accounts in the first place, since without such dodgy accounts (or proper anti-fraud measures) such frauds would not be possible in the first place.

It's a bit like when Princess Diana died. The bigger story then was that all backseat passengers should wear a seat belt.

Thanks (6)
Replying to Justin Bryant:
By paul.benny
06th Feb 2024 12:59

Agree with Justin. It does appear that banks place more emphasis on ticking the compliance boxes rather than prevention of fraud.

Scary as the reported instance is, it relies on publicly available video, and since the fake video was supported by emails, it appears that the email account of the real CFO had been compromised. Ultimately, though the best precautions are professional scepticism by staff and good processes that should at least slow frauds like this.

And agree too about rear seat belts.

Thanks (5)
Replying to Justin Bryant:
By Justin Bryant
07th Feb 2024 11:35

A rare success story here where the bank was found culpable for the scam and had to compensate the victim:

If we had more of that sort of thing the banks would be incentivised to prevent such scams in the 1st place (rather than being preoccupied with their woke etc. agenda).

Thanks (5)
Replying to Justin Bryant:
By Justin Bryant
08th Feb 2024 12:11

Another recent example of this bigger story is here:

Thanks (0)
Replying to Justin Bryant:
By johnjenkins
08th Feb 2024 09:40

I've been saying this for years.

Thanks (2)
Replying to Justin Bryant:
By johnjenkins
08th Feb 2024 09:46

Client tried to open a bank account (limited company). They didn't agree that the SIC codes fully matched what the business (in their eyes) was doing so told client if he didn't change codes they couldn't open an account. Yet he has a personal account with them.
Of course the scammers are linked to banks. It's not just one account, the scammers have lists which (IMV) could only come from inside information.

Thanks (2)
By Runagood Team
08th Feb 2024 09:55

I foresee this forcing finance department employees back to the office for face-to-face communication

Thanks (3)
Replying to Runagood Team:
By Nick Graves
08th Feb 2024 10:21

Notice how the 'recommendations' all involve more pointless tech.

There's a far easier way to verify a deepfake video; get an attack of Tourette's or try a non-sequitur and see what is the response.

I like to chat away to telephone scammers before laughing at them, so they have less time to try it on with someone else.

I also enjoyed that video of the guy who stalled by sounding moronic/distracted, whilst reverse-hacking the scammer's computer and deleting everything on it.

Thanks (1)
By Rob Swan
08th Feb 2024 10:35

Not at all surprised.
Unfortunately, the 'Human' is easily the most hackable device on the planet. And also one of the most stupid - no insult intended, just a fact. That's why this happens. I do feel sorry for the 'hacked' individual, but they must bear some responsibility, after all, they were suspicious...
If you run your own business - accountant, bookkeeper or client - and you (or an employee) don't eyeball and formally question/validate EVERY transaction (or bank rec.), you'll inevitably have to deal with this type of situation.
The accounting (software) industry's move towards AI in accounting (ie, invoice processing) is, for this very reason, a valid concern I think, and best avoided.
Given the current trend in world affairs this is only going to increase. How ready are you? And how are you advising your clients? #GoFundCriminalsAndDictatorships

Thanks (0)
By indomitable
08th Feb 2024 13:58

How can one person who appears to be a clerk have the authority to transfer money without at least one other signatory?

It appears to me if the company concerned had strong internal controls this may not have happened

In the long dim distant past I worked as an FD in a largish multinational business

No finance clerk was allowed to send any payment anywhere

All payments above a very low level (£100) had to be physically sent via two signatories.

Thanks (1)
Replying to indomitable:
By johnjenkins
08th Feb 2024 14:08

Perhaps the clerk was in on it.

Thanks (0)