Deepfake CFO dupes Hong Kong business into £20m payment
In one of the most elaborate cyber heists ever revealed, criminals used multiple fake videos and artificial intelligence-generated voices to trick a finance worker into making payments totalling £20m.
Fraudsters used deepfake video technology to trick a finance worker at a multinational company into believing they were speaking with the chief financial officer – and ultimately into paying out more than HK$200m (£20m).
The South China Morning Post reported that last month a finance clerk at the Hong Kong branch of the unnamed multinational was invited to a video conference via a message purporting to be from the company’s chief financial officer (CFO).
Speaking as part of a campaign to raise awareness of new scam tactics, Hong Kong police senior superintendent Baron Chan Shun-ching told journalists that the message initially raised the anonymous employee’s suspicions, as it mentioned secret transactions that needed to be carried out. However, once on the video conference, the employee was joined by what looked and sounded like their colleagues and was reassured enough to continue the call.
Unfortunately for the finance worker, they were the only real person on the video. The CFO and colleagues’ videos were reportedly created using genuine, publicly available video conferences from the past, with audio dubbed in.
“I believe the fraudster downloaded videos in advance and then used deepfake technology to add fake voices to use in the video conference,” said Chan.
Deepfakes use artificial intelligence (AI) tools to create highly convincing fake videos or audio recordings and enable one person’s likeness to be swapped for another.
The deepfake videos were pre-recorded and did not involve interaction with the victim, said Chan. He added that the criminals also used WhatsApp and email messages to communicate with the finance worker to add legitimacy to the con.
The finance worker eventually agreed to send 15 transactions worth a total of HK$200m (£20.2m) to five local bank accounts as instructed.
The scam was only picked up when the employee spoke to the company’s head office six days later.
New deception tactics
“We want to alert the public to these new deception tactics. In the past, we would assume these scams would only involve two people in one-on-one situations, but we can see from this case that fraudsters are able to use AI technology in online meetings, so people must be vigilant even in meetings with lots of participants,” Chan said.
He added that the criminals’ strategy of not engaging directly with the victim beyond requesting a self-introduction made the scam more convincing.
CNN reports that the case is one of a number the Hong Kong police are investigating in which scammers have used deepfake technology to modify publicly available video and other footage to trick people or businesses out of money. At a press briefing last week, police revealed that they had made six arrests relating to similar cases, although no arrests have yet been made in the fake CFO case.
In the UK last year, finance expert Martin Lewis warned people not to fall victim to a deepfake video that appeared to show him endorsing a fake Elon Musk investment scheme, calling the technology “absolutely terrifying”.
Is seeing really believing anymore?
“The old adage ‘seeing is believing’ is deeply ingrained in us, but our visual trust is quickly becoming our weakness,” Stephen Edington, chief product and technology officer at Dext, told AccountingWEB. “While we might doubt words, seeing an event in video form tends to have a stronger impact due to our reliance on visual cues.
“However, technology that can fabricate videos is now readily available at a relatively low cost, challenging our instincts to trust, especially if the content appears to come from a credible source or aligns with our biases,” he continued.
“Pretty soon, confirming your humanity and identity is going to pose significant challenges. In my opinion, biometrics and cryptography are the only viable options to help establish digital trust. Future online identities and communications will be verified through biometric keys. This level of security will become crucial, and tech giants along with governments will play a key role in implementation. Unsigned digital content may soon be disregarded, while verified interactions become the norm. In the interim we are in for an interesting ride, believe it when you see it – in person.”
Francis West, founder of cybersecurity firm Security Everywhere, told AccountingWEB that protecting businesses, especially those in the financial sector, from deepfake scams is becoming increasingly important. He provided a few tips that accountants and finance professionals may wish to consider to mitigate the risks associated with deepfake scams.
- Employee training and awareness
  - Conduct regular training sessions on recognising deepfake content.
  - Educate employees about the potential risks and consequences of falling for deepfake scams.
  - Encourage a culture of scepticism when receiving video or audio requests, especially those related to financial transactions.
- Two-factor authentication (2FA)
  - Implement 2FA for all financial transactions and access to sensitive systems.
  - Require the use of strong, unique passwords in combination with 2FA.
- Verification protocols
  - Establish a verification process for high-value transactions or requests received via video or voice.
  - Ensure that video or voice requests are confirmed through a separate channel, such as a phone call or face-to-face communication.
- Document and record
  - Keep detailed records of financial transactions and communications.
  - Archive video or audio interactions for future reference.
- Encryption and secure communication
  - Use secure communication channels and encryption for sensitive financial information.
  - Verify the authenticity of communication channels before sharing sensitive data.
- Identity verification
  - Implement strict identity verification procedures for clients or customers requesting financial transactions.
  - Use multi-step verification processes for significant transactions.
- Stay informed
  - Keep abreast of the latest developments in deepfake technology and cybersecurity.
  - Adapt security measures accordingly to stay ahead of potential threats.
- Third-party security audits
  - Regularly assess the cybersecurity measures of third-party vendors or service providers.
  - Ensure they have robust safeguards against deepfake-related fraud.
- Emergency response plan
  - Develop a clear and well-documented plan for responding to suspected deepfake scams.
  - Assign roles and responsibilities for addressing security incidents promptly.
- Collaboration with cybersecurity experts
  - Consult with cybersecurity experts to evaluate and enhance your organisation’s defences against deepfake threats.
  - Consider employing deep learning-based AI tools to detect deepfake content.
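To illustrate the out-of-band verification and 2FA tips above, here is a minimal sketch of how a finance team might gate high-value payments behind a one-time code delivered over a separate channel (for example, a call back to a known phone number). The function names, the 6-digit code format and the 60-second window are illustrative assumptions, not anything described in the article; a real deployment would use an established TOTP/2FA product rather than hand-rolled code.

```python
# Hypothetical sketch: release a payment only after out-of-band confirmation.
# Uses only the Python standard library (hmac, hashlib, time, secrets).
import hmac
import hashlib
import time
import secrets


def confirmation_code(secret: bytes, window=None) -> str:
    """Derive a 6-digit one-time code from a shared secret and the current
    60-second time window (TOTP-style, HMAC-SHA256)."""
    if window is None:
        window = int(time.time()) // 60
    digest = hmac.new(secret, str(window).encode(), hashlib.sha256).digest()
    return f"{int.from_bytes(digest[:4], 'big') % 1_000_000:06d}"


def approve_payment(amount: float, threshold: float,
                    secret: bytes, code_from_other_channel: str) -> bool:
    """Release a payment only if it falls below the high-value threshold,
    or the approver has supplied a valid code obtained via a separate
    channel (e.g. a phone call back to a number on file - never a number
    given during the video call itself)."""
    if amount < threshold:
        return True
    expected = confirmation_code(secret)
    # Constant-time comparison to avoid leaking the code via timing.
    return hmac.compare_digest(expected, code_from_other_channel)
```

In this sketch, a request like the HK$200m transfer would exceed the threshold and stall until the approver read back a code that only the genuine counterparty, reached through an independent channel, could supply. The key design point is that the confirmation path must not be controllable from inside the meeting where the request was made.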
If you are a UK reader and think you may have been a victim of fraud or cyber crime, report it to Action Fraud at www.actionfraud.police.uk or, if you live in Scotland, to Police Scotland by calling 101.