

Scammers trick firm out of $26 million by impersonating senior executives using deepfakes

Law enforcement agencies are scrambling to keep up with generative artificial intelligence

SCAMMERS TRICKED A multinational firm out of some $26 million (€30.4 million) by impersonating senior executives using deepfake technology, Hong Kong police said, in one of the first cases of its kind in the city.

Law enforcement agencies are scrambling to keep up with generative artificial intelligence, which experts say holds potential for disinformation and misuse – such as deepfake videos showing people mouthing things they never said.

A company employee in the Chinese finance hub received “video conference calls from someone posing as senior officers of the company requesting to transfer money to designated bank accounts”, police told AFP.

Police received a report of the incident on 29 January, at which point some HK$200 million (€30.4 million) had already been lost via 15 transfers.

“Investigations are still ongoing and no arrest has been made so far,” police said, without disclosing the company’s name.

The victim was working in the finance department, and the scammers pretended to be the firm’s UK-based chief financial officer, according to Hong Kong media reports.

Acting Senior Superintendent Baron Chan said the video conference call involved multiple participants, but everyone on the call apart from the victim was an impersonation.

“Scammers found publicly available video and audio of the impersonation targets via YouTube, then used deepfake technology to emulate their voices… to lure the victim to follow their instructions,” Chan told reporters.

The deepfake videos were pre-recorded and did not involve dialogue or interaction with the victim, he added. 
