Deepfake scammer walks off with $25 million in first-of-its-kind AI heist: Hong Kong firm tricked by simulation of multiple real people in video chat, including voices.

  • @[email protected]
    9 months ago

    Acting senior superintendent Baron Chan Shun-ching of the Hong Kong police emphasized the novelty of this scam, noting that it was the first instance in Hong Kong where victims were deceived in a multi-person video conference setting. He pointed out the scammer’s strategy of not engaging directly with the victim beyond requesting a self-introduction, which made the scam more convincing.

    The police have offered tips for verifying the authenticity of individuals in video calls, such as asking them to move their heads or answer questions that confirm their identity, especially when money transfer requests are involved. Another potential solution to deepfake scams in corporate environments is to equip every employee with a cryptographic key pair, establishing trust by signing public keys at in-person meetings. Later, in remote communications, those signed keys could be used to authenticate parties within the meeting.
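    One way to read that suggestion is as in-person key exchange followed by a challenge-response check during the call. Below is a minimal sketch in Python using the third-party `cryptography` package; the identities, variable names, and trust-store layout are invented for illustration, not taken from the article.

    ```python
    import os

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
        Ed25519PublicKey,
    )
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

    # Step 1, in person: the CFO generates a key pair and hands over the public
    # key; the employee records it in a local trust store keyed by identity.
    cfo_private_key = Ed25519PrivateKey.generate()
    trusted_keys = {
        "cfo@example-corp": cfo_private_key.public_key().public_bytes(
            Encoding.Raw, PublicFormat.Raw
        )
    }

    # Step 2, during a remote meeting: the employee sends a one-time random
    # challenge, and the person claiming to be the CFO signs it.
    challenge = os.urandom(32)
    signature = cfo_private_key.sign(challenge)

    # Step 3: verify the signature against the public key recorded in person.
    def authenticate(identity: str, challenge: bytes, signature: bytes) -> bool:
        try:
            public_key = Ed25519PublicKey.from_public_bytes(trusted_keys[identity])
            public_key.verify(signature, challenge)
            return True
        except (KeyError, InvalidSignature):
            return False

    print(authenticate("cfo@example-corp", challenge, signature))       # True
    print(authenticate("cfo@example-corp", os.urandom(32), signature))  # False
    ```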

    If you’re a rank-and-file employee in a virtual meeting with your company’s top brass, it probably won’t occur to you to ask them to turn their heads to see whether the video glitches. The scammers can simply act offended and ignore the request, and chances are you’ll fear for your job and apologize profusely.

    The key exchange mechanism suggested by the article sounds impractical, because the employees in Hong Kong likely never meet the UK-based CFO in person. Maybe the corporate video conferencing system should maintain a company-wide key registry, but if the scammers managed to hack in and insert their own key, or to steal an executive’s video conferencing account, then the whole scheme is probably moot.
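    To make the registry idea, and its weakness, concrete: a sketch, again in Python with the `cryptography` package, assuming the conferencing system signs each (identity, public key) entry with a single registry key that every client pins. The registry name, identities, and payload format are all invented for the example.

    ```python
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
        Ed25519PublicKey,
    )
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

    def raw(public_key: Ed25519PublicKey) -> bytes:
        return public_key.public_bytes(Encoding.Raw, PublicFormat.Raw)

    # The conferencing system holds one registry signing key; every client pins
    # the matching public key, so nobody has to meet anybody in person.
    registry_key = Ed25519PrivateKey.generate()
    registry_public_key = registry_key.public_key()

    # Publishing an entry: the registry signs "identity || public key".
    cfo_key = Ed25519PrivateKey.generate()
    cfo_entry = b"cfo@example-corp" + b"\x00" + raw(cfo_key.public_key())
    cfo_entry_sig = registry_key.sign(cfo_entry)

    # Joining a meeting: a client checks that the key a participant presents
    # is one the registry vouches for.
    def key_is_registered(identity: bytes, presented_key: bytes, sig: bytes) -> bool:
        try:
            registry_public_key.verify(sig, identity + b"\x00" + presented_key)
            return True
        except InvalidSignature:
            return False

    print(key_is_registered(b"cfo@example-corp", raw(cfo_key.public_key()), cfo_entry_sig))  # True

    # The caveat raised above: if attackers can get the registry to sign their
    # own key, or simply log in with a stolen executive account, the check
    # still passes and the cryptography does nothing to stop the scam.
    attacker_key = Ed25519PrivateKey.generate()
    forged_entry = b"cfo@example-corp" + b"\x00" + raw(attacker_key.public_key())
    forged_sig = registry_key.sign(forged_entry)  # a compromised registry would do this
    print(key_is_registered(b"cfo@example-corp", raw(attacker_key.public_key()), forged_sig))  # True
    ```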