Faked photographs have been around for a long time, but faked voices are a newer and more sinister tool for fraudsters.
Take this example:
In 2019, the CEO of a UK energy firm received a phone call from, he believed, his boss at the firm’s German parent company. The boss asked him to transfer €220,000 to a Hungarian supplier immediately, insisting the payment was urgent and necessary, and even promised the firm would be reimbursed later. Convinced by the urgency and the familiarity of his boss’s voice, the CEO transferred the funds.
It later transpired that it was not his boss on the other end of the line at all, but a fraudster using AI-driven deepfake technology to imitate the boss’s voice. The company had been swindled in one of the first publicly reported scams to use deepfake audio — and there will no doubt be more to come.
It is important to review your policies and practices now, and to establish checks and balances that do not rely on recognising a familiar voice to authorise transactions — for example, confirming any urgent payment request through a separate channel before funds are released.