
AI mimics CEO voice to scam UK energy firm out of £200k

over 4 years ago by Lucy Cinder


Criminals used artificial intelligence-based software to mimic a chief executive’s voice and deceive the CEO of an unnamed UK-based energy firm into making a fraudulent transfer of £200,000, according to a report in the Wall Street Journal (WSJ).

WSJ’s Catherine Stupp reported: "The CEO of a UK-based energy firm thought he was speaking on the phone with his boss, the chief executive of the firm’s German parent company, who asked him to send the funds to a Hungarian supplier. The caller said the request was urgent, directing the executive to pay within an hour, according to the company’s insurance firm, Euler Hermes Group SA. Euler Hermes declined to name the victim companies."

Rüdiger Kirsch, a fraud expert at Euler Hermes, a subsidiary of the Munich-based financial services company Allianz SE, was reported as saying that the UK CEO recognised his boss’s slight German accent and the melody of his voice on the phone. The CEO made the requested transfer to the Hungarian supplier and was then contacted again with assurances that the transfer was being reimbursed immediately.

In an email to SC Media UK, Jake Moore, a cyber-security specialist at ESET, commented: "I predict that we will see a huge rise in machine-learned cyber-crimes in the near future. We have already seen deepfakes imitate celebrities and public figures in video format, but these have taken around 17 hours of footage to create convincingly. Faking voices takes fewer recordings to produce. As computing power increases, we are starting to see these become even easier to create, which paints a scary picture ahead.

"To reduce risks it is imperative not only to make people aware that such imitations are possible now but also to include verification techniques before any money is transferred. Two-factor authentication is another powerful, inexpensive and simple technique that adds an extra layer of security to protect your money going into a rogue account. When being called about a money transfer, particularly of large sums, check the number calling and ask to call back. Do so using a number in your address book, rather than hitting the "call back" option in your call history."

Stu Sjouwerman, CEO at KnowBe4, commented in his blog: "This is essentially the next step up in the escalation of using social engineering in a case of CEO Fraud. You need to step your employees through new-school security awareness training to prevent human errors like this."

Relevant to both this story and news reports about the Chinese deepfake app Zao, Matt Aldridge, senior solutions architect at Webroot, emailed SC Media UK to add: "Deepfake-style technology – where completely believable video and/or audio of a person can be generated or swapped out from other media – is extremely dangerous. It is already being used in highly effective, targeted spearphishing campaigns, and this will only continue as the technology allows the stakes to get higher. The scope for disinformation at a nation-state level is also deeply concerning. Fake news will become infinitely more difficult to differentiate from real stories and real videos. It is a magic bullet for any authoritarian state, organisation or terrorist group that wishes to recruit members or sway opinions to achieve its own goals.

"A future of widespread distrust is coming. We may think that we’re having a video call with a close colleague or a loved one, but the other party is actually an imposter. We need to start preparing for this now and understand how we can ensure that our communications are all real and secure."

Source: SC Media UK

Industry: Cyber Security & Artificial Intelligence
