The use of deep fake video and audio technologies could become a major cyber threat to businesses within the next two years, according to cyber analytics specialist CyberCube.
The ability to create realistic audio and video fakes using AI and machine learning has grown steadily, and the growing dependence on video-based communication has expanded the supply of data from which photo-realistic simulations of individuals can be built. These simulations can then be used to influence and manipulate people.
In addition, ‘mouth mapping’ -- a technology created by the University of Washington -- can be used to mimic the movement of the human mouth during speech with extreme accuracy. This complements existing deep fake video and audio technologies.
CyberCube’s head of cyber security strategy, Darren Thomson, said: “As the availability of personal information increases online, criminals are investing in technology to exploit this trend. New and emerging social engineering techniques like deep fake video and audio will fundamentally change the cyber threat landscape and are becoming both technically feasible and economically viable for criminal organisations of all sizes.
“Imagine a scenario in which a video of Elon Musk giving insider trading tips goes viral -- only it’s not the real Elon Musk. Or a politician announces a new policy in a video clip, but once again, it’s not real. We’ve already seen these deep fake videos used in political campaigns; it’s only a matter of time before criminals apply the same technique to businesses and wealthy private individuals. It could be as simple as a faked voicemail from a senior manager instructing staff to make a fraudulent payment or move funds to an account set up by a hacker.”
The report warns insurers that there is little they can do to combat the development of deep fake technologies but stresses that risk selection will become increasingly important for cyber underwriters.
Insurers should also consider the potential of deep fake technology to create large losses, as it could be used in an attempt to destabilise a political system or a financial market.
In March 2019, cyber criminals used AI-based software to impersonate a chief executive’s voice to demand the fraudulent transfer of US$243,000.
Thomson added: “There is no silver bullet that will translate into zero losses. However, underwriters should still try to understand how a given risk stacks up to information security frameworks. Training employees to be prepared for deep fake attacks will also be important.”