CyberCube warns of major potential deep fake risks

The use of deep fake video and audio technologies could become a major cyber threat to businesses within the next two years, according to cyber analytics specialist CyberCube.

The ability to create realistic audio and video fakes using AI and machine learning has grown steadily. Growing dependence on video-based communication has expanded the supply of data from which to build photo-realistic simulations of individuals, which can then be used to influence and manipulate people.

In addition, ‘mouth mapping’ -- a technology created by the University of Washington -- can be used to mimic the movement of the human mouth during speech with extreme accuracy. This complements existing deep fake video and audio technologies.

CyberCube’s head of cyber security strategy, Darren Thomson, said: “As the availability of personal information increases online, criminals are investing in technology to exploit this trend. New and emerging social engineering techniques like deep fake video and audio will fundamentally change the cyber threat landscape and are becoming both technically feasible and economically viable for criminal organisations of all sizes.

“Imagine a scenario in which a video of Elon Musk giving insider trading tips goes viral -- only it’s not the real Elon Musk. Or a politician announces a new policy in a video clip, but once again, it’s not real. We’ve already seen these deep fake videos used in political campaigns; it’s only a matter of time before criminals apply the same technique to businesses and wealthy private individuals. It could be as simple as a faked voicemail from a senior manager instructing staff to make a fraudulent payment or move funds to an account set up by a hacker.”

The report warns insurers that there is little they can do to combat the development of deep fake technologies but stresses that risk selection will become increasingly important for cyber underwriters.

Insurers should also consider the potential of deep fake technology to create large losses as it could be used in an attempt to destabilise a political system or a financial market.

In March 2019, cyber criminals used AI-based software to impersonate a chief executive’s voice to demand the fraudulent transfer of US$243,000.

Thomson added: “There is no silver bullet that will translate into zero losses. However, underwriters should still try to understand how a given risk stacks up to information security frameworks. Training employees to be prepared for deep fake attacks will also be important.”
