
Deepfake technology likely to play major future role in cyber crime

Byte-size Bulletin by Rachael Brown in Security on Aug 3, 2021


According to experts, deepfake technologies, including voice cloning, will play a major role in future cyber criminal activity.

Voice cloning uses artificial intelligence (AI) software to generate an adaptable, synthetic copy of a person’s voice.

These programs need as little as a ten-minute recording of someone speaking to generate an accurate copy. From that sample, the AI software learns and replicates the speaker's timbre, pitch, intensity and thousands of other distinctive vocal features.
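To make that idea concrete, the short Python sketch below shows the kind of measurements voice-cloning models learn from a recording, specifically a pitch estimate and a timbre profile. It is purely illustrative rather than a cloning tool, and it assumes a local recording named speaker_sample.wav plus the open-source librosa and numpy libraries.

# Illustrative only: extract two of the "distinctive vocal features"
# mentioned above (pitch and timbre) from a speech recording.
# Assumes a local file "speaker_sample.wav"; requires librosa and numpy.
import librosa
import numpy as np

# Load up to ten minutes of speech at a standard sample rate
audio, sr = librosa.load("speaker_sample.wav", sr=16000, duration=600)

# Timbre: mel-frequency cepstral coefficients summarise the spectral shape of the voice
mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)

# Pitch: estimate the fundamental frequency frame by frame
f0, voiced_flag, voiced_prob = librosa.pyin(
    audio,
    fmin=librosa.note_to_hz("C2"),
    fmax=librosa.note_to_hz("C6"),
    sr=sr,
)

print("Mean timbre profile (13 MFCCs):", np.round(mfcc.mean(axis=1), 2))
print("Median pitch (Hz):", np.round(np.nanmedian(f0), 1))

A real cloning system goes much further, training a neural model on features like these so it can synthesise entirely new speech in the same voice.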

Much has been said about the commercial and political repercussions of deepfake technology, with many raising the alarm over how it will feed into our current ‘post-truth’ climate by allowing users to propagate manufactured realities and events.

Less attention has been paid to the role voice cloning will play in cyber crime, even though that role is already becoming apparent.

Cyber security expert Eddy Bobritsky, CEO of Israeli cyber security firm Minerva Labs, says voice cloning constitutes a "huge security risk" because it can be used to compromise a means of communication, talking on the phone, that until now was seen as indisputably secure, especially compared to emailing or texting.

"But until now, talking on the phone with someone you trust and know well was one of the most common ways to ensure you are indeed familiar with the person." Mr Bobritsky explained. 

“If a boss phones their employee asking for sensitive information, and the employee recognises the voice, the immediate response is to do as asked. It's a path for a lot of cybercrimes." 

This exact scenario played out in 2019, when the CEO of a UK-based energy firm was tricked over the phone by a fraudster impersonating his German boss, the CEO of the firm’s parent company. Believing he was authorising an urgent business transaction, the energy firm’s CEO transferred €220,000 to the fraudster.

The criminal use of voice cloning is not just a one-off anecdote, either. In 2020, Pindrop, a company that creates security software for call centres, reported a 350% rise in voice fraud between 2013 and 2017, primarily targeting card issuers, insurers, brokerages, credit unions and banks.

A fraudster armed with AI that can accurately clone a victim's voice has an unprecedented ability to access their private information, thanks to our growing reliance on audio technologies. Many people, for example, now access their banking services by reciting a voice key over the phone, a process that can easily be hijacked by voice cloning.

Used in this way, voice cloning operates as a more sophisticated form of phishing, and one likely to enjoy higher success rates than its email and text counterparts.

As the tools to create fakes improve, the chances of criminals using AI-based voice tech to mimic our voices and use them against us are heightened.

That means businesses need to stay aware of emerging cyber security risks and invest in multidimensional IT security that accounts for the myriad methods cyber criminals use to exploit others.

Photo by Kelsey Curtis on Unsplash
