
Voice cloning cybercrime on the rise | #cybercrime | #computerhacker

You can also listen to this podcast here.

JEREMY MAGGS: Let’s end the programme with this story. Using artificial intelligence tools to clone voices has introduced a novel realm of risk for both companies and individuals. Take a listen to this. The immense potential of generative AI is being exploited by cybercriminals, who have harnessed it for malicious purposes, creating convincing deepfakes and perpetrating what can only be termed unnervingly realistic voice scams. Are you worried?

Well, we should be. The scammers are incredibly clever, says Stephen Osler, who’s co-founder and business development director at Nclose, which is a cybersecurity company. He joins us now on the programme. Stephen, how prevalent then have voice cloning scams become?

STEPHEN OSLER: Hi, Jeremy. Look, we’ve seen them happen a little from a consumer perspective, and the numbers are definitely increasing. We have encountered one or two instances where corporates have been defrauded using voice scams, and I think those numbers are just going to gradually increase.

Read: AI scam calls imitating familiar voices are a growing problem

JEREMY MAGGS: So it’s something to be concerned about. How does it work? How do they manifest themselves?

STEPHEN OSLER: So essentially what they’re doing is using applications that are easily downloaded [from] a traditional app store. They’re using a voice recording to clone a particular voice, and then they’re using that as part of, for all intents and purposes, their fraudulent attack or their cyberattack.

So what we’re generally finding is that they’re using the recorded voice to lend credibility to their end goal.

So in the cyberattacks or fraudulent attacks we see, all these threat actors are trying to do is gain additional credibility in order to make somebody either click on a link or pay a fraudulent transaction, and a voice note, as opposed to a WhatsApp message or an email, provides that credibility.

JEREMY MAGGS: Ja, that’s absolutely right. That’s the point I want to make: a voice note always has a degree more authenticity, doesn’t it?

STEPHEN OSLER: That’s absolutely it, it’s authenticity. I think that’s the problem that we see. Authenticity is probably the one thing that everybody questions when they’re doing a transaction that is slightly sketchy, and these fraudsters or cybercriminals are really trying to leverage generative AI or deepfake cloning technology to take that next step.

Read: Nine British banks sign up to new AI tool for tackling scams

JEREMY MAGGS: Alright. So having made us all very concerned, you’ll also tell me that often this is targeted at C-suite executives. So what kind of countermeasures are immediately available? What strategies should companies be adopting?

STEPHEN OSLER: Jeremy, we saw a lot of fraudulent financial attacks against particular industries where email accounts were compromised, and we found the fraudsters were sending out emails with invoices carrying changed bank details. Obviously, we are seeing that pivot because companies have introduced good governance.

I think what needs to happen, specifically for organisations, is that they really need to make sure those governance procedures are well bedded down, because in the grand scheme of things, that is the only thing with the necessary checks and balances in place that could mitigate this type of fraudulent attack.

These cybercriminals will more than likely pivot once they realise that the governance procedures are in place.

But you should not be paying any invoices just on receiving a voice note from your C-level executive. I think that’s the big trick.

JEREMY MAGGS: But it’s not just aimed at companies, it’s individuals as well, and voices being extracted from platforms that we use every single day, like Facebook Messenger and WhatsApp.

STEPHEN OSLER: Absolutely. We’ve seen them happen in two cases. Firstly, people are using them to scam individuals by claiming that their relatives have been hijacked or are being held for ransom, and getting people to pay ransoms for their release. We’ve seen a few instances of that.

We’ve seen instances where a user is trying to sell or buy something on Facebook Marketplace and the fraudster is using voice notes to gain credibility, to either get them to pay a deposit or an amount, or to get them to meet somewhere. So really, these guys are being quite tricky in their approach.

I think the big thing we need to be careful of: if it seems too good to be true, it generally is.

I think the one easy way to mitigate this type of attack is just to pick up the phone and call the individual you believe sent the voice note.

JEREMY MAGGS: But never underestimate our naivety in this space as well. Often, we just do it because we do it, I guess. The current technology is also starting to recreate the tone or idiosyncrasies of an individual’s voice, and that makes it more worrying, doesn’t it?

STEPHEN OSLER: It absolutely makes it more worrying, and not just the tone or the idiosyncrasies. Here in South Africa we have a multitude of different languages and ways of talking, and these applications are able to mimic that.

One thing: if you listen very carefully, you’ll find a lot of these voice notes are slightly distorted, or that the information the voice note is articulating back to you isn’t always 100% correct. That’s because these fraudsters are using AI such as ChatGPT to create the actual conversation piece and the cloning technology to generate the audio, and sometimes what the AI explains is not 100% accurate.

JEREMY MAGGS: In any case, though, you’ve given us an enormous amount of food for thought. Stephen Osler, thank you very much indeed, co-founder and business development director at Nclose.
