‘Cybercriminals are using AI to level up online attacks’ expert warns as phone calls from loved ones now easier to fake | 51D388U | 2024-02-28 15:08:01



ARTIFICIAL intelligence is helping cybercriminals create even more convincing scams that can drain your bank account.

We spoke to James McQuiggan, a security awareness expert at KnowBe4, about the rising dangers of cybercrime and how you can avoid them.

'Cybercriminals are using AI to level up online attacks' expert warns as phone calls from loved ones now easier to fake
Getty
AI is said to be helping cybercriminals create far more sophisticated scams

"AI is significantly enhancing the sophistication of on-line scams and social engineering and growing the sophistication of romance scams and deepfake voice scams.

"With the power to generate highly real looking and personalised content, scammers can now create convincing deepfake audio recordings in real-time with audio and video from social media sources, all in an try to control the audio to deceive their targets into believing they're taking a good friend or liked one," McQuiggan stated in an interview with The U.S. Sun.

"This motion poses a big problem for individuals and organizations alike, as traditional strategies of detecting fraud might not be enough in the face of AI-generated scams.

"Cybercriminals are utilizing AI to degree up their on-line attack methods."

The ability to clone someone's voice and use it in a phone call scam is now so easy that sophisticated criminals only need about three seconds of audio to do it.

Even a short three-second clip can recreate your voice with 70% accuracy, according to experts at McAfee.

Fortunately, there are still some telltale signs to look for when you're corresponding with an AI scammer.

"Look for unnatural or repetitive language patterns, unusually fast responses, and a lack of understanding or empathy in the conversation," McQuiggan advised.

"Additionally, asking specific, open-ended questions that require contextual understanding can help reveal whether you're engaging with a human or an AI.

"The cybercriminals could be responding in real-time or with staged messages."


"By asking random questions or asking for a predetermined code phrase, cybercriminals can simply detect that they are utilizing deepfake applied sciences to attack the victim," he added.

The expert also flagged deepfake videos as a pressing concern.

"The rise in AI-generated movies is poised to gasoline the prevalence of deepfake scams.

"As AI turns into more adept at creating reasonable and compelling movies in real-time, the potential for malicious actors to take advantage of this know-how for fraudulent purposes grows.

"It presents a pressing concern for people and businesses as the danger of falling victim to stylish deepfake scams escalates," he warned.


