
AI to amplify global ransomware threat, warns GCHQ

The National Cyber Security Centre (NCSC), a division of GCHQ, has raised the alarm about the escalating global ransomware threat driven by the integration of artificial intelligence (AI) in cyber attacks.

In a report released today (24 January), the NCSC underscores the immediate impact of AI on cyber threats, asserting that it is currently being used in malicious cyber activities and is poised to substantially escalate both the volume and impact of cyber attacks, including ransomware, in the next two years.

Among the key findings, the report highlights AI’s role in lowering the entry barrier for novice cybercriminals, hackers-for-hire, and hacktivists. This facilitates more effective access and information gathering operations by relatively unskilled threat actors.

The enhanced access, coupled with AI’s improved victim targeting capabilities, is expected to significantly increase the global ransomware threat in the near future.

Ransomware remains the most pressing cyber threat for UK organisations and businesses, with cyber criminals adapting their tactics to gain efficiencies and maximise profits.

The UK Government has allocated £2.6bn under its Cyber Security Strategy to enhance the country’s resilience against cyber threats, with the NCSC and private industry incorporating AI to bolster cyber security through advanced threat detection and security-by-design.

The Bletchley Declaration, a landmark announcement made at the UK’s AI Safety Summit in November 2023, outlines a global effort to manage the risks associated with frontier AI and ensure its safe and responsible development.

In the UK, the AI sector, already employing 50,000 people and contributing £3.7bn to the economy, is a focal point of government initiatives to ensure economic growth aligns with technological advancements.

Lindy Cameron, CEO of the NCSC, emphasised the need to harness AI’s potential while managing its risks. She stated: “The emergent use of AI in cyber attacks is evolutionary, not revolutionary, enhancing existing threats like ransomware but not transforming the risk landscape in the near term.”

Analysis from the National Crime Agency (NCA) reveals that cybercriminals are already developing criminal Generative AI (GenAI) and offering GenAI-as-a-service, making improved capability accessible to those willing to pay.

However, the NCSC’s report highlights that the effectiveness of GenAI models is limited by both the quantity and quality of the data on which they are trained.

The report echoes warnings from a joint report by the NCSC and NCA in September 2023, describing the professionalisation of the ransomware ecosystem and a shift towards the ransomware-as-a-service model.

By GlobalData

According to the NCA, ransomware is unlikely to be replaced by another method of cybercrime in 2024 due to its financial rewards and established business model.

James Babbage, director general for threats at the NCA, commented, “Ransomware continues to be a national security threat. As this report shows, the threat is likely to increase in the coming years due to advancements in AI and the exploitation of this technology by cybercriminals.”

The report also outlines additional ways in which AI will affect the effectiveness of cyber operations over the next two years, including its use in social engineering and malware development.
