SINGAPORE - As students and workers harness generative artificial intelligence tools for their studies and work, so, too, do crooks.
FraudGPT is marketed on the Dark Web as a tool to learn how to hack, and to write malware and malicious code. WormGPT, a similar tool sold on hacking forums, was marketed for crafting convincing phishing e-mails and malware.

Roughly 13 per cent of phishing scams in 2023 analysed by CSA showed signs that they were likely made with AI. CSA wrote: “Threat actors can use AI to scrape social media profiles and public websites for personally identifiable information, thereby increasing the speed and scale of highly personalised social engineering attacks.”
Another way the technology is deployed maliciously is in the generation of deepfake images to bypass biometric authentication. For example, to beat the use of facial recognition as a security feature, fraudsters turn to face-swopping apps.

Deepfake scams have come a long way since one of the earliest incidents surfaced in 2019, when The Wall Street Journal reported that the chief executive of a Britain-based energy firm was duped into transferring about US$243,000 to a scammer who had used AI to mimic his boss’ voice over the phone.

Deepfakes have only grown more convincing since then, said CSA. It cited a case in 2024 where an employee from a multinational firm was tricked into paying out about US$25 million to scammers who had used deepfake video to impersonate the company’s chief financial officer and other colleagues on a conference call.

With AI now driving cyber attacks, best to fight fire with fire
“Through machine learning and algorithms, AI can be trained to detect deepfakes, phishing e-mails and suspicious activities,” said CSA.