
4 Latest AI-Based Fraud Schemes, Already Claiming Many Victims
Jakarta, CNBC Indonesia – Last year saw a wave of fraud involving deepfakes, voice cloning, and AI-based phishing scams. It turns out, however, that all of that was merely a "warm-up" for cybercriminals.
The year 2025 is predicted to be the era when AI-based fraud becomes the main force draining funds from fintech platforms and bank accounts, because criminals now wield new weapons that are highly sophisticated and difficult to detect.
A recent Forbes report states that AI technology is no longer used only for productive purposes; it has also become a new weapon for global fraud syndicates.
Here are four AI fraud modes that the public and corporations must be aware of:
1. Deepfakes & AI in Business Email Compromise (BEC) Attacks
BEC fraud is evolving. Cybercriminals are using AI to create highly convincing fake video and audio. In Hong Kong, criminals impersonated a company executive on a fake Zoom call and persuaded employees to transfer almost IDR 480 billion.
More strikingly, 53% of accounting professionals in the US admitted to having been targeted by similar attacks, and 40% of BEC emails are now generated entirely by AI.
2. Romance Fraud Chatbots
Romance fraud is becoming increasingly sophisticated. Instead of humans, autonomous AI chatbots are now used to seduce victims. With accent-free conversation and a natural flow, victims find it difficult to tell bots from humans.
Such schemes have surfaced on social media, and one was even exposed in a video leaked by a Nigerian scammer.
3. “Pig Butchering” Using Mass AI
The investment fraud scheme disguised as romance or business, known as "pig butchering", is now being carried out at scale using AI.
With tools like "Instagram Automatic Fans", mass messages are sent to lure victims, for example: "My friend recommended you. How are you?"
Fraudsters are now also using deepfakes for video calls and voice cloning to make them more convincing.
4. Deepfake Blackmail Targets Executives and Officials
Cases of blackmail using deepfake videos are also on the rise. In Singapore, criminals have sent emails threatening to release fake videos depicting government officials, demanding tens of thousands of dollars in crypto payments.
The fakes are built from public photos and videos taken from LinkedIn or YouTube, which are then processed into disturbing deepfake content.
With deepfake software becoming more accessible, this type of fraud is expected to spread to executives around the world.

SOURCE: CNBC INDONESIA