AI made crypto scams far more dangerous

The first half of 2025 saw one of the worst waves of crypto hacks to date, with more than $3.01 billion stolen. AI played a major role, making scams easier to run and lowering the barrier of entry for low-skill criminals.

In the U.S. alone, nearly 160,000 crypto-related fraud complaints were reported in 2024.

“The adversaries themselves aren’t fundamentally different between traditional finance and the crypto industry, but certain of the tactics they employ are distinct and the sophistication of attackers in the crypto space is notably higher,” said Norah Beers, CISO at Grayscale.

AI is fueling a new wave of attacks

Attackers use AI to analyze data from social media, online forums, and blockchain transactions. By correlating data across these sources, they can detect patterns and choose potential victims for phishing or impersonation campaigns. Fake websites, social media accounts, and videos have become so realistic that it is difficult to determine their authenticity.

Some criminals use fake trading bots to display false profits or give misleading signals, encouraging users to deposit funds or follow bad financial advice. Entire trading platforms or apps can be built around fake algorithms that promise high returns but steal deposited crypto. Scammers can also bypass KYC checks using AI-generated images or forged credentials.

Bots appear in crypto communities on Discord and Telegram, impersonating moderators or project admins to trick users into sharing wallet details or clicking malicious links. Others mimic support agents in live chats to steal login credentials or recovery phrases.

Long-term scams, known as pig butchering, involve building trust over weeks or months before convincing victims to invest large sums in fake platforms.

Researchers have found cybercriminals selling deepfake tools and services on forums, social media, and messaging platforms. These tools let users create fake audio, video, and images, including face swaps and deepfake videos. Prices vary: face-swapping services like Swapface range from free tiers to $249 per month, while custom deepfake videos usually cost $60 to $500, depending on complexity and quality.

A single attacker can now use AI to create and manage thousands of phishing messages, fake support agents, or investment bots.

Deepfake crypto scams target TikTok and YouTube users

These scams often operate under a mask of legitimacy, using deepfake videos of well-known figures like Elon Musk, MrBeast, or Donald Trump to lure users into fraudulent cryptocurrency schemes. They appear mainly on TikTok and YouTube, which makes sense since both platforms have billions of active users.

The National Cyber Security Centre (NCSC) reported an AI-assisted crypto scam on YouTube. The channel featured a likely AI-generated crypto expert and gained over 100,000 followers in a single day. Its videos instructed viewers to run code that supposedly activated developer mode in TradingView (a charting and trading platform), but instead installed malware that stole passwords, email access, and crypto wallet contents.

New York authorities recently froze $300,000 in stolen cryptocurrency and shut down more than 100 scam websites tied to a Vietnam-based group that targeted Russian-speaking residents in Brooklyn with fake Facebook investment ads.

The consequences of these scams are not only financial. Over time, people lose trust and begin to question the security of crypto exchanges. Investors will think twice if they cannot distinguish legitimate communications from sophisticated deepfake impersonations.

How crypto firms can avoid scams

  • Use multiple layers of security like firewalls, DDoS protection, and detection systems that can spot threats early.
  • Follow KYC and AML rules to reduce fraud.
  • Monitor transactions in real time to catch unusual activity such as large withdrawals or suspicious deposits (see the sketch after this list).
  • Keep your systems updated to fix vulnerabilities, and check all third-party vendors and smart contracts.
  • Teach your users how to stay safe so they are less likely to fall for scams.
  • Back up your data regularly, make sure it is encrypted, and practice your recovery plans often so you can restore data if something goes wrong.
  • Talk openly with your users about security concerns. A well-informed community can help protect itself.
  • Test your systems regularly by doing penetration tests and ethical hacking to find weak spots.
  • Train all employees on the latest security risks and safe practices.
  • Work with other exchanges to share information about threats and bad actors, so everyone can improve their defenses together.
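
To illustrate the real-time monitoring point above, here is a minimal sketch of per-account anomaly flagging. The `Withdrawal` record, thresholds, and in-memory history are illustrative assumptions; a production system would consume events from the exchange's own ledger or event bus and route alerts into its fraud workflow.

```python
from collections import defaultdict, deque
from dataclasses import dataclass
from statistics import mean, stdev

# Hypothetical withdrawal event; real systems would consume these from
# an exchange ledger or event bus rather than constructing them by hand.
@dataclass
class Withdrawal:
    account_id: str
    amount: float  # in a single reference currency, e.g. USD

class WithdrawalMonitor:
    """Flags withdrawals that are unusually large for an account.

    Keeps a short rolling history per account and raises an alert when a
    new withdrawal exceeds an absolute cap or deviates strongly from the
    account's recent behaviour (simple z-score heuristic).
    """

    def __init__(self, history_size: int = 50,
                 absolute_cap: float = 100_000.0,
                 z_threshold: float = 4.0):
        self.history = defaultdict(lambda: deque(maxlen=history_size))
        self.absolute_cap = absolute_cap
        self.z_threshold = z_threshold

    def check(self, event: Withdrawal) -> bool:
        """Return True if the withdrawal looks suspicious."""
        past = self.history[event.account_id]
        suspicious = event.amount >= self.absolute_cap

        # Only compute a z-score once there is enough history to be meaningful.
        if len(past) >= 10:
            mu, sigma = mean(past), stdev(past)
            if sigma > 0 and (event.amount - mu) / sigma >= self.z_threshold:
                suspicious = True

        past.append(event.amount)  # record the event either way
        return suspicious

# Example usage with synthetic data: a sudden 50,000 withdrawal after a
# history of ~200-sized withdrawals triggers an alert.
if __name__ == "__main__":
    monitor = WithdrawalMonitor()
    for amount in [200, 250, 180, 220, 210, 190, 230, 205, 215, 225, 50_000]:
        event = Withdrawal(account_id="acct-42", amount=float(amount))
        if monitor.check(event):
            print(f"ALERT: unusual withdrawal of {event.amount} on {event.account_id}")
```

In practice, exchanges layer rules like this with velocity checks, device and IP signals, and blockchain analytics, but even a simple per-account baseline catches the sudden large withdrawals described above.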

