
‘This is fake’ — How North Korea uses AI and deepfakes as a weapon

The Army's artificial intelligence software prototype, designed to quickly identify threats using a range of battlefield data and satellite imagery. (Screenshot image/U.S. Army)
October 06, 2025

This article was originally published by Radio Free Asia and is reprinted with permission.

RFA Perspectives — Deepfakes and AI-generated videos can be made with tools anyone can download. North Korean hackers are already using those same tools as weapons.

Recently, South Korea’s cybersecurity firm Genians revealed that a North Korean hacking group used AI-generated deepfake military IDs to impersonate defense agencies and launch phishing attacks.

Their targets? Officials, journalists, human-rights activists, and researchers.

This isn’t new.

North Korean IT workers have long used AI and deepfakes to build fake identities—sometimes even stealing U.S. identities to apply for jobs.

They appear in video interviews with AI-generated faces and voices.

Cybersecurity expert Dawid Moczadło, co-founder of Vidoc, shared a video on LinkedIn that experts believe shows these workers in action.

At first glance it looks real, but watch closely and something feels off.

If these workers get hired, they don’t just collect a paycheck.

They can plant malware, steal company data, and funnel money back to North Korea’s weapons programs—helping the regime dodge sanctions.

AI can make life easier for everyone.

But in North Korea’s hands, it becomes a weapon—one that threatens your personal data, private companies, and even national security.