Beijing tested AI-generated content in Taiwan in a bid to sway voters against a pro-sovereignty candidate
Microsoft has warned that China intends to disrupt elections in the US, South Korea, and India this year using AI-generated content, following a trial run during Taiwan’s presidential election. The US tech giant said it expects Chinese state-backed cyber groups, along with North Korea, to target prominent elections in 2024, according to a report by the company’s threat intelligence team released on Friday.
“As populations in India, South Korea, and the United States head to the polls, we are likely to see Chinese cyber and influence actors, and to some extent North Korean cyber actors, work toward targeting these elections,” the report stated.
Microsoft stated that China would, at the very least, produce and circulate AI-generated content via social media to “advance their positions in these prominent elections.” The company cautioned that although the impact of AI-generated content has been minimal so far, this could change.
“While the current influence of such content on audiences remains limited, China’s increasing exploration of enhancing memes, videos, and audio will persist—and could become more impactful in the future,” Microsoft explained.
In its report, Microsoft revealed that China had previously conducted an AI-generated disinformation campaign during the January presidential election in Taiwan. This marked the first instance of a state-backed entity employing AI-generated content to influence a foreign election, according to the company.
During the Taiwanese election, a Beijing-backed group known as Storm-1376, Spamouflage, or Dragonbridge was highly active. Its efforts to influence the election included posting a fake audio clip on YouTube of the candidate Terry Gou, who had withdrawn from the race in November, endorsing another candidate. Microsoft said the clip was “likely AI-generated.” YouTube removed the content before it reached a wide audience.
The Beijing-backed group also circulated a series of AI-generated memes targeting the ultimately successful candidate, William Lai, who supported Taiwan’s sovereignty and was opposed by Beijing. These memes made unfounded allegations that Lai had embezzled state funds. Additionally, there was an increase in the use of AI-generated TV news anchors, a tactic also employed by Iran. These “anchors” made unsubstantiated claims about Lai’s personal life, including allegations of him fathering illegitimate children.
Microsoft noted that the AI-generated news anchors were produced using the CapCut tool, developed by the Chinese company ByteDance, which also owns TikTok.
Microsoft also highlighted ongoing influence campaigns by Chinese groups in the US. It said Beijing-backed actors are using social media accounts to pose “divisive questions” in an effort to gauge the issues dividing US voters.
“This could be a strategy to collect intelligence and gain insight into key voting demographics ahead of the US Presidential election,” Microsoft stated in a blog post accompanying the report.
One post on X, formerly Twitter, mentioned a $118 billion bipartisan bill in the US, which included a $20 billion investment in the US-Mexico border and a $75 billion package for Ukraine and Israel, and asked, “What’s your reaction?” Another post highlighted the loss of an F-35 fighter in South Carolina last year, claiming that “only under the Biden administration” could such a valuable piece of military hardware be lost, although debris from the jet was in fact found shortly after the incident. That post asked, “What do you think about this?”
The report was released during the same week that a White House-appointed official review board stated that “a cascade of errors” by Microsoft allowed state-backed Chinese cyber operators to breach the email accounts of senior US officials. Last month, the US and UK governments accused China-backed hackers of conducting a years-long cyber campaign targeting politicians, journalists, businesses, and the UK’s election watchdog.