This article was originally published by Radio Free Asia and is reprinted with permission.
Accounts serving the aims of the Chinese Communist Party are increasingly taking to Facebook to air “inauthentic content,” a new report by an Australian think tank has found.
In an investigation into inauthentic activity on Facebook, the Australian Strategic Policy Institute (ASPI) found that fake accounts are posting content in English and Chinese “that support[s] the political objectives of the Chinese Communist Party (CCP).”
Post topics have included the U.S. government’s decision to ban the Chinese-owned video app TikTok, the George Floyd and Black Lives Matter protests, ongoing tensions in the U.S.–China relationship, and the U.S. government’s response to the coronavirus pandemic, the report found.
“There’s considerable diversity in the accounts and pages linked to this activity set on Facebook,” the report by Elise Thomas, Albert Zhang, and Jake Wallis said.
“Most of the accounts we discovered were created or became active in 2020. Some have been active since January or February this year, while others appeared only in September 2020.”
It said some of the accounts posting about the pandemic have also previously posted on other topics close to the CCP’s heart, including attacks on exiled billionaire Guo Wengui, also known as Miles Kwok.
The report focused in particular on the activity of 33 Facebook accounts, which often displayed usernames combining Chinese characters with seemingly random English letters, and cartoons in place of profile photos.
Their posts typically included short videos uploaded directly to the platform, it said.
“The actors behind information operations play an adversarial game with the social media platforms, evolving their tactics to find new vulnerabilities,” the report said, adding that operators often switched tactics to avoid being identified as inauthentic by the platforms they use.
It said previous analysis suggested that individual operators may be working independently, leading to greater diversity in how the content is presented online.
Accounts possibly coordinated
The report said that it couldn’t conclusively determine that the 33 accounts were backed by China, but that they showed signs of being inauthentic, coordinated, and possibly automated.
“The cross-platform activity we analyze presents narratives that support the political objectives of the CCP and involve a significant investment in time and resources over a period of months, implying a well-resourced, persistent and patient actor,” it said.
It cited the removal of more than 2,000 YouTube channels since April by Google following a Threat Analysis Group investigation into coordinated influence operations linked to China.
The report said one indication of inauthentic behavior is repeated engagement with posts by the same set of accounts.
“The Facebook accounts and pages identified in this report also displayed highly coordinated and possibly automated behavior to create inauthentic engagement,” it said, adding that, on Aug. 24, more than 30 unique accounts liked, commented on, and shared the same five posts from pages titled ‘Together to fight the epidemic’, ‘Fight-the-epidemic’, ‘Health First’, ‘Peaceful coexistence of the world’ and ‘China will win.’
“All of these interactions are from the same group of appropriated or newly created accounts,” it said, adding that having one or no friends or connections was also common among the suspicious accounts, another potential marker of inauthenticity.