There were hundreds of them, and together they collected tens of thousands of engagements and hundreds of thousands of views. In early November, MIT Technology Review found dozens of duplicate fake Live videos from this time frame still up. Two duplicates, with more than 200,000 and 160,000 views respectively, claimed in Burmese: “I am the only one who broadcasts live from all over the country in real time.” Facebook took several of them down after we brought them to its attention, but dozens more, as well as the pages that posted them, remain. Osborne said the company is aware of the problem and has significantly reduced the number of these fake Live videos and their distribution over the past year.
Ironically, Rio believes, the footage was most likely lifted from videos of the crisis posted to YouTube as human rights evidence. In other words, the scenes really are from Myanmar, but they were all being broadcast from Vietnam and Cambodia.
Over the past six months, Rio has tracked and identified several clusters of pages operating out of Vietnam and Cambodia. Many used fake Live videos to rapidly build their follower counts and to drive viewers into Facebook groups disguised as pro-democracy communities. Rio now worries that Facebook’s recent introduction of in-stream ads in Live videos will give clickbait actors even more incentive to fake them. One Cambodian cluster of 18 pages began posting highly damaging political disinformation, reaching a total of 16 million engagements and an audience of 1.6 million within four months. Facebook took all 18 pages down in March, but new clusters keep springing up while others remain active.
As far as Rio knows, these Vietnamese and Cambodian actors do not speak Burmese. They likely understand neither Burmese culture nor the country’s politics. The point is, they don’t need to. Not when they are stealing their content.
Rio has since found several private Cambodian Facebook and Telegram groups (one with more than 3,000 members) where operators trade tools and tips on the best moneymaking strategies. MIT Technology Review reviewed the documents, images, and videos she collected, and hired a Khmer translator to interpret a video tutorial that walks viewers step by step through a clickbait workflow.
The materials show how Cambodian operators research the best-performing content in each country and plagiarize it for their clickbait websites. One Google Drive folder shared within the community contains about twenty spreadsheets of links to the most popular Facebook groups in 20 countries, including the US, the UK, Australia, India, France, Germany, Mexico, and Brazil.
The video tutorial also shows how they find the most viral YouTube videos in different languages and use an automated tool to convert each one into an article for their site. We found 29 YouTube channels spreading political misinformation about the current situation in Myanmar, for example, whose videos were being turned into clickbait articles and redistributed to new audiences on Facebook.
After we brought the channels to its attention, YouTube terminated them all for violating its community guidelines, including seven that it determined were part of a coordinated influence operation linked to Myanmar. Choi noted that YouTube had also previously stopped serving ads on nearly 2,000 videos across these channels. “We continue to actively monitor our platforms to prevent bad actors looking to abuse our network for profit,” she said.
Then there are other tools, including one that makes prerecorded videos appear as fake Facebook Live videos. Another randomly generates profile details for US men, including picture, name, birthday, Social Security number, phone number, and address, so that yet another tool can mass-produce fake Facebook accounts from that information.
The work has now become so easy that many Cambodian actors operate solo. Rio calls them micro-entrepreneurs. In the most extreme case, she has seen individuals manage as many as 11,000 Facebook accounts on their own.
Successful micro-entrepreneurs are also training others in their community to do this work. “It’s going to get even worse,” she says. “Any Joe in the world could be influencing your information environment without you realizing it.”