USA: Facebook and TikTok Allow Advertisements With Incorrect Information

App icons from TikTok, YouTube and Facebook on a smartphone
Only YouTube rejected all submitted ads. However, the researchers say the video platform still needs to improve internationally. (Source: IMAGO / CTK Photo)

The social networks TikTok and Facebook placed advertisements in the USA containing incorrect and misleading information about the upcoming elections. This is according to a joint study by the non-governmental organisation Global Witness and the “Cybersecurity for Democracy” (C4D) research project at New York University. According to the study, only the video platform YouTube rejected all submitted ads.

On 8 November, midterm elections will be held for the U.S. House of Representatives and the U.S. Senate. Ahead of the vote, the researchers investigated whether the three platforms TikTok, Facebook and YouTube would detect disinformation about the midterm elections. As a test, they submitted paid ads: ten advertisements per platform, in both English and Spanish.

As the study shows, half of the ads contained incorrect information, such as the wrong election date. Other ads aimed to discredit the electoral process, for example by claiming that the results had already been determined. The examples drew in part on disinformation themes identified by the U.S. Federal Election Commission, and all of them would violate the platforms’ terms.

The ads were to be shown to people in states such as Arizona, Colorado and Georgia, where observers expect particularly close races. For their test, the researchers chose ads rather than ordinary social media posts, since approved ads can be deleted again before they are actually shown to any users.

TikTok approved almost all incorrect information

In the test, TikTok approved 90 percent of the submitted ads and thus performed worst. Only one ad in English and one in Spanish were rejected. Nor did the platform block the user account maintained by the researchers.

At Facebook, it was possible in the test to submit ads without a verified user account – which by itself violates the platform’s rules for election ads. The company also prohibits political advertisements in the USA that are submitted from abroad, yet according to the researchers this was possible as well. They used a total of three accounts, only one of which was closed by Facebook.

Facebook allowed three ads in English and two in Spanish that were submitted from abroad. In the subsequent test from within the USA, the platform allowed two ads in English and five in Spanish. The decisions were not consistent: an ad with an incorrect election date, for example, was allowed in both English-language tests but rejected in Spanish.

YouTube also blocks accounts

Only the video platform YouTube, which belongs to Google, rejected all submitted ads. The platform also blocked the user account maintained by the researchers.

Using YouTube as an example, the experiment shows that platforms can enforce their policies in the USA. Internationally, however, the researchers see a different picture: in a similar investigation in Brazil, the video platform approved all ads containing disinformation. Facebook, too, allowed all of the incorrect information in a study in Brazil (German-language article). According to Global Witness, there is a large difference on the platform between moderation in the U.S. and in other countries.

A spokesperson for Meta told Global Witness and C4D that the research was based on a small sample of ads, and that the process for reviewing ads consists of multiple steps, both before and after an ad is published.

TikTok told the researchers that it is a platform for entertaining content, which is why it does not allow paid political ads. Feedback from NGOs and other experts helps the company improve its processes and policies.

Platforms should improve

According to the researchers, YouTube, Facebook and TikTok are among the most used platforms in the USA. Jon Lloyd from Global Witness stated: “For years we have seen key democratic processes undermined by disinformation, lies and hate being spread on social media platforms – the companies themselves even claim to recognise the problem. But this research shows they are still simply not doing enough to stop threats to democracy surfacing on their platforms.”

And Damon McCoy, co-director of the research project at New York University, criticised: “So much of the public conversation about elections happens now on Facebook, YouTube, and TikTok. Disinformation has a major impact on our elections, core to our democratic system. YouTube’s performance in our experiment demonstrates that detecting damaging election disinformation isn’t impossible. But all the platforms we studied should have gotten an ‘A’ on this assignment. We call on Facebook and TikTok to do better: stop bad information about elections before it gets to voters.”

Disinformation has increased significantly in the USA since the 2020 presidential election, Global Witness and C4D report. The researchers therefore call on Meta and TikTok to expand their moderation resources ahead of the upcoming election. The platforms should also allow an independent review of their policies and publish a risk assessment before the midterms. The researchers appealed to YouTube to improve internationally in order to detect and remove disinformation. (js)