Social media giants Meta and X approved ads targeting users in Germany with violent anti-Muslim and anti-Jewish hate speech in the run-up to the country's federal election, according to new research from Eko, a corporate responsibility non-profit campaign group.
The group's researchers tested whether the two platforms' ad review systems would approve or reject submissions containing hateful and violent messaging targeting minorities ahead of an election in which immigration has taken center stage in mainstream political discourse. The test ads included calls for immigrants to be locked up in concentration camps or gassed, and AI-generated images of mosques and synagogues being burned.
Most of the test ads were approved within hours of being submitted for review in mid-February. Germany's federal election takes place on Sunday, February 23.
Eko said X approved all 10 of the hate speech ads its researchers submitted just days before the federal election, while Meta approved half (five ads) to run on Facebook (and potentially also Instagram), though it rejected the other five.
The reason Meta gave for the five rejections indicated that the platform believed there could be risks of political or social sensitivity that might influence the vote.
However, the five ads Meta approved included violent hate speech likening Muslim refugees to a "virus," "vermin," or "rodents," branding Muslim immigrants as "rapists," and calling for them to be sterilized, burned, or gassed. Meta also approved an ad calling for synagogues to be set on fire to "stop the globalist Jewish rat agenda."
As an aside, Eko says that none of the AI-generated images used in the ads were labeled as artificially generated, yet Meta still approved half of the 10 ads, even though the company has a policy requiring disclosure of the use of AI imagery in ads about social issues, elections, or politics.
X, meanwhile, approved all five of these hateful ads, along with a further five containing similarly violent hate speech targeting Muslims and Jews.
These additionally approved ads included messaging attacking immigrants as "rodents," as well as ad copy making claims about economic power.
The latter ad was paired with AI-generated imagery depicting a group of shadowy men sitting around a table surrounded by stacks of gold bars, with a Star of David on the wall above them, visuals that also lean heavily into antisemitic tropes.
Another ad approved by X contained a direct attack on the SPD, the center-left party, seeking to provoke a violent response. X also duly scheduled an ad claiming that "leftists" want "open borders" and calling for the extermination of Muslim "rapists."
Elon Musk, the owner of X, has used the social media platform, where he has almost 220 million followers, to personally intervene in the German election. In a tweet in December, he urged German voters to back the far-right AfD party to "save Germany." He has also hosted a livestream with the AfD's leader, Alice Weidel, on X.
Eko's researchers disabled all the test ads before any that had been approved could run, ensuring that no platform users were exposed to the violent hate speech.
The group says its tests highlight glaring flaws in the ad platforms' approach to content moderation. Indeed, in the case of X, it is not clear whether the platform is moderating ads at all, given that all 10 of the violent hate speech ads were quickly approved.
The findings also suggest that the ad platforms could be earning revenue from the distribution of violent hate speech.
Eko's tests indicate that neither platform is properly enforcing its ban on hate speech. Moreover, in Meta's case, Eko reached the same conclusion after running a similar test in 2023, before the EU's new online governance rules came into application, suggesting the regime has had no effect on how the company operates.
"Our findings suggest that Meta's AI-driven ad moderation systems remain fundamentally broken, despite the Digital Services Act (DSA) now being in full effect," an Eko spokesperson told TechCrunch.
"Rather than strengthening its ad review process or hate speech policies, Meta appears to be backtracking across the board," they added, pointing to the company's recent announcement about rolling back its moderation and fact-checking policies as a sign of "active regression" that, they suggested, puts it on a direct collision course with the DSA's rules on systemic risks.
Eko has submitted its latest findings to the European Commission, which oversees enforcement of key aspects of the DSA on the two social media giants. It also said it shared the results with both companies, but neither responded.
The EU has open DSA investigations into both Meta and X, which include concerns about election security and illegal content, but the Commission has yet to conclude these proceedings. Though, back in April, it said it suspects Meta of inadequate moderation of political ads.
A preliminary decision on a portion of its DSA investigation into X, announced in July, included suspicions that the platform is failing to comply with the regulation's ad transparency rules. However, the full investigation, which kicked off in December 2023, also covers illegal content risks, and the EU has yet to reach any conclusions on the bulk of the probe well over a year later.
Confirmed breaches of the DSA can attract penalties of up to 6% of global annual turnover, while systemic non-compliance could even lead to regional access to violating platforms being temporarily blocked.
For now, though, the EU is still taking its time to make up its mind about the Meta and X probes, so, pending final decisions, any DSA sanctions remain up in the air.
Meanwhile, it is now just a matter of hours before German voters go to the polls, and a growing body of civil society research suggests that the EU's flagship online governance regulation has failed to shield the democratic process of the bloc's largest economy from a range of tech-fueled threats.
Earlier this week, Global Witness released the results of tests of X's and TikTok's algorithmic "For You" feeds in Germany, which suggest the platforms are biased toward promoting AfD content over content from other political parties. Civil society researchers have also accused X of blocking their access to data that the DSA is supposed to enable, preventing them from studying election security risks in the run-up to the German poll.
"The European Commission has taken important steps by opening DSA investigations into both Meta and X. Now the Commission must take strong action to address the concerns raised as part of these investigations," the Eko spokesperson also said.
"Our findings, alongside mounting evidence from other civil society groups, show that Big Tech will not clean up its platforms voluntarily. Meta and X continue to allow illegal hate speech, incitement to violence, and election disinformation, despite their legal obligations under the DSA," the spokesperson added. (We have withheld the spokesperson's name to prevent harassment.)
"Regulators must take strong action, both in enforcing the DSA and in implementing pre-election measures. This could include turning off profiling-based recommender systems immediately before elections, and implementing other appropriate 'break glass' measures to prevent the algorithmic amplification of borderline content, such as hateful content, in the run-up to elections."
The campaign group also warns that the EU is now facing pressure from the Trump administration to soften its approach to regulating Big Tech. "In the current political climate, there's a real danger that the Commission won't fully enforce these new laws as a concession to the U.S.," it suggests.