The Business & Technology Network
Helping Business Interpret and Use Technology

EU imposes election guidelines on big tech companies

DATE POSTED:March 27, 2024
The European Commission, leveraging the powers granted by the newly implemented Digital Services Act (DSA), has called on digital platforms to intensify their efforts in protecting electoral processes from meddling and to implement targeted actions against the risks posed by generative AI technologies.

Brussels has issued a set of guidelines to major online platforms and search engines, including X, Meta, Google, TikTok, Snapchat, and others, cautioning that non-compliance could trigger the comprehensive enforcement of the DSA, potentially resulting in substantial penalties.

Platforms that do not adhere to these recommendations, or that do not present viable alternatives, might be subject to regulatory actions, risking fines that could amount to as much as 6 percent of their global revenue.

These recommendations were made in the context of the UK government expressing concerns over Chinese attempts to disrupt its democratic institutions, citing instances of cyberattacks on the UK Electoral Commission and lawmakers’ email accounts as actions linked to entities associated with the Chinese government.

Brussels has issued a set of guidelines to major online platforms and search engines, including X, Meta, Google, TikTok, Snapchat, and others (Image credit)

The EU’s directives highlight “best practices” aimed at platforms with over 45 million users. This encompasses the strengthening of internal procedures and the formation of teams dedicated to addressing “local context specific risks.” Platforms must take steps to put in place “elections specific risk mitigation measures,” elevate the visibility of verified, reliable electoral information, and adjust their content suggestion algorithms to both empower users and curb the “monetisation and virality” of election-compromising material.

In the lead-up to elections, platforms are tasked with establishing tailored strategies, including protocols for swift incident response to lessen the effects of last-minute challenges such as the spread of false information. Platforms are also expected to conduct thorough post-election reviews and to collaborate with European and national authorities as well as non-governmental organizations.

Furthermore, the EU is advocating for “specific mitigation measures” in the context of generative AI. This involves the transparent identification of AI-created materials, including deepfakes, and the rigorous application and enhancement of user agreements and policies.

Although referred to as “guidelines,” a commission official emphasized that “election integrity is a key priority for DSA enforcement.” There is substantial concern over the potential impact on elections from various angles, including deepfakes, exploitation of recommendation algorithms, foreign disinformation campaigns, and efforts to incite division within European communities. “This is not trivial,” the official added.

The attention isn’t solely on deepfakes and AI’s implications; the commission also scrutinizes the adequacy of the content moderation and fact-checking resources platforms employ, their local expertise, and the extent of their collaboration with governmental and other entities.

The attention isn’t solely on deepfakes and AI’s implications (Image credit)

Regulators acknowledge the impossibility of completely eliminating contentious content but stress the importance of having strategies to detect and curb its spread. One official highlighted the unique position of the Digital Services Act (DSA) in global election integrity efforts, noting its enforceability and the capacity for oversight on platform compliance, effectiveness of implemented measures, and the possibility of demanding data and assessments from platforms.

“The DSA contains a legally binding obligation to have effective mitigation measures in place,” stated the official. Platforms diverging from these guidelines are expected to provide a “serious explanation” for their chosen actions. Failure to comply could lead to “enforcement actions that can include fines up to 6% of global annual turnover, or daily penalties up to 5% of global annual daily turnover.”
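To make the scale of those penalty ceilings concrete, here is a minimal sketch that applies the two caps quoted above (6% of global annual turnover for a one-off fine, 5% of average global daily turnover for a daily penalty) to a hypothetical company. The turnover figure is an illustrative assumption, not data about any platform named in the article.

```python
def dsa_fine_ceiling(annual_turnover: float) -> float:
    """Maximum one-off fine: 6% of global annual turnover."""
    return 0.06 * annual_turnover

def dsa_daily_penalty_ceiling(annual_turnover: float) -> float:
    """Maximum daily penalty: 5% of average global daily turnover,
    taking daily turnover as annual turnover / 365."""
    return 0.05 * (annual_turnover / 365)

# Hypothetical example: a platform with $100 billion in annual turnover.
turnover = 100e9
print(f"One-off fine cap:  ${dsa_fine_ceiling(turnover):,.0f}")
print(f"Daily penalty cap: ${dsa_daily_penalty_ceiling(turnover):,.0f}")
```

Even under this rough sketch, the one-off cap for such a company would be in the billions of dollars, which is why the guidelines carry real weight despite their non-binding-sounding name.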

In preparation for the European Parliament elections in June, the commission has initiated inquiries with platforms and scheduled a stress test in April. Given the election’s continental scope, there is an anticipation of increased pressure on moderation resources, necessitating specific adjustments by platforms.

Highlighting a case of particular interest, officials revealed an ongoing formal investigation into X (previously known as Twitter) for its DSA non-compliance, notably after the platform reduced its moderation efforts in a bid for free speech absolutism.

Featured image credit: Christian Lue/Unsplash