Opaque Moderation Rules of Tech Giants Impact Half a Million Hungarians

A recent study commissioned by Hungary's National Media and Infocommunications Authority reveals that Facebook's opaque moderation policies may have restricted up to half a million Hungarian users. The research highlights the lack of transparency in tech giants' content rules.

A new study commissioned by the National Media and Infocommunications Authority (NMHH) has shed light on the opaque content moderation practices of major online platforms, including Facebook and YouTube. The findings, shared by the NMHH’s communications department, reveal that up to half a million Hungarians may have been affected by Facebook’s restrictive measures, underscoring the growing impact of platform policies on digital freedom.

The study, authored by Zsolt Ződi, examined the content moderation practices of Facebook and YouTube, focusing on how these platforms restrict users. According to the report, tech giants make millions of moderation decisions monthly, relying heavily on artificial intelligence. Often, these automated decisions are not reviewed by human moderators, leaving users without meaningful explanations for restrictions.

Under the EU’s Digital Services Act (DSA), online platforms are now required to report their moderation practices. These reports have revealed that platforms make hundreds of thousands of restrictive decisions daily, the majority of which are executed with the help of algorithms. However, the study notes that users rarely receive detailed justifications for these decisions, leaving moderation processes largely opaque.

‘Approximately 15 per cent of Hungarians—an estimated 500,000 people—have experienced content restrictions, deletions, or account suspensions’

Users can formally appeal moderation actions, but these appeals are often handled by artificial intelligence as well. As a result, overturned decisions and restored content have become increasingly rare. The report notes that this trend, combined with disorganized and often illogical platform rules, makes these systems even harder for users to navigate.

For instance, platforms frequently use vague language in their policies. YouTube’s terms of service, for example, allow for the removal of not only illegal or rule-violating content but also any material deemed potentially harmful to the platform itself.

The study specifically explored how these moderation practices affect Hungarian users. While platforms do not provide country-specific data, a survey by the National University of Public Service found that approximately 15 per cent of Hungarians—an estimated 500,000 people—have experienced content restrictions, deletions, or account suspensions. This is five percentage points higher than in previous years.

Among affected users, half reported facing multiple restrictions, and a quarter had their accounts suspended. A third of affected users requested the reversal of restrictive actions, but only one in ten of these appeals succeeded in 2024, a significant decline from 2020, when platforms restored about one-fifth of restricted content. The study attributes this shift to the growing reliance on artificial intelligence for handling complaints.

The study stresses the need for platforms to provide users with meaningful explanations for moderation decisions. It also calls for greater access to human customer support, suggesting that platforms should hire significantly more staff to handle user complaints.

In a digital world dominated by a few tech giants, transparency and accountability in content moderation have become crucial. As the study highlights, ensuring users can challenge restrictions effectively and access human oversight is essential to safeguarding digital rights in the ever-evolving online landscape.

