16.11.2021

Content, Media and Literacy Across the Globe

This blog post was authored by Sofie Schönborn, Gustavo Souza, Stella Teoh, Tebogo Kopane, and reviewed by George Chen and Brian Nkala

We discussed Content, Media, and Literacy with policymakers at the third YouthxPolicyMakers roundtable, and we decided to use our diverse backgrounds to discuss our thoughts and ideas as youth in this blog post.

The internet has created new opportunities for communication, allowing a younger generation to express their concerns and engage in political debates. We believe that the internet has the potential to galvanise youth voices, unite global action, and encourage groups such as climate change activists. However, in addition to the unprecedented opportunities offered by a global internet, there are challenges for current and future generations as a result of social and political tensions around internet issues.

The need for content moderation emerges in a world of complex governance and distributed power, where locally constituted legislation meets worldwide platforms in a struggle to define what can be kept online (e.g., Hong Kong's National Security Law, Australia vs. Facebook and Google). This mostly involves governments and social media platforms, but even credit card firms are becoming regulators of what is allowed on the web.

Individual freedom of expression is compromised by unlawful actions by state or non-state entities to suppress certain content. Furthermore, cultural differences make content and media moderation a sensitive and multifaceted topic: what is sensitive or unlawful in one country or culture may be permissible in another. Thus, content must be considered in context. In light of this, we investigated, discussed, and reflected on the current status of debates in our countries, highlighted key insights from discussions with officials, and identified the resulting topics that we believe are most pressing to our generation right now.

A general view from our countries

Based on our conversations among youth around the world, from Africa, the Americas, Asia, and Europe, we noticed many distinctions in local debates regarding Content, Media and Literacy. Content moderation seems to be especially relevant these days. In the following sections, we provide some examples of the different debates taking place in our home countries. We see digital literacy initiatives across the globe as an important priority to enable generations young and old to be safe on the internet and enjoy the rights granted to them by international conventions and national laws. Following this, we shed light on our experience at the roundtable with policymakers from governments and industry representatives, and what we have taken away from it.

Intricacies in the African continent

Content regulation policies are being widely debated, particularly in light of media misinformation. The discussion is largely focused on the need to regulate without infringing on users' basic human rights and freedoms. Policies governing content regulation on the African continent should be guided by human rights principles such as legality, legitimacy, necessity, and proportionality. The rise of misinformation and fake news has prompted many African governments to review and amend their existing regulatory laws. Countries such as Kenya, Tanzania, and Nigeria have rushed to amend their laws, resulting in flawed provisions that impede free expression and privacy, and stifle the press and dissenting voices. One thing many of these digital laws have in common is their ambiguity. For example, in Ethiopia's Hate Speech and Disinformation Prevention and Suppression Proclamation (2020), the term "hate speech" is so ambiguous that many social media users have been accused of breaking the law and arrested as a result.

When it comes to content and media regulation, the African continent is a very interesting case because of its vast and diverse cultures, ways of life, norms, and historical complexity. This complexity creates a significant challenge for implementing content and media regulation, because context is not well represented. As a result, many African countries have been criticised for the manner in which digital policies and content regulation have been implemented. Despite the shift toward more democratic rule, it appears that many governments are using the ambiguity of digital policies to undermine human rights locally.

The continent's precarity sparks a larger debate about online platforms, specifically how they can exist in the ever-changing landscape of content and media regulation in Africa. Social media platforms have played a significant role in exposing injustice and focusing attention on social issues. For example, the #EndSARS movement in Nigeria arose from a desire to put an end to police brutality, specifically by disbanding the Special Anti-Robbery Squad (SARS). Through the use of Twitter and the hashtag #EndSARS, the movement was propelled forward and gained global attention. Because of the intense scrutiny and pressure on the government, as well as calls for Twitter CEO Jack Dorsey to deactivate President Buhari's Twitter account, the president decided to ban Twitter in the country indefinitely. This decision was unconstitutional and violated the country's most fundamental human rights. Furthermore, it hampered business operations, resulting in income loss for many. South Africa, by contrast, exemplifies how content regulation can exist without infringing on citizens' human rights.

In South Africa, the #FeesMustFall campaign grew into a nationwide student-led protest against higher education fee increases. Social media played a significant and positive role in disseminating information and involving various actors, including unions, political parties, and the universities themselves. The existence of this movement, as well as the policies that allow it to exist, contributes significantly to improving the nation's socioeconomic standing and upholding the country's overall democracy.

Challenges in ASEAN

According to the World Economic Forum, ASEAN is the world's fastest growing internet market, with the potential to join the world's leading digital economies by 2025. However, taken as a single community, and with the exception of Singapore, ASEAN member states seldom perform well in digital indices.

A key challenge that ASEAN faces is digital literacy, as seen in the rise of digital fraud and cybersecurity vulnerabilities. The varying levels of economic development across ASEAN make it difficult to adopt a one-size-fits-all approach to raising literacy levels in the region. Critical digital literacy is a vital element for individuals to participate in the governance of their countries. Across Southeast Asia, the web has been used to revamp media landscapes and organise movements, which has prompted governments to crack down harder through regulation. With more users connected via social media, citizens have shifted their trust from mainstream media to social media. However, without the necessary literacy skills (such as evaluation and navigation), new and existing users are even more vulnerable to misinformation and disinformation. Hence, the vision of digital citizenship within the ASEAN Community should be realised on a strong foundation of digital literacy in the ASEAN population.

Brazil's perspective on the Internet Bill of Rights

Even though Brazil is recognised as a leading country in internet regulation based on a multistakeholder approach, there is still more work to be done. Some important principles have emerged from Brazil's Internet Bill of Rights, including (1) the liability principle for online intermediaries and (2) the net neutrality principle. The first relates to a common complaint from internet corporations: they are concerned about being held legally liable for user-generated content, but the Brazilian approach promotes a legal system in which firms are held liable only if they fail to act after receiving a judicial order to do so. The net neutrality principle, on the other hand, forbids internet service providers, who are in charge of the infrastructure, from blocking content. This is essentially a competition issue, because internet providers are typically cable TV and telephone companies; they could make disproportionate economic profits by preventing content from being viewed over their internet infrastructure.

Recently, the Internet Bill of Rights was amended by the president through an executive order. Although executive orders are part of Brazil's constitutional framework, this amendment was clearly unconstitutional and concerning, especially because of its timing. When analysing state action, we often relate it to content removal and censorship, but a multistakeholder internet governance ecosystem can create some interesting situations. The decision to amend the Internet Bill of Rights came just before a protest supporting the government, so that content shared on social media could not be removed by platforms without a legal order. The amendment was later rejected by Congress, but it leaves behind a concerning legacy.

The Network Enforcement Act in Germany

In Germany, new legislation for content moderation has been under scrutiny over the last few years. The Network Enforcement Act (2017, also known as NetzDG) introduced extensive compliance rules for commercial social network providers with more than 2 million users, with fines of up to five million euros, regarding the handling of user complaints about hate crime and other criminal content online. The networks' responsibilities include quarterly reporting, setting up a transparent complaints management system, and designating an authorised representative in Germany. More precisely, the act requires them to examine complaints without delay, delete "obviously illegal" content within 24 hours, and delete or block access to any other illegal content within seven days of examination. Complainants and users must be informed of the decisions taken without delay.

Some actors welcome the act as a way to force large social networks to take on social responsibility, to appoint a responsible contact person in Germany, and to make decisions in accordance with German law. However, others are concerned about increasing restrictions on freedom of expression. The United Nations Special Rapporteur on Freedom of Expression, David Kaye, sharply criticised the planned regulations in a statement to the German government, saying they would overshoot the mark by far, impose too great a responsibility on platform operators, and be out of line with international human rights conventions. Some are also concerned that the law could serve as a template for authoritarian regimes to restrict freedom of expression.

Experts expected that the rigid deletion deadlines and the high threat of fines would lead networks to prefer to remove posts in case of doubt, even where freedom of expression would require a contextual weighing, for example in distinguishing between prohibited insults and permitted satire. So, after four years, have we seen the act lead to the end of anonymity on the internet, as some net activists feared? Since 2017, more than 1,000 content moderators have worked for Facebook in Germany. According to the first published reports from Facebook, YouTube, and Google, most user complaints are rejected. Interestingly, however, it seems that the act has led operators to apply their own community standards rather than the criminal law, removing even legal content when in doubt.

Highlights from our roundtable

The third YouthxPolicyMakers roundtable led us to reflect on the responsibilities and accountabilities of the stakeholders involved in content and media governance, as well as in strengthening digital literacy. Who should be responsible for ensuring rights-preserving policies, countering misinformation campaigns by individuals or governments, and providing literacy campaigns?

We want to draw policymakers' attention to the relevance of rights-based content moderation, media, and literacy initiatives for young and future generations. We need to bring every interested actor together and create transparent multi-stakeholder processes at regional, national, and international levels to share best practices and ideas across cultural spheres and contexts. Moreover, we want to explore possibilities for international conventions on basic digital rights.

What seems to be important

In our discussions over the last weeks, four issues have crystallised as dominant.

Firstly, context is important for content moderation. Although the internet has brought us very close to each other, there are still cultural differences that make a uniform, one-size-fits-all solution unfeasible. Hence, we need context-specific regulation that equally respects and upholds the rights, rules, and regulations of the community in question.

Secondly, most users aren't aware of how content is recommended or regulated. Hence, platforms must make their decision-making transparent and communicate policies and regulations to users in a more lucid and accessible manner.

Thirdly, multi-stakeholder engagement is an important path to creating shared responsibilities. Within the IGF, a Dynamic Coalition should be established to foster this exchange and include civil society and academia in the discussion.

Finally, digital literacy initiatives are needed to enable digital citizens to participate in the digital sphere and voice arising concerns over regulation. It is only by addressing the problem of digital literacy that we can avoid building castles in the air when tackling problems related to content, media, and the bigger picture of citizenship in an online environment. Allowing everyday users to develop the skills necessary for life in the new normal empowers them to participate in multi-stakeholder discussions.

A survey among the roundtable attendees showed a disfavour of regulation by either states or platforms. In this context, we would like to highlight the importance of "digital citizenship". Digital citizenship involves more than just being able to adeptly navigate an online or virtual environment; in the process of working towards equity and change, such citizenship encourages one "to confront complex ideas about the enactment of identities and dialogue online" together. This form of participatory digital citizenship requires a solid foundation in digital literacy, specifically critical digital literacy, where individuals use their abilities to "civically participate" in the online world. Instilling a sense of digital citizenship will encourage youths to think of future ways in which they can ethically engage to build a more equitable world, online and offline.
