Five global views on regulating social media
Over the past couple of years, the debate over who – if anyone – should regulate social media, how, and why has become increasingly heated, breaking into a full sprint in the second half of 2020. Antitrust hearings, privacy concerns, and data protection laws have multiplied, expedited by the massive shift to digital caused by the coronavirus pandemic.
Here is a summary of five different views from around the globe on the need, the case, and the means to regulate social media platforms, ranging from self-regulation to international laws and meta-regulation.
1. Grievance officers to hold social media companies more accountable
The Indian government’s announcement earlier this week of new, more stringent rules to regulate social media in the country brings the topic of social media regulation back to the centre of public discussion. This new self-regulatory framework aims to hold social media companies “more responsible and more accountable” for content on their platforms. The guidelines establish a “grievance redressal mechanism” for users, including official “grievance officers” who must acknowledge complaints within 24 hours and resolve them within 15 days. Services must also remove nudity and sexually explicit content within 24 hours of a user flagging it. Additionally, companies such as Facebook or Twitter must appoint India-based officials who work with law enforcement and publish a monthly report on their moderation activity. It’s worth recalling that the major social networks have struggled to navigate issues such as hate speech and political conflict in the country.
2. Country-specific permission to operate
Meanwhile, in Mexico, President Andres Manuel Lopez Obrador has been very vocal about the role of social media platforms since Twitter banned Donald Trump in January. The Mexican president called the ban an affront to freedom of expression and an act of censorship that set a dangerous precedent. “The Statue of Liberty in New York is turning green with rage,” he said. A month later, he moved past words and had his party present a draft bill to curb the power of social media platforms. Many have already criticised it, arguing that it is too bureaucratic, that it borders on state censorship, and that it may even violate international treaties.
3. Human moderators to oversee automated content review
For their part, social media companies have argued that their policies are difficult to enforce. It can be tricky at times to distinguish hate speech from satire or commentary, for example, note Anshu Siripurapu and William Merrow of the Council on Foreign Relations. They point out that these platforms generally comply with the laws of the countries where they operate, which can restrict speech even further. In addition to using moderation software powered by artificial intelligence, Facebook, Twitter, and YouTube employ thousands of people to screen posts for violations.
4. Put control back in the hands of governments
In France, President Macron has become increasingly supportive of returning control over social media platforms to government officials. He has called for international regulation in an attempt to stem the threats to democracy made apparent in the aftermath of the January 2021 assault on the U.S. Capitol, which he said represented the West’s failure to rein in social media platforms, allowing them to become incubators of hate, moral relativism and conspiracy theories. Speaking at an Atlantic Council forum, Macron said the decisions by platforms like Twitter and Facebook to ban Trump may have seemed sensible in the short term, but did not provide a “democratic answer.” “I don’t want to live in a democracy where the key decisions… is decided by a private player, a private social network. I want it to be decided by a law voted by your representative, or by regulation, governance, democratically discussed and approved by democratic leaders,” he added. “The new violence in our democracies, largely linked to these social networks” is now “our new way of life,” Macron said.
Macron thus advocates frameworks such as Germany’s, where regulations go beyond corporate accountability and place guardrails on the kind of speech allowed on these platforms.
5. ‘Meta-regulation’ of content regulation
A recent report by the Forum for Information and Democracy proposes moving from content regulation to meta-regulation (regulation of the corporate actors that dictate the moderation rules). The organisation, established to make non-binding recommendations to 38 countries, explains that this could be done by developing a set of principles that platforms will have to accept, in accordance with international standards of freedom of opinion and expression. Other recommendations made by the group include:
- Platforms should follow a set of Human Rights Principles for Content Moderation based on international human rights law: legality, necessity and proportionality, legitimacy, equality and non-discrimination.
- Platforms should assume the same kinds of pluralism obligations that broadcasters have in the different jurisdictions where they operate. An example would be a voluntary fairness doctrine.
- Platforms should expand their number of moderators and spend a minimal percentage of their income on improving the quality of content review, particularly in at-risk countries.