Blasphemy on Facebook: Challenges of Managing Sensitivities Online


by Lutfi Hakim
24 January 2018

 

Last year, the Malaysian chapter of the atheist group Atheist Republic posted a photo from their gathering in Kuala Lumpur on their Facebook page. The post attracted attention from people far outside that community, and their reaction shows how potent sensitivities surrounding religion can be. The group’s page was inundated with comments and threats over the members’ perceived apostasy, the photo received prominent coverage in the Malay-language media, and government ministers then announced the opening of multi-agency investigations to track down the group.

The growing influence of social networks around the globe means that issues such as these must be given due attention, not only by academics and civil society but also by governments and the companies that run the networks. The challenge lies in the fact that these networks operate across national boundaries, which means they must comply with divergent national laws, including laws that hamper the right to freedom of expression. To say that this poses a challenge would grossly understate the situation, and it warrants further discussion and creativity in managing the many sensibilities and sensitivities that exist around the world.

Facebook offers many opportunities to study the impact of the algorithms the company uses to present content to users. The sheer volume of content uploaded by individuals, groups, companies, and organisations means it is impossible for even the growing number of human moderators contracted by Facebook to sift through all of it within a reasonable amount of time. To ensure that the content that ends up in users’ content streams, Facebook’s ‘News Feed’, is appropriate, legal, and effective in sustaining users’ attention, the company uses algorithms to continuously rank, surface, flag, and remove content based on signals generated by users and an existing database of removed content.
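Facebook has not published the internals of this pipeline, but a minimal sketch of the general technique the paragraph describes, matching new uploads against hashes of previously removed content and escalating heavily reported posts, might look like the following. Every name, structure, and threshold here is a hypothetical stand-in, not Facebook’s actual system.

```python
import hashlib
from dataclasses import dataclass

# Hypothetical sketch only: match new posts against hashes of previously
# removed content, and queue heavily reported posts for human review.
# The database, threshold, and labels are illustrative assumptions.

REMOVED_CONTENT_HASHES = {
    hashlib.sha256(b"previously removed image bytes").hexdigest(),
}
REPORT_REVIEW_THRESHOLD = 50  # assumed cut-off for escalation

@dataclass
class Post:
    content: bytes
    report_count: int = 0

def moderate(post: Post) -> str:
    """Decide what the pipeline does with a post."""
    digest = hashlib.sha256(post.content).hexdigest()
    if digest in REMOVED_CONTENT_HASHES:
        return "remove"        # exact match with known removed content
    if post.report_count >= REPORT_REVIEW_THRESHOLD:
        return "human_review"  # heavily reported: escalate to a moderator
    return "publish"           # otherwise rank into the News Feed

print(moderate(Post(b"a new photo", report_count=3)))    # -> publish
print(moderate(Post(b"a new photo", report_count=120)))  # -> human_review
```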

It is clearly no easy task for Facebook to keep its billions of users happy. There are vast cultural differences and sensitivities that need to be respected, yet at the same time the service has to be seen as a home for conversations, free from the accusing eyes of a censor. The task is further complicated by political complexities and pressure from governments, which make keeping the service a welcoming and safe place for users a monumental challenge.

The magnitude of this challenge can be seen in the sensitivities surrounding religion and belief. Blasphemy is a particularly contentious example. Blasphemous statements are still a crime in fifty-nine countries, and statements seen as blasphemous can trigger strong reactions online and off, even when no official action is involved. Incidents related to blasphemy led to the service being made inaccessible for short periods in Bangladesh and Pakistan in 2010. These threats of service interruption weigh heavily on the company, and blasphemy-related requests by governments are taken seriously even when the content does not breach the service’s community guidelines.

It is not only governments that respond strongly to content considered blasphemous; users react very strongly to it too. Beyond online and offline outrage that has led to violence and the loss of lives and property, users in some instances also organise themselves as vigilante moderators through concerted public reporting campaigns. Content seen as blasphemous is frequently reported and, even when it does not clearly breach Facebook’s speech standards (i.e. it is not hate speech or demonisation), can be removed from the service, and accounts that post such content can be suspended. Blasphemy can range from crude insults to deities and practices to rational arguments for the non-existence of god, or the mere expression of atheism, which means that a potentially wide range of expression can be circumscribed because it causes offence to religious individuals.
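To see why such campaigns work, consider extending the hypothetical sketch above (reusing its `Post` and `moderate` definitions): a naive report-count threshold cannot tell a genuine violation apart from an organised brigade, so enough coordinated reports will escalate a rule-abiding post.

```python
# Continuing the hypothetical sketch above: a coordinated reporting
# campaign pushes rule-abiding content past the naive threshold.
CAMPAIGN_SIZE = 200  # assumed size of an organised reporting campaign

benign_post = Post(b"a rational argument for atheism")
for _ in range(CAMPAIGN_SIZE):
    benign_post.report_count += 1  # each campaign member files one report

# The pipeline escalates the post even though it breaches no speech
# standard; only the volume of reports has changed.
print(moderate(benign_post))  # -> human_review
```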

 

An example of a Facebook page set up to report anti-Hindu pages

Instructions for reporting Facebook pages, from a page set up to report anti-Islamic pages

 

India’s experience with blasphemy laws illustrates how national laws can shape Facebook’s decisions about content on the service. India has laws on its statute books that make blasphemous comments a crime, and these laws have been invoked to remove content, or filter it from being accessible in India, without a proper hearing. Known instances of removal and blocking include action against groups that promote atheism and critical views of Hindu beliefs. In 2015, the Supreme Court of India decided that the proper interpretation of the Information Technology Act of 2000, one in line with the protection of freedom of expression enshrined in the Indian Constitution, required that governmental requests for the removal of content need only be complied with if there is a court order or if the request comes from an authorised government agency.

In the two-judge bench decision, the Supreme Court struck down Section 66A of the Act, which, among other things, outlawed the communication of information meant purposely to cause annoyance and inconvenience. The Court ruled that the law failed to define terms such as annoyance and inconvenience, and could therefore impair the expression of “a very large amount of protected speech”.

Judicial review is not an option available in all jurisdictions. In authoritarian countries like Pakistan, where the will of the executive is not easily challenged, Facebook has to comply with government requests for the removal of content, whether on grounds of blasphemy or other offences. Failure to comply invites the threat of a disruption of access to the service, which would see millions of users cut off. Outside of China, North Korea, and Iran, Facebook is available and used in countries all around the world, and despite these difficulties it has been valuable to minorities and disenfranchised communities as a means of communicating with each other and with the wider public.

Facebook CEO Mark Zuckerberg’s 2018 pledge to ‘fix Facebook’ lists abuse, hate, and interference by nation states as issues that require his personal attention, which suggests that Facebook has been listening to critiques of how the service currently operates. There is no easy solution to issues such as blasphemy when the service counts among its users both fervent believers and ardent non-believers. Zuckerberg had previously suggested that one solution would be the adoption of default regional filters that keep specific content out of users’ newsfeeds in specific regions. However, as noted above, users do not only chance upon content they disagree with; they also actively seek it out in order to report it for removal. Whether a more sophisticated filtering approach can mitigate the effects of such active reporting remains to be seen.
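Zuckerberg’s suggestion was a direction rather than a design, but in its simplest hypothetical form a default regional filter could amount to keeping topic-tagged content out of feeds by viewer region. The region codes, topic tags, and filter table below are all assumptions made for illustration.

```python
# Hypothetical sketch of a default regional filter: posts tagged with
# topics a region has flagged are kept out of that region's feed.
REGIONAL_FILTERS: dict[str, set[str]] = {
    "PK": {"blasphemy"},  # assumed: suppress blasphemy-tagged posts
    "IN": {"blasphemy"},
}

def visible_in_region(post_topics: set[str], region: str) -> bool:
    """True if none of the post's topics are filtered in the viewer's region."""
    return not (post_topics & REGIONAL_FILTERS.get(region, set()))

feed = [
    ({"travel"}, "holiday photos"),
    ({"satire", "blasphemy"}, "cartoon mocking a deity"),
]
for topics, caption in feed:
    if visible_in_region(topics, "PK"):
        print("show:", caption)  # only the holiday photos are shown in PK
```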

 

Lutfi Hakim is research lead at IMAN Research. His areas of interest include political thought and movements in Southeast Asia, community-based organisations, and digital media use in the region.