Rohingya Refugees File PIL in Delhi HC Urging Meta to Halt Spread of Hate Speech on Facebook

In an effort to curb discriminatory and incendiary content targeting their community, two Rohingya refugees have filed a Public Interest Litigation (PIL) in the Delhi High Court. The PIL asks the court to direct Meta, formerly known as Facebook, to stop the dissemination of content that promotes hatred and inflames tensions against the Rohingya community.

The plea calls on Facebook to take immediate action to stop using the virality and ranking algorithms that have been identified as contributing to the promotion of hate speech and the incitement of violence against minority communities. It underscores the need to address the harmful impact of these algorithms on online content and to prevent further harm to vulnerable groups.

The High Court is expected to take up the petition later this month, when the concerns raised in the plea will come up for formal hearing.

The petition has been filed by Mohammad Hamim and Kawsar Mohammed, both of whom sought refuge in India after fleeing persecution in Myanmar. Mohammad Hamim arrived in July 2018, while Kawsar Mohammed reached India in March 2022. As petitioners, they bring first-hand experience of the challenges faced by Rohingya refugees, adding a human dimension to the legal action against Facebook.

Represented by Advocate Kawalpreet Kaur, Mohammad Hamim and Kawsar Mohammed assert in their plea that Facebook has become a breeding ground for misinformation, harmful content, and posts originating from India that target Rohingya refugees. They contend that there is substantial evidence the platform is deliberately refraining from acting against such posts, and that this alleged inaction warrants judicial intervention.

The plea highlights that Facebook's algorithms actively promote the dissemination of harmful content, reinforcing the platform's role in perpetuating material that dehumanizes the Rohingya community, and argues that Facebook played a significant role in the dehumanization of the Rohingya in Myanmar. With the 2024 general elections approaching, the plea warns of an elevated risk of harmful content and misinformation originating on the platform, and of a consequent heightened likelihood of violence against the Rohingya community, underscoring the urgent need for corrective measures.

The petitioners further assert that Facebook is in breach of Section 79(3) of the Information Technology Act, read with Rule 3 of the Information Technology (Intermediaries Guidelines) Rules, 2011. These provisions set out the due diligence obligations that an intermediary such as Facebook must observe while discharging its duties, and the plea alleges that the platform has failed to meet those standards.

In light of these allegations, Mohammad Hamim and Kawsar Mohammed are asking the court to direct Meta to suspend accounts that propagate hate against the Rohingya community and to transparently disclose and report how it implements its content moderation policies, particularly in addressing content flagged by users. The prayer reflects their pursuit of accountability and of proactive measures to curb the spread of harmful content on the platform.
