Why in news?
The Government of India recently notified Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules 2021.
About Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021
• The Rules, 2021 have been framed by the Central Government in exercise of its powers under section 87(2) of the Information Technology Act, 2000, and in supersession of the earlier Information Technology (Intermediary Guidelines) Rules, 2011.
Background
Following developments culminated in the notification of the IT rules, 2021:
• In December 2018, the Supreme Court (SC), in a suo-moto writ petition (Prajwala case), had observed that the Centre may frame necessary guidelines to eliminate child pornography, rape and gang-rape imagery, videos and sites on content hosting platforms and other applications.
• Subsequently, MeitY prepared the draft Information Technology (Intermediary Guidelines) Rules 2018 to replace the rules notified in 2011 to strengthen the legal framework and make the social media platforms accountable under the law.
• In October 2020, the SC had sought the Centre’s response on a Public Interest Litigation (PIL) seeking regulation of Over-The-Top (OTT) platforms by an autonomous body.
• In November 2020, the Union government brought OTT platforms and news and current affairs content on online platforms under the ambit of the I&B Ministry.
• In February 2021, the SC issued a notice to the Central Government seeking the creation of a proper board, institution or association for managing and monitoring OTT, streaming and media platforms.
Factors that necessitated formulation of new rules
• Rapid growth in user base:
Due to the Digital India programme and the extensive spread of mobile phones and the Internet, major social media platforms have a very large user base in India.
The Indian OTT market is also projected to grow from Rs 42.50 billion in FY19 to Rs 237.86 billion by FY25.
• Failure of self-regulation:
Despite having internal mechanisms to tackle illegal and inappropriate content, social media companies have so far failed to effectively address several serious issues.
Also, the self-regulatory mechanism proposed by the Internet and Mobile Association of India (IAMAI) was rejected by the I&B Ministry owing to issues such as conflict of interest, lack of independent third-party monitoring and the absence of a well-defined Code of Ethics.
• Ensuring safety and dignity of women:
Rampant abuse of social media to share morphed images of women and content related to revenge porn has often threatened the dignity of women.
• Persistent spread of fake news and misinformation:
Fake news and rumours spread virally through platforms like WhatsApp and Twitter.
There have been instances of targeted misinformation aimed at religious minorities and dissenting individuals, with consequences ranging from riots, death threats to actual murders.
• Threat to democratic institutions and security landscape:
Increasing instances of misuse of social media by criminals and anti-national elements have brought new challenges for law enforcement agencies.
• Need for an appropriate institutional mechanism:
There is no law or autonomous body to monitor and manage digital content.
Also, there is no robust complaint mechanism through which ordinary users of social media and OTT platforms can register their complaints and get them redressed within a defined timeline.
1.1.1. GUIDELINES RELATED TO SOCIAL MEDIA INTERMEDIARIES
Key provisions
• Due diligence to be followed by intermediaries:
The Rules prescribe due diligence that must be followed by social media intermediaries, such as retention of user information for a period of 180 days and reporting of cyber security incidents.
In case due diligence is not followed by the intermediary, safe harbour provisions will not apply to it.
These safe harbour provisions have been defined under Section 79 of the IT Act, and protect social media intermediaries by giving them immunity from legal prosecution for any content posted on their platforms.
• Grievance Redressal Mechanism:
Intermediaries shall appoint a Grievance Officer to deal with complaints and share the name and contact details of such officer.
Grievance Officer shall acknowledge the complaint within 24 hours and resolve it within 15 days from its receipt.
• Ensuring Online Safety and Dignity of Users, especially Women Users:
Intermediaries shall remove or disable access, within 24 hours of receipt of a complaint, to content that exposes the private areas of individuals or is in the nature of impersonation, including morphed images, etc.
• Two Categories of Social Media Intermediaries i.e., social media intermediaries and significant social media intermediaries (SSMI):
This distinction has been made to encourage innovation and enable the growth of new social media intermediaries without subjecting smaller platforms to significant compliance requirements.
• Additional due diligence to be followed by SSMI include:
Appointment of a Chief Compliance Officer for ensuring compliance with the Act and Rules.
Appointment of a Nodal Contact Person for 24x7 coordination with law enforcement agencies.
Appointment of a Resident Grievance Officer to perform the functions mentioned under the Grievance Redressal Mechanism.
All the above officers must be resident in India.
Publishing a monthly compliance report mentioning the details of complaints received, action taken on the complaints and details of contents removed.
Identification of the first originator of the information:
An SSMI providing services primarily in the nature of messaging shall enable identification of the first originator of the information (without being required to disclose the contents of any message); this is required only for the purposes of prevention, detection, investigation, prosecution or punishment of an offence related to: (refer to infographics given below)
Publication of a physical contact address in India on its website or mobile app, or both.
Deployment of technology-based measures:
To proactively identify information that depicts any act or simulation, in any form, of rape, child sexual abuse or related conduct, etc.
Voluntary User Verification Mechanism:
Users who wish to verify their accounts voluntarily are to be provided an appropriate mechanism to do so, along with a demonstrable and visible mark of verification.
Giving users an opportunity to be heard: In cases where a significant social media intermediary removes or disables access to any information of its own accord, a prior intimation in the form of a notice explaining the grounds and reasons for such action should be communicated to the user, and the user must be provided an adequate and reasonable opportunity to dispute the action.
• Removal of Unlawful Information:
An intermediary, upon receiving an order from a court or being notified by the appropriate government or its agencies, should not host or publish any information which is prohibited under any law in relation to the interest of the sovereignty and integrity of India, public order, friendly relations with foreign countries, etc.
Concerns associated
• Enhanced surveillance and threat to privacy of users: The encrypted messaging apps will need to collect more user data to trace messages back to the first originator, raising concern about misuse by both platforms and governments.
• Conflict with the extraterritorial jurisdiction norms of the IT Act:
Under the new rules, action can be taken against a message that originated outside India.
• Underdeveloped and imperfect nature of artificial intelligence (AI): the technology-based measures mandated for proactive identification of content rely on AI tools that are still imperfect.
• Self-censorship:
Removal of safe harbour can lead to over-cautious internal censorship by intermediaries, which impacts users’ right to free speech.
• Vague threshold for qualification of SSMI:
There is no scientific criterion for setting such thresholds, which enables the Central Government to enforce compliance requirements in a discriminatory manner.
• Potential for misuse of verification data:
In the absence of a data protection law, social media entities will be able to collect citizens’ ID data with no regulatory body to ensure that such data is used only for verification.
Intended Benefits
• Empowering the ordinary users of digital platforms to seek redressal for their grievances and demand accountability in case of infringement of their rights.
• Improved deployment of new technologies, such as machine learning tools, which can help tackle child sexual abuse material (CSAM) and online abuse.
• Safety of users by addressing illegal activities uniformly across all social media platforms and ensuring the safety of the majority of social media users across India.
• Help law enforcement authorities by enabling identification of the first originator of information, thereby curbing the menace of fake news, child pornography and other illicit activities.
• Clarify the responsibilities of intermediaries.
Key provisions
The Rules establish a soft-touch self-regulatory architecture, a Code of Ethics and a three-tier grievance redressal mechanism for news publishers, OTT platforms and digital media. They have been notified under section 87 of the IT Act, empowering the I&B Ministry to implement this part of the Rules, which prescribe the following:
Code of Ethics
• As applicable to OTT platforms:
OTT platforms, referred to in the Rules as publishers of online curated content, would self-classify their content into five age-based categories: U (Universal), U/A 7+, U/A 13+, U/A 16+ and A (Adult), based on factors such as themes and messages, violence, nudity, drug and substance abuse, etc.
Platforms would be required to implement parental locks for content classified as U/A 13+ or higher, and reliable age verification mechanisms for content classified as “A”.
Platforms should prominently display the classification rating specific to each content or programme, together with a content descriptor and an advisory on viewer discretion at the beginning of every programme, enabling the user to make an informed decision prior to watching it.
Concerns
• Impact on Indian cinema/television:
Indian cinema/television provides employment and entertainment to audiences locally and globally.
The Rules are likely to have a substantial impact on citizens’ digital rights, result in economic harm, and may also negatively affect India’s growing cultural influence.
• Unfair competition for small digital media houses:
The definition of “publisher of news and current affairs content” privileges the established media houses, who may have a print newspaper as a significant component of their operations and could thus claim to be exempted from these guidelines.
Smaller and independent media houses which rely on the internet to disseminate news and information will face enhanced compliance cost and censorship.
• Regulation of foreign news media:
If a digital news media organisation makes its content available in India in a systematic and continued manner, the provisions of the Intermediaries Rules will apply to them.
However, it is unclear how foreign news media organisations are sought to be regulated by Indian authorities.
• Forcing self-censorship:
Certain criteria provided in the Code of Ethics are vague, overbroad and can have a chilling effect on the free speech of publishers.
Also, certain rules can give formal validity to the illegitimate concerns which have been raised by certain groups against artistic content.
• Excessive governmental control over digital news and OTT content:
The Chairman of the self-regulatory body is suggested to be a retired Judge of the High Court or Supreme Court, and even though the body is expected to be appointed/elected by the media community, the I&B Ministry retains approval power over the composition of the body.
• Lack of punitive measures for violators:
The Rules are more in the nature of guidelines; there is no effective screening mechanism or provision for punishment/fines to take appropriate action against violators.
