The March edition of Au courant features an interview with Mr. Prasanth Sugathan, Legal Director, Software Freedom Law Center, wherein he discusses the latest Information Technology Rules.
Prasanth Sugathan is a lawyer with extensive practice in the fields of intellectual property law, technology law, constitutional law, and administrative law. He has worked for years with the Free Software community in India. He is currently serving as the Legal Director of Software Freedom Law Centre, India (SFLC.in) and is also a Partner at Sugathan & Associates. An engineer-turned-lawyer, he has appeared in various cases before the Supreme Court of India, High Courts, and Tribunals. He has researched extensively and developed an informed understanding of issues related to Net Neutrality and Information Technology. He has over six years of industry experience and over five years in legal practice. He has also edited a book on service law.
1. Last month, the Government of India introduced the new Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, which require applications like WhatsApp, Telegram, and Signal to enable tracing of the origin of flagged messages and to break end-to-end encryption. What are your initial thoughts about these rules?
Rule 4(2) of the Rules is an amended version of a provision in the draft Intermediary Guidelines published in December 2018. It requires a significant social media intermediary providing services primarily in the nature of messaging to identify the “first originator” of information on a computer resource. To compel such identification, the Rules require either an order issued under Section 69 of the IT Act or a judicial order. The provision is problematic on several counts.
Firstly, the rule treats a judicial order and an order under S. 69 of the IT Act as interchangeable. Section 69 does not offer adequate procedural safeguards, and orders under S. 69 are not available in the public domain. RTI requests filed by SFLC.in seeking the number of decryption orders passed by the government each year have been denied in the past, citing S. 8 of the RTI Act, 2005. Law enforcement agencies can thus easily bypass the judicial process by relying on decryption requests made under S. 69 of the IT Act, 2000, undermining the principles of accountability and transparency.
Secondly, the provision states that where the first originator of any information on the computer resource of an intermediary is located outside the territory of India, the first originator of that information within the territory of India shall be deemed to be the first originator. To comply, the intermediary would need access to the metadata of the entire chain of the conversation, so messaging applications would have to be re-engineered to capture metadata. This will undermine end-to-end encryption and severely impinge on the security and privacy of communications: to comply with the traceability provision, significant social media intermediaries will in all likelihood have to break end-to-end encryption and access the contents of messages, compromising the privacy of communication and considerably weakening the security of end-to-end encrypted platforms. It would also severely dent the privacy-by-design principle, and the metadata stores required for compliance would themselves become a valuable target for malicious third parties.
In addition, it would still pose a major challenge in courts to prove who the originator of information is. For instance, someone who took a screenshot of a tweet and shared it with a friend may not be the actual originator of that information, and it would be difficult to attribute mens rea to such originators. In a nutshell, this provision will undermine privacy and the right to free speech, and it will severely impact the sanctity of end-to-end encrypted communications.
2. What are your thoughts on the new framework in terms of how these entities will now have to equip themselves? And will this framework be the same for foreign news media?
Within the next three months, such platforms will have to appoint Indian residents as:
• Chief Compliance Officer, who will be responsible and liable for ensuring compliance with the Information Technology Act and rules;
• Nodal Contact Person, for 24x7 coordination with law enforcement agencies;
• Resident Grievance Officer, to address user complaints.
Such platforms will also need to have a physical address in India.
They will also have to publish a monthly compliance report detailing the complaints received and action taken, as well as the content removed proactively.
a) The new framework burdens companies with multiple sets of compliance requirements, which add financial and operational costs, diminish the ease of doing business in India, and create uncertainty about the consequences of non-compliance with the guidelines.
b) The requirement for having a physical address in India creates an operational and financial burden, especially on smaller companies.
c) As for the appointment of personnel, platforms will have a difficult time finding employees keen and willing to take up such responsibility, who must at the same time ensure that the company functions flawlessly so as to shield them from legal liability for the company's actions or omissions.
d) (The compliance-report requirement is addressed in the next question.)
e) Applicability of the framework to foreign news media - Rule 8(2) states that the rules shall be applicable to a "Publisher" which "operates in the territory of India" or "conducts systematic business activity of making its content available in India". The rule makes it amply clear that the regulatory framework would also be applicable to foreign news media.
3. What is your opinion about the compliance requirement for Significant Social Media Intermediaries, especially the “monthly compliance report”? Don’t you think that it is a bit impracticable?
The obligation to publish a monthly compliance report is a step in the right direction. It would bring transparency and accountability to social media companies, which have so far not been answerable for their content moderation decisions. More data in the public domain would also encourage cooperation and collaboration among social media giants, and would help smaller companies (which are not as well equipped in terms of resources) learn, make better decisions, and implement better technological tools. That said, instead of a monthly report, the obligation should be modified to a quarterly or half-yearly one, to make the compliance process easier for companies.
4. For both Social Media Intermediaries and Significant Social Media Intermediaries, the rules say that they shall endeavour to deploy technological tools to identify content related to rape and child sexual abuse, and that in doing so the measures should account for free speech and expression as well as the privacy of users. Don’t you think that too much of an onus is put upon the players?
Rule 4(4) deals with the obligation upon significant social media intermediaries (SSMIs) to deploy technology-based measures to identify harmful content. The rule uses the phrase “shall endeavour to deploy technology-based measures”, putting an obligation upon SSMIs to identify certain undesirable categories of information using automated tools on a best-efforts basis.
The first proviso to the said rule states that the measures taken by the intermediary shall be proportionate and shall have regard to free speech and expression and the privacy of users.
Technological tools for proactive filtering or monitoring of content have been in use by social media intermediaries for a while now. There are some serious problems with these automated tools, and intermediaries sometimes end up taking down content that is harmless. Many of these tools lack the subjectivity necessary to make a fair decision on the identification or removal of such content. Even so, the sheer volume of information hosted on social media platforms every day, and the potential harm certain categories of information can cause, necessitate automated and other technological tools to make the internet a safe place for its users.
Facebook and other companies have had some success with using AI to find problematic content, but it has been limited and unreliable. Currently, no AI available on the market is trained well enough to understand the eccentricities of human speech, context, slang, dialect, satire or puns. As consumers of social media will know, most problematic content turns on context.
In terms of respecting the sensitivities of free speech, it will take a while for automated filters to reach a satisfactory threshold.
Companies should now start taking more responsibility for third-party content, because the algorithms that decide what content is kept and what is removed are often biased in favour of content that is more popular and possibly more profitable for social media companies.
Social media companies are no longer mere conduits hosting content, and they should therefore take greater responsibility for the information they host.
5. What is your opinion on the small market players? Is it possible for them to adhere to such compliances, or do they need to re-invent their complete business model?
The rules make it more difficult for small market players. Intermediaries operating in multiple countries would now be burdened with the additional costs of maintaining an office and hiring employees in India, which will alter the business structure and operations of smaller companies. Many such companies usually operate across multiple jurisdictions with ease, and some compliances under the rules will be an undue hindrance for them. Coupled with the added financial costs, and fearing harassment or the imposition of unwarranted legal liability, some entities may even choose to opt out of doing business in India.
6. The new rules have thrown up concerns related to the encryption policies of messaging platforms. The new guidelines enunciate that if a messaging platform has a prescribed number of registered users in India, it will have to enable identification of the first originator of information. Do you believe that the new rules affect the fundamental right to freedom of speech and expression and pose privacy concerns for internet users?
This has been addressed in the answer to question 1.
7. The new rule empowers the Information & Broadcasting Secretary to directly block specific content for public access, in case of an emergency. The third-tier redressal mechanism is also dealt with by the government. Moreover, the ministry will take the final call on the complaint that has been filed. Do you think that so much consolidation of power in the hands of government is right?
The Rules give overarching powers to the I&B Ministry in terms of adjudication of OTT as well as online news content. Any oversight mechanism should have a judicial body at the top, not an executive body. Giving the executive the power to decide what content gets aired and what gets blocked is bound to have a chilling effect on the right to freedom of speech and expression. The ministry has not only been bestowed with the power to refer grievances to the inter-departmental committee constituted by it, but has also been given the power to issue guidance and advisories to publishers. The said committee can also send recommendations to the ministry for warning, censuring, admonishing or reprimanding an entity. This leads to an undesirable situation where freedom of speech and the freedom of the press are at the mercy of the government or its agencies.
8. The rules are still under consideration by the courts, and the outcome is some way off. So, can you please elaborate a bit on what companies should start looking at and doing to adhere to these guidelines now?
A number of petitions challenging various provisions of the rules are pending before the High Courts and the Supreme Court. Unless a company obtains an interim order from a court stating otherwise, it has a statutory obligation to comply with the provisions of the rules as and when they apply to it.
9. Recently, Google reached an agreement with French news publishers to pay them for the use of their online content. This has also fueled a demand for a similar enabling law to be enacted by the Parliament of India. What route, according to you, must the Indian government take while developing such a law?
It is widely accepted that social media has now become a key source of news. It has also been observed that in various parts of the world, there has been a loss of media's advertising revenue to big tech firms. Market dominance of tech firms over media organizations does create a situation where there is an inequitable distribution of power between the two entities.
Generally speaking, in the context of regulating businesses, the principle should be that no entity should unjustly or unfairly be allowed to earn profits at the cost of another entity or should earn out of the hard work of another entity. Due regard and consideration must also be paid to globally accepted anti-trust/competition law standards.
In the Indian context, any deliberation on regulation should be undertaken after a thorough assessment of the following:
i. Revenue structures of big tech companies/digital platforms and media/news companies;
ii. The interrelationship and profit sharing between the aforementioned entities; and
iii. Changes in trends in (i) and (ii) over the past few years and a fair prediction of future trends.
After such an assessment, if it is found that there is an imbalance of power, stakeholders should be encouraged to come to mutually agreeable terms for sharing advertising revenue and profit from news content. It is only when these solutions do not work that the government should step in to regulate.
10. As internet intermediaries have grown to play an essential role in developing, disseminating, and amplifying harmful and illegal content such as fake news and terrorism-related content, the notion of total immunity for third-party content no longer applies. How can we make sure that the internet remains as it is intended: a global entity which can be a relatively safe space for people to interact?
Intermediaries play a vital role in the digital economy. They contribute significantly to the promotion of business and the free flow of information, and facilitate a number of other transactions over the internet. Safe harbour protection grants intermediaries immunity from liability for third-party information hosted on their platforms. It is essential for intermediaries to be able to carry out their functions without unwarranted hurdles and impediments, which is why most prospering economies around the world have robust safe harbour regimes in place.
For the development of content moderation tools and practices, instead of working in silos, intermediaries must develop technologies by pooling their research efforts and learning from collaborative engagements. They should also be more transparent and accountable about their content moderation policies. Governments, on the other hand, must find the right balance between effective enforcement of laws and protection of the rights and interests of all stakeholders. They should be wary of overregulation and should always prioritize the protection of the right to free speech online.