Dissolved Encryption Standards: Impact on Trade and Businesses
(This blog post is the fourth in SFLC.in's “Encryption and human rights” series.)
In today’s digital age, encryption is not confined to communications. It has become a cornerstone of digital security and is used all over the world for the smallest of tasks: browsing the internet, making online purchases, and carrying out financial transactions. All users of digital services rely on encryption on a daily basis. From communication to cloud storage to banking, encryption forms the basis of a secure and reliable existence in the digital space.
Recently, Apple courted controversy by announcing that it would introduce filters to detect Child Sexual Abuse Material (CSAM) on iCloud. Cryptographers and civil society organizations have called this a misstep, urged Apple to halt the changes, and warned about the impact it could have on end-to-end encryption. India, meanwhile, has introduced the traceability provision, which in effect breaks end-to-end encryption. This puts the future of companies offering such services in jeopardy, leaving them with the choice of either complying with the traceability provision and breaking end-to-end encryption, or halting their services. This is particularly challenging for messaging applications that are run by not-for-profit organizations or are federated in nature. While not all end-to-end encrypted messaging services fall under the category of significant social media intermediaries, they can be required to comply with the provisions of Rule 4 by an executive notification under Rule 6.
Governments across the world have been pushing back against encryption and encrypted services, citing the dissemination of CSAM, terrorism and fake news, amongst other reasons. Unfortunately, most of these claims are not backed by data or empirical research.
Part II of the Rules, 2021 governs the due diligence requirements to be complied with by intermediaries. Rule 4 lays down certain additional due diligence requirements which have to be followed by Significant Social Media Intermediaries (hereinafter “the SSMIs”). The threshold for a social media intermediary to be categorised as an SSMI is more than 50 lakh registered users in India. As of now, none of the federated FOSS services has been categorised as an SSMI. However, Signal, a centralized FOSS end-to-end encrypted messaging service run by a not-for-profit organization, has been categorised as an SSMI.
The Rules, 2021 also empower the Central Government, through Rule 6, to notify any social media intermediary as an SSMI. Thus, even federated FOSS services with a smaller user base can be notified as Significant Social Media Intermediaries under this provision, and the Central Government can, through an executive notification, require such services to comply with one or more provisions of Rule 4. Rule 4 requires the SSMIs to establish a physical office in India, appoint various personnel, introduce automated filtering and incorporate the traceability provision.
In this blog post, we discuss the impact that Part II of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (hereinafter “the Rules, 2021”) will have on federated FOSS services and FOSS volunteers, and on small not-for-profit services.
Impact on federated FOSS services run by FOSS volunteers
Unlike for-profit businesses, the FOSS community across the world comprises enthusiasts and technologists contributing in their individual capacities, with small not-for-profit organizations such as Signal, Tor and the OONI project being a few of them. The community offers its services through crowd-funding by members of the public interested in safeguarding their privacy. These FOSS services provide an alternative to proprietary social media applications and give users a secure and private means of communication. Unlike most of the for-profit proprietary companies offering similar services, they do not retain large amounts of metadata or surveil their users.
The Rules, 2021 apply a one-size-fits-all approach to companies offering messaging and communication services. They fail to take into account that several of these services are built on top of open protocols and are decentralized in nature. For instance, Matrix, an open standard and communication protocol for real-time communication, is federated in nature and is run by several FOSS volunteers who also maintain various servers. A case has been filed in the High Court of Kerala by a FOSS developer, Praveen Arimbrothidiyil, challenging the impact of the Rules, 2021 on federated FOSS services. Praveen is being assisted by lawyers from SFLC.in in the matter.
Since these servers are spread across geographical locations around the world and interact with each other to ensure interoperability, changes made to a single server based in India would render it ineffective in interacting with other servers that are not required to comply with the Rules, 2021. Such services would also find it financially and resource-wise unviable to continue operating servers that have a small user base.
It is also unclear whether the Rules would apply to independent servers or to the entire platform. Matrix, for instance, is an end-to-end encrypted, open source messaging platform that is federated in nature: its servers are hosted by many different individuals, entities and organizations. Indian server operators would also face challenges in maintaining a metadata trail of communications, as servers hosted by residents or citizens of other countries cannot be compelled to share such a trail. A minimal sketch of this federated architecture follows.
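The sketch below is only an illustration (the homeserver names are hypothetical placeholders): it queries the unauthenticated Matrix Server-Server API “version” endpoint on two homeservers. Each homeserver is an independent deployment under separate administrative control, so an obligation imposed on one operator does not, and cannot, reconfigure the rest of the network.

```python
# A minimal sketch illustrating Matrix federation: each homeserver is an
# independent deployment that other operators do not control.
# The homeserver names below are hypothetical placeholders.
import json
import urllib.request

# Hypothetical homeservers: one run by a volunteer in India, one abroad.
HOMESERVERS = ["matrix.example.in", "matrix.example.org"]

def federation_version(server_name: str, port: int = 8448) -> dict:
    """Query the unauthenticated Server-Server API 'version' endpoint.

    Every federating homeserver exposes this endpoint independently;
    there is no central operator who could reconfigure all of them.
    """
    url = f"https://{server_name}:{port}/_matrix/federation/v1/version"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    for server in HOMESERVERS:
        try:
            info = federation_version(server)
            print(server, "->", info.get("server", {}))
        except OSError as err:
            # A server that goes offline (for example, to avoid compliance
            # costs) simply drops out of the federation; the rest continue.
            print(server, "unreachable:", err)
```

If the Indian server in this sketch were modified or taken offline to meet compliance costs, the other homeservers would simply continue federating without it, which is precisely the interoperability problem described above.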
This would severely impinge on FOSS volunteers’ right to trade and profession as guaranteed by Article 19(1)(g) of the Constitution of India.
Impact on small not-for-profit organizations
Small not-for-profit organizations like the Signal Foundation, which runs the popular Signal messaging application, fall under the category of SSMIs. Signal has more than 50 lakh registered users in India, and is free, open source and end-to-end encrypted. Signal and other not-for-profit services do not benefit monetarily from the services they provide. Incorporating the traceability requirement, appointing officers in India, setting up a physical office in India, and introducing automated filters would prove financially infeasible for such not-for-profit organizations.
Automated filtering under Rule 4(4) and its impact on end-to-end encryption
By the use of the phrase “shall endeavour”, Rule 4(4) makes it de facto mandatory for significant social media intermediaries to deploy automated tools for the proactive identification of any act or simulation, explicit or implicit, of rape, child sexual abuse or conduct. The scope of Rule 4(4) is non-exhaustive: it is not limited to rape and child sexual abuse material, but extends to pornographic content which would amount to offences of a similar nature as listed under the Indian Penal Code.
This also means that SSMIs offering end-to-end encrypted messaging services will have to comply with this provision. Such client-side scanning is known to break end-to-end encrypted communications. The United Nations Special Rapporteur’s report on India’s Rules, 2021 also states that the automated filters violate due process and put the burden of censorship on intermediaries. It states that a “general monitoring obligation that will lead to monitoring and filtering of user-generated content at the point of upload … would enable the blocking of content without any form of due process even before it is published, reversing the well-established presumption that States, not individuals, bear the burden of justifying restrictions on freedom of expression.”
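To show why client-side scanning sits uneasily with end-to-end encryption, the sketch below is a deliberately simplified, hypothetical illustration: it does not describe how any particular product works (Apple's proposal, for example, used perceptual hashing rather than the exact SHA-256 matching shown here), and the blocklist and reporting hook are placeholders. The structural point it illustrates is that the scan runs on the plaintext before encryption, so whoever controls the blocklist and the reporting channel learns about message content even though the ciphertext remains end-to-end encrypted.

```python
# A simplified, hypothetical sketch of client-side scanning, shown only to
# illustrate why it undermines end-to-end encryption guarantees. Real systems
# use perceptual hashes rather than the exact SHA-256 matching used here;
# the blocklist and report function are placeholders.
import hashlib
from typing import Callable

# Hypothetical blocklist of content hashes supplied by a third party
# (this entry is the SHA-256 of the sample message b"foo" used below).
BLOCKLIST = {"2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae"}

def send_encrypted(plaintext: bytes,
                   encrypt: Callable[[bytes], bytes],
                   report: Callable[[str], None]) -> bytes:
    """Scan-then-encrypt: the scan sees the plaintext before any encryption.

    Whoever controls BLOCKLIST and `report` learns about message content,
    even though the ciphertext itself remains end-to-end encrypted.
    """
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKLIST:
        # This out-of-band report is the new channel that critics of
        # client-side scanning point to: it bypasses the encryption entirely.
        report(digest)
    return encrypt(plaintext)

if __name__ == "__main__":
    # Toy stand-ins so the sketch runs; not a real cipher or reporting API.
    ciphertext = send_encrypted(b"foo",
                                encrypt=lambda p: p[::-1],
                                report=lambda d: print("matched:", d))
    print("ciphertext:", ciphertext)
```

In other words, the confidentiality promise of end-to-end encryption is bypassed before the encryption ever happens.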
This Rule will not only impact the right to trade and business of SSMIs but will also severely impinge on the privacy and anonymity of citizens, as well as on the right to speech and expression. It will be particularly challenging for FOSS services and volunteers, as they have limited resources to incorporate these provisions into their systems.
The first proviso to Rule 4(4) mandates that the measures taken by the intermediary be proportionate, having regard to the interests of free speech and expression and the privacy of users. This disregards the fact that automated filters are not yet sophisticated enough to differentiate between child sexual abuse material and journalistic reporting. For instance, Facebook's automated filters once took down the iconic “Napalm Girl” photograph taken during the Vietnam War, which showed a naked girl child fleeing a napalm attack; Facebook's algorithm construed it as violative of its community guidelines. Automated filtering would also import inherent societal biases into the system.
Conclusion
The Rules, 2021 adopt a one-size-fits-all approach which puts the future of not-for-profit social media intermediaries and FOSS intermediaries in jeopardy. Instead, there must be a case-by-case approach to addressing the challenges the Rules intend to curb. FOSS services and small not-for-profit services do not have the resources to incorporate and adhere to the provisions laid down in the Rules, 2021. The Rules, 2021 do not draw any intelligible differentia between for-profit proprietary services and not-for-profit federated FOSS services.
A robust stakeholder consultation mechanism would have helped in addressing these challenges. Unfortunately, the Rules, 2021 were notified without adequate consultation and therefore suffer from major flaws which encroach on the fundamental rights to trade and profession, privacy, and speech and expression.