Assessing the proposed amendments to the ICT Act to regulate social media in democratic Mauritius

CHRISTINA CHAN-MEETOO

Senior Lecturer in Media and Communication

Head of Mediacom Studio

University of Mauritius

On 14 April 2021, the Information and Communication Technologies Authority (ICTA) released a Consultation Paper on proposed amendments to the ICT Act for regulating the use and addressing the abuse and misuse of Social Media in Mauritius.

What are the proposed amendments to the ICT Act?

ICTA is proposing the creation of:

  • a National Digital Ethics Committee (NDEC) as the decision-making body on content. The key mandate of the NDEC would be to "investigate on illegal and harmful content on its own or through interaction with other stakeholders already involved in national security, crime investigation, detection and prevention or through complaints received";
  • a Technical Enforcement Unit to enforce the technical measures as directed by the NDEC;
  • a technical toolset to be deployed, whereby internet users in Mauritius would have to use a local proxy server as an intermediary and would be asked to install a self-signed digital certificate allowing the proxy to be accepted as an intermediary by the browser or the app, thereby bypassing the HTTPS protocol (see the sketch after this list for why the certificate installation is required). The stated objective is to allow the proxy server to:
    ◦ segregate traffic to and from social media sites,
    ◦ decrypt such information,
    ◦ analyse and store data for investigation purposes.
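
To see why the certificate installation is central to the proposed toolset, here is a minimal Python sketch (my own illustration, not taken from the consultation paper; the hostname and the "proxy-ca.pem" file name are purely hypothetical). An HTTPS client verifies the server's certificate against its trust store, so an intercepting proxy presenting a self-signed certificate is rejected unless that certificate has been installed locally.

```python
# Illustrative sketch (assumptions noted above): an HTTPS client only accepts an
# intermediary whose certificate chains to a certificate in its trust store,
# which is why the proposed toolset asks users to install a self-signed certificate.
import socket
import ssl

HOST = "www.facebook.com"   # example social media endpoint (illustrative)
PORT = 443

# Default context: trusts only the public certificate authorities shipped with the OS.
context = ssl.create_default_context()

try:
    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            cert = tls.getpeercert()
            issuer = dict(pair[0] for pair in cert["issuer"])
            print("TLS verified; certificate issued by:", issuer.get("organizationName"))
except ssl.SSLCertVerificationError as err:
    # This is what would happen behind an intercepting proxy presenting its own
    # self-signed certificate, unless that certificate were added to the trust
    # store (e.g. via context.load_verify_locations("proxy-ca.pem")).
    print("Certificate verification failed; possible interception:", err)
```

In other words, the proxy can only decrypt traffic from devices whose users have installed the certificate, which is precisely what the consultation paper asks internet users in Mauritius to do.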

About the context for the proposed amendments

The stated objective is to combat illegal and harmful content without depending on international social media companies, which, according to the authorities, are not sufficiently responsive to requests. For instance, ICTA says that Facebook takes too much time to respond, that community standards are not as strict as our domestic laws, and that platform moderators do not understand Mauritian Creole. The paper states that a "minority of individuals or organised groups" are at fault and that "The issue at hand is when these abuses, even though perpetrated by few individuals/groups, go viral, the damage created is very far reaching."

The consultation paper includes a table with the number of incidents reported to the Mauritian Cybercrime Online Reporting System (MAUCORS). However, more detailed information is needed to gauge the "very far reaching damage" created by "illegal and harmful" content, and to assess whether existing mechanisms could be reinforced instead of resorting to the proposed technical toolset. We need an objective evaluation of the proportionality of the proposed mechanism with respect to the extent of the problem to be addressed. The key question is: how big, really, is the problem of misuse and abuse of social media in the country, such that it warrants such far-reaching measures?

What does the adjective "harmful" mean?

Should such a proposal be implemented, the adjective "harmful" should be strictly defined, listing all types of content classified as such and the unambiguous criteria to be used, so that the "rules of the game" are clearly spelt out in advance. Otherwise, there is an inherent risk of abuse and misuse by the authorities and parties involved in the mechanism. The term cannot be left open to interpretation, bearing in mind that the principle of potential alternation ("alternance") of parties in power is still valid in Mauritius and that petty power games are therefore always a risk.

Potential conflict with Section 12 of the Constitution

More importantly, the main issue with this proposal is a potential conflict with Section 12 of the Constitution, as highlighted in the paper itself.

This section states that there can be no interference with our communications except with our consent, and except under specific provisions concerning public defence, safety, order, morality and health, reputation, privacy, the authority of the courts, the technical administration of telecommunications and restrictions on public officers. BUT the last line puts everything in perspective: such provisions must be "reasonably justifiable in a democratic society". Unfortunately, the proposed amendments to the ICT Act do not seem reasonably justifiable in a democratic society.

The proposal is tantamount to blanket surveillance of the citizens of Mauritius without judicial oversight. No warrant would be needed to spy on specific individuals: ALL citizens' communications would be interfered with in a blanket manner. The proportionality test is crucial here to determine whether such a surveillance system is really warranted in a democratic society like ours. Does it pass the test of balancing risks against benefits? Fundamentally, the proposal seems to contradict the philosophy of the Data Protection Act, adopted in 2017 and modelled on the EU GDPR, which aims to protect citizens' personal data.

How are other democratic states trying to tackle abuse and misuse of social media?

We do not know of any other democratic state using a system such as the one proposed by the ICTA. This would be a world first, in the negative sense of the term. There is thus a high probability that Mauritius would tumble down world freedom rankings.

In all the democratic states cited in the consultation paper, the responsibility for moderation and removal of illegal content remains vested with the social media platforms.

In Germany, the Network Enforcement Act has forced platforms such as Facebook to employ large numbers of moderators for the country, and the law provides that the relevant data may be stored by the platforms for no longer than 10 weeks.

In the UK, Ofcom has been tasked with the regulation of social media, and its Chief Executive states on the official website that “We won’t censor the web or social media. Free expression is the lifeblood of the internet and it’s central to our democracy, values and modern society.” and “We won’t be responsible for regulating or moderating individual pieces of online content. The Government’s intention is that online platforms should have appropriate systems and processes in place to protect users; and that Ofcom should take action against them if they fall short. We’ll focus particular attention on tackling the most serious harms, including illegal content and harms affecting children.”

India has recently introduced new Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, which require "significant" social media intermediaries to have local representation, including compliance and grievance officers, and to publish regular reports on their compliance with a Code of Ethics.

The key takeaway is that state agencies in democratic countries are not endowing themselves with the technical capacity to interfere with, intercept and remove content on social media, in contrast with the Mauritian ICTA proposal.

One can assume that the mechanism for intercepting, decrypting and re-encrypting social media traffic requires considerable technical capability, given the enormous amounts of data (photos, videos, livestreams, etc.) generated daily. It is also very much a moving target, because nothing prevents a social media network from changing implementation details in ways that render the system ineffective. Moreover, anyone can bypass the system simply by using a Virtual Private Network (VPN).
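
As one concrete illustration of such an implementation detail (my own example; certificate pinning is not discussed in the consultation paper): a platform's app can ship with the expected fingerprint of its servers' certificates and refuse any connection presenting a different one. Since an intercepting proxy must substitute its own certificate, the pin no longer matches and interception becomes ineffective for that app. A minimal Python sketch, with a placeholder pin value:

```python
# Illustrative sketch of certificate pinning, one implementation change that
# defeats interception. The pinned fingerprint is a placeholder, not a real value.
import hashlib
import socket
import ssl

HOST = "www.facebook.com"                 # example endpoint (illustrative)
PINNED_SHA256 = "0000placeholder0000"     # fingerprint the app would ship with

context = ssl.create_default_context()
with socket.create_connection((HOST, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        # Fingerprint of the certificate actually presented on this connection.
        der_cert = tls.getpeercert(binary_form=True)
        fingerprint = hashlib.sha256(der_cert).hexdigest()

# Behind an intercepting proxy, the presented certificate is the proxy's own,
# so the fingerprint differs from the pin and the app would abort the connection.
if fingerprint != PINNED_SHA256:
    print("Certificate does not match the pinned fingerprint; connection refused.")
else:
    print("Certificate matches the pin; connection trusted.")
```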

Other key questions posed by the proposed amendments

  • Will all social media be concerned? Will there be an official list with regular updates?
  • Would the filtering, analysis and classification be done by humans or by software? If by software, the code needs to be open to scrutiny.
  • What about the potential risk of the middleware being tampered with (by subcontractors, for example)?
  • What will be the composition of the National Digital Ethics Committee? What criteria will be used to designate the chairperson and the members?
  • What will be their expected volume of work (given the enormous amounts of content created every day)?
  • What objective criteria will be used for assessing content?
  • Will there be hearings and the possibility of appeal when an individual or group is suspected of creating and disseminating "illegal and harmful content"?
  • What will be the safeguards against misuse and abuse of the mechanism?
  • Is there sufficient technical capacity to handle the volume of data generated and shared on social media at the national level? What about potential issues with quality of service on the network, which may affect the digital sector of the economy as well as all operations of non-digital firms that use digital tools to effect transactions, promote their products and services, and handle internal processes and information?
  • Will the digital certificate installation be compulsory? How? What if people refuse? What if people resort to VPNs?
  • Have other solutions been explored? How far? Could that analysis be made public? How about strengthening local investigation capacity?

What are potential alternatives to the proposed amendments?

  • The last question in the paper asks citizens whether they think local courts of justice should be empowered to "impose sentences (which include banning use of social media) on persons convicted of offences relating to misuse of social media tools". This may indeed be explored.
  • Preventive measures such as education, sensitisation and digital literacy remain crucial to creating a more civic space online.
  • Partnerships at the regional level (e.g. AU, SADC) should also be explored to negotiate with the social media platforms in order to improve their responsiveness to requests from local authorities.

Conclusion

It is a positive sign that this is still at the public consultation stage, inviting inputs from Mauritian citizens as individuals or groups, which will hopefully shape the final decision on whether or not to amend the legislation. In the same spirit, the next phase of the process should be transparent: a public report should be issued on the inputs received and on how they are addressed by the ICTA and any other authority involved in the process.

On the basis of the information provided so far, the proposed amendments do not appear proportionate to the problem they are meant to tackle.
