A perfect storm is brewing in the world of Internet platforms: in the last 10 days, Pavel Durov, the founder of messaging app Telegram, has been arrested in France; in Brazil, a judge has threatened to block X (formerly Twitter) after it refused to comply with orders; and in the middle of a hotly contested election in the US, Meta founder Mark Zuckerberg has disclosed government pressure during the Biden-Harris administration to censor content.
Durov’s arrest is for running an online platform that allows users to conduct illicit transactions, including drug trafficking, money laundering and fraud, and to share images of child sex abuse, as well as for refusing to share information with authorities.
In Brazil, a Supreme Court judge has threatened to suspend X after the platform removed its legal representative tasked with handling judicial orders. This follows Judge de Moraes’ directive to remove accounts allegedly linked to far-right activists and Bolsonaro supporters. X owner Elon Musk, who has labelled de Moraes a “Brazilian Darth Vader,” is pushing back, citing freedom of speech. Telegram was similarly threatened with blocking in Brazil earlier, following which it hired a local representative, and WhatsApp has been blocked in the country in the past.
In the US, Zuckerberg has written about pressure from US officials to censor certain COVID-19 posts, including satire, and has admitted that Meta wrongly demoted a New York Post story about Hunter Biden before the 2020 election.
These developments highlight the tension around “Safe Harbor”, the legal principle under which platforms are treated as mere messengers and absolved of responsibility for the activities of their users. This creates challenges for governments, which cite the use of platforms for illegal activity, including terrorism, the spread of child sexual abuse material and violence against women, along with their own inability to conduct surveillance and gather evidence, as reasons for disempowering platforms and enabling surveillance of our private messages. Platforms are also rife with disinformation and the coordinated slow-poisoning of society. However, illegal activity typically forms a small minority of the billions of messages and pieces of content that platforms carry every day.
This is further complicated by jurisdictional anomalies between countries. What is free speech in one country isn’t in another: it might be illegal to support the LGBTQI movement in Saudi Arabia, but not in India. All governments operate on a spectrum of authoritarianism, and in some countries courts, even Supreme Courts, are politically compromised. The freedom to conduct and coordinate political activities is just as important as being able to discuss a medical problem with a loved one over WhatsApp or share your location with a friend, privately. At the same time, we run the risk of platforms acting politically, whether by choice or under government pressure, to surveil users or suppress certain speech. It is clear that platform decisions are susceptible to pressure, and that platforms may not always act in the best interest of society.
Governments, including in the EU, the US and India, with its IT Rules 2021 and the proposed Digital India Act, are empowering themselves against platforms. There has also been judicial overreach, with courts in India and New Zealand ordering the blocking of content extraterritorially. In the balance of power between governments, platforms and users, the equation keeps shifting between governments and platforms, with users left with little power because neither party is accountable to them. In India, for example, the government enforces secretive censorship with impunity under Section 69A of the IT Act, accountable only to itself.
Just as we cannot risk giving platforms too much power, we also cannot allow governments too much power over them. The carriers of our speech need to be protected as enablers of speech, but we also need protection from their unilateral actions. This becomes tricky, especially in the US, where the Supreme Court has pushed back against attempts at regulating platforms’ ability to moderate and prioritise content, saying that these actions may be protected as an exercise of free speech.
There is no single or comprehensive solution to this problem: the balance of power will keep shifting between governments and platforms. There will be unintended consequences, and calculated omissions and commissions, from both. To empower users and tilt the balance of power towards them, we need to regulate those with power. Establishing principles at a global level might help address jurisdictional overreach and ring-fence decisions by courts. There needs to be a rules-based and unambiguous approach to regulating speech on platforms, rooted in transparency and accountability, to close the gap between the responsibility and accountability of both platforms and governments.
Given the billions of users on online platforms, there will be more moments of reckoning for freedom of speech: we need to ensure that they aren’t used to disempower legal speech or undermine our privacy.
An edited version of this op-ed was published in The Times of India.