
Telegram App CEO Pavel Durov Arrested: A Warning Sign for Global Tech Giants

The arrest may be a warning sign for global tech giants, which are blamed for failing to regulate their social media platforms while also being accused of censoring free speech.
Pavel Durov, Telegram Co-Founder and CEO. Photo: Wikimedia Commons/TechCrunch/CC BY 2.0

When Pavel Durov arrived in France on his private jet last Saturday, he was greeted by police who promptly arrested him. As the founder of the direct messaging platform Telegram, he was accused of facilitating the widespread crimes committed on it.


The following day, a French judge extended Durov’s initial period of detention, allowing police to detain him for up to 96 hours.

Telegram has rejected the allegations against Durov. In a statement, the company said: “It is absurd to claim that a platform or its owner are responsible for abuse of that platform”.

The case may have far-reaching international implications, not just for Telegram but for other global technology giants as well.

Who is Pavel Durov?

Born in Russia in 1984, Pavel Durov also has French citizenship. This might explain why he felt free to travel despite his app’s role in the Russia-Ukraine War and its widespread use by extremist groups and criminals more generally.

Durov started an earlier social media site, VKontakte, in 2006, which remains very popular in Russia. However, a dispute over how the site's new owners were operating it led him to leave the company in 2014.

It was shortly before this that Durov created Telegram. This platform provides both the means for communication and exchange as well as the protection of encryption that makes crimes harder to track and tackle than ever before. But that same protection also enables people to resist authoritarian governments that seek to prevent dissent or protest.

Durov also has connections with famed tech figures Elon Musk and Mark Zuckerberg, and enjoys broad support in the vocally libertarian tech community. But his platform is no stranger to legal challenges – even in his birth country.

An odd target

Pavel Durov is in some ways an odd target for French authorities.

Meta’s WhatsApp messenger app is also encrypted and boasts three times as many users, while X’s tolerance of hate speech and other problematic content is unrepentantly public and increasingly widespread.

There is also no suggestion that Durov himself was involved in creating any illegal content. Instead, he is accused of indirectly facilitating illegal content by maintaining the app in the first place.

However, Durov’s unique background might go some way to suggest why he was taken in.

Unlike other major tech players, he lacks US citizenship. He hails from a country with a chequered record of internet control – and a diminished diplomatic standing globally thanks to its war against Ukraine.

His app is large enough to be a global presence. But simultaneously it is not large enough to have the limitless legal resources of major players such as Meta.

Combined, these factors make him a more accessible target to test the enforcement of expanding regulatory frameworks.

A question of moderation

Durov’s arrest marks another act in the often confusing and contradictory negotiation of how much responsibility platforms shoulder for the content on their sites.

These platforms, which include direct messaging platforms such as Telegram and WhatsApp but also broader services such as those offered by Meta’s Facebook and Musk’s X, operate across the globe.

As such, they contend with a wide variety of legal environments.

This means any restriction put on a platform ultimately affects its services everywhere in the world – complicating and frequently preventing regulation.

On one side, there is a push to either hold the platforms responsible for illegal content or to provide details on the users that post it.

In Russia, Telegram itself came under pressure to provide the names of protesters using the app to organise against the war on Ukraine.

Conversely, freedom of speech advocates have fought against users being banned from platforms. Meanwhile, political commentators cry foul at being “censored” for their political views.


These contradictions make regulation difficult to craft, while the platforms’ global nature makes enforcement a daunting challenge. This challenge tends to play in platforms’ favour, as they can exercise a relatively strong sense of platform sovereignty in how they decide to operate and develop.

But these complications can obscure the ways platforms can operate directly as deliberate influencers of public opinion and even publishers of their own content.

To take one example, both Google and Facebook took advantage of their central place in the information economy to advertise politically orientated content to resist the development and implementation of Australia’s News Media Bargaining Code.

The platforms’ construction also directly influences what content can appear and what content is recommended – and hate speech can mark an opportunity for clicks and screen time.

Now, pressure is increasing to hold platforms responsible for how they moderate their users and content. In Europe, recent regulation such as the Media Freedom Act aims to prevent platforms from arbitrarily deleting or banning news producers and their content, while the Digital Services Act requires that these platforms provide mechanisms for removing illegal material.

Australia has its own Online Safety Act to prevent harms through platforms, though the recent case involving X reveals that its capacity may be quite limited.

The European Union is making content moderation the responsibility of tech platforms. (Representative Image Via Wikimedia Commons/Håkan Dahlström/CC BY 2.0)

Future implications

Durov is currently only being detained, and it remains to be seen what, if anything, will happen to him in the coming days.

But if he is charged and successfully prosecuted, it could lay the groundwork for France to take wider actions against not only tech platforms, but also their owners. It could also embolden nations around the world – in the West and beyond – to undertake their own investigations.

In turn, it may also make tech platforms think far more seriously about the criminal content they host.

Timothy Koskie is a Post-Doctoral Associate for the Mediated Trust project in the School of Media and Communications at the University of Sydney. Prior to that, he worked with the UTS Centre for Media Transition on projects including the Valuing News and Wikihistories Discovery projects and the implications of generative AI for knowledge integrity on Wikipedia.

This article is republished from The Conversation under a Creative Commons license. Read the original article.
