Within the past three years, Telegram has become the most important platform for conspiracy ideologues and right-wing extremists in German-speaking countries. Using this messaging service, numerous channels reach hundreds of thousands of people every day – and for some messages, the number of views is in the millions.
The messaging service was used, for example, to organize the protests of the so-called “Querdenken” movement during the COVID-19 pandemic and to spread conspiracy narratives about vaccination. The pro-Russian disinformation shared on the platform exerts deep influence on political discourse and attitudes far beyond the boundaries of online chats. Plans for coups and terrorist acts have been openly discussed, and antisemitic content disseminated en masse.
Several “alternative media” outlets regularly reach six-figure audiences via Telegram, allowing them to fund ever larger editorial teams. State sanctions are circumvented by providing direct financial support to pro-Russian disinformation influencers via Telegram. Currently, Telegram and its founder Pavel Durov are expanding these possibilities even further: Telegram will be used in the long term not only for dissemination, but also for international financing of disinformation and conspiracy ideologies.
The belief persists that Telegram does not respond to government requests and does not delete any content. This is a misconception. Past incidents have shown that Telegram has in fact responded when the pressure became too great – for example, when fears emerged of a potential removal from Google’s and Apple’s app stores – but its actions have so far been haphazard and unsystematic.
How did Telegram become the most important platform for conspiracy ideologues and right-wing extremists within such a short period of time? Why has the platform been able to evade almost every attempt at regulation so far – and when has it been forced to act after all? Is it that Telegram cannot, or will not, do anything about the hate on its platform? Who are currently the most relevant actors on the platform and how many people do they reach? What can we expect from the messaging service in the future?
In this new CeMAS Digital Report, we drill down into these questions.
Communication on Telegram takes place via direct messages, but also through so-called channels and groups. These channels and groups have played a major role in the networking of the conspiracy-ideological and far-right scene. Channels can be subscribed to by Telegram users. If the channel operator posts a message, all subscribers are notified directly. Communication here is largely unidirectional, although some channels enable commenting on posts or reacting to them via emojis. A channel’s size is unlimited.
In addition to channels, communication also takes place in groups. In groups, users can chat with other users and also share messages (e.g. from channels). Many of the groups on Telegram have a regional connection. Groups have a maximum size of 200,000 members. Unlike other social networks, Telegram does not use a so-called newsfeed, i.e. a start page tailored to the individual user which aggregates the posts of the channels. Telegram only shows the most recent messages in channels and groups in chronological order. Channel operators are therefore particularly dependent on their messages being discussed frequently and promptly in many other groups and shared via other channels.
This creates a messaging system with unique characteristics when it comes to content moderation and deletion: if a channel is deleted, there may still be many groups in which the content of a successor channel is shared. These particular features of Telegram lead the Identitarian movement to create a decentralized network of local groups on the platform in 2018. This approach is later adopted by other actors.
In 2018 and 2019, the milieu’s German-language channels grow slowly but steadily. Oliver Janich and a channel of the Identitarian Movement reach about 40,000 subscribers by the end of 2019 - they are by far the widest-reaching channels of this period. In December 2019, Janich’s messages reach about 13,700 views per message.
2020: During the pandemic, the reach on Telegram explodes.
With the onset of the COVID-19 pandemic, groups quickly emerge on established social networks, such as Facebook, in which the pandemic is reinterpreted in conspiracy ideology terms: the virus is denied, containment measures are rejected, and a personified image of the enemy is constructed around the effects of the pandemic. Many platforms come under pressure to curb disinformation and conspiracy narratives on their networks.
Following a similar principle to that of the Identitarian movement, on April 24, 2020, the Facebook group Corona Rebellen (“Corona Rebels”), which at the time had around 60,000 members, decides to set up local and decentralized groups on Telegram. In doing so, they also aim to pre-empt a possible ban by Facebook.
The Telegram groups and channels have names like Corona Rebellen Düsseldorf, Nicht ohne uns Wesel (“Not Without Us - Wesel”) and Querdenken 711 Stuttgart.
With the onset of the pandemic, a large part of the conspiracy-ideological and far-right scene concentrates its communication on Telegram. Hundreds of channels and groups are created, and channels that were already present on the platform before the pandemic suddenly reach hundreds of thousands of users.
2021: New “alternative” media outlets establish themselves on the scene.
As the pandemic progresses, numerous “alternative media” portals are founded in which permanent editorial teams regularly publish new and professional-looking content. Based on their own statements, they often finance themselves through donations from the scene, but also through product and advertising placements on associated websites and Telegram channels.
Wochenblick.at, Infodirect, AUF1, Boris Reitschuster and RT DE become permanent fixtures on the conspiracy-ideological scene. In 2021, many of these “alternative media” have managed to build up a large audience and use it to spread numerous conspiracy narratives and disinformation.
2022: Year of the Pro-Russian disinformation influencer
The start of Russia’s war of aggression in Ukraine in February 2022 also changes the information landscape on Telegram.
In just three years, Telegram has become one of the most important platforms for the conspiracy-ideological and far-right milieu. During the pandemic, but also beyond, various actors were able to build up a network on the platform in which they regularly reach six-figure view counts. This large audience, but also the platform’s numerous functions and extensions, led to the professionalization of many of the actors presented here. Through their dissemination of conspiracy narratives, disinformation, right-wing extremism and antisemitism, they are now able to raise enough money through advertising or donations to pay editorial staff in some cases.
A conspiracy-ideological and far-right universe has formed on Telegram.
Contrary to the hopes of right-wing extremists and conspiracy ideologues, moderation also takes place on Telegram in the form of restrictions and limitations on accounts, channels and groups – albeit to a much lesser extent than on other social networks.
Telegram founder Pavel Durov has often implied that one of Telegram’s founding ideas was to create a platform that would never respond to government requests – a platform that would hold freedom of speech as its highest principle.
This principle has collided with reality more often than not: the company’s history shows that Telegram has in fact responded to requests and ban inquiries – Telegram has come to the table in several countries, even with autocracies.
The platform has resorted to various content moderation options in the past.
Measures range from blocking channels on certain devices (such as Apple devices), to restricting access from certain regions (e.g. blocking European phone numbers), to deleting accounts, content, channels, and groups altogether.
During the pandemic, the platform came under increasing pressure over criticism that it was doing nothing about the hate and the numerous calls for violence it hosted – pressure that may have led Telegram to take stronger action on its platform after all.
Telegram’s biggest fear is being removed from the app stores. For this reason, Telegram blocks channels and groups on Apple devices particularly quickly if pornographic content is shown or copyrights are violated.
In the case of antisemitic and violent statements, on the other hand, the company often acts only haphazardly and unsystematically.
This is particularly obvious in the case of the antisemite Attila Hildmann.
The former cookbook author was a frequent guest on cooking shows and entertainment television before the pandemic. With its onset, he spread numerous conspiracy narratives and increasingly used explicitly antisemitic vocabulary. These activities received growing coverage in the German media.
He was able to avoid a planned arrest in February 2021 because he fled to Turkey in December 2020. Since then, he has been spreading his antisemitic posts from there.
Artificial Intelligence for the detection of antisemitic text content in German Telegram messages
To allow for a faster detection and large-scale analysis of antisemitism in German Telegram messages, CeMAS collaborated with Rewire to develop an artificial intelligence (AI) tool for automatic antisemitic text content detection. In this chapter we describe the development process and investigate the strengths and weaknesses of the AI model.
The foundation: neural networks
The foundation of the AI model is a state-of-the-art neural language model for German. This model was pre-trained for general language use with the help of multiple billions of words from publicly available text data, and subsequently fine-tuned to distinguish between antisemitic and not antisemitic text content with custom training samples.
Step 1: Collecting training data
The training data for the fine-tuning step was collected from German Telegram channels and group chats, which had been selected by CeMAS for potentially antisemitic content. From these resources we created two data sets: one with a random selection of messages from the years 2020-2022, and one with messages that contain specific search terms.
The search terms used for the second sampling technique represent three different types of content:
- antisemitic slurs
- terms and expressions often used in the context of or associated with antisemitic narratives (e.g. ‘Gelber Stern’ (yellow star), ‘Konzentrationslager’ (concentration camp), ‘Rothschild’, ‘Zionist’), or
- terms and expressions from the context of Jewish culture, history or religion (e.g. ‘Bar Mizwa’, ‘Heiliges Land’ (Holy Land), ‘Jüdin’ (Jew [female]), ‘Shalom’)
While the randomly selected messages predominantly contain domain-specific but thematically irrelevant and innocuous content, the search results were expected to have a higher likelihood of containing antisemitic and other thematically relevant content. When training an AI model, it is important to present both the antisemitic messages to be detected and a representative sample of the distribution of the input data. We also manually constructed contrasting counter-examples representing the innocuous use of thematically relevant terms and expressions.
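The two-track sampling described above can be sketched as follows. The search terms and message pool here are illustrative assumptions for demonstration purposes, not the actual term lists or data used by CeMAS:

```python
import random

# Illustrative search terms from the three categories described above
# (the full term lists used in the project are not public).
SEARCH_TERMS = ["rothschild", "zionist", "gelber stern", "shalom", "bar mizwa"]

def sample_messages(messages, n_random, seed=42):
    """Draw a random sample plus all keyword matches from a message pool."""
    rng = random.Random(seed)
    random_set = rng.sample(messages, min(n_random, len(messages)))
    keyword_set = [m for m in messages
                   if any(t in m.lower() for t in SEARCH_TERMS)]
    return random_set, keyword_set

# Hypothetical example messages
messages = [
    "Treffen morgen um 18 Uhr",        # innocuous, no keyword match
    "Die Rothschilds steuern alles",   # keyword match, antisemitic narrative
    "Shalom und ein schoenes Fest",    # keyword match, innocuous use
]
rand, hits = sample_messages(messages, n_random=2)
print(len(hits))  # 2 keyword matches
```

Note that the third message illustrates why counter-examples matter: a keyword match alone says nothing about whether the use is antisemitic.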
All training samples were automatically pre-processed and split into text segments of a maximum length of 128 words. This simplifies the annotation process and improves processing by the classification model.
Step 2: annotation of the training data
A total of over 5,000 text samples were analysed and labelled for potential antisemitic content by a number of expert annotators. Annotations were based on the working definition of antisemitism developed by the International Holocaust Remembrance Alliance (IHRA), which was revised and extended with example texts by CeMAS and Rewire.
We carefully monitored the annotation process to guarantee a consistent and high-quality annotation of the training data. The quality of the training data has a direct impact on the AI model’s accuracy in distinguishing antisemitic from not antisemitic content.
Step 3: model training
We augmented the annotated data with additional samples from a selection of pre-labelled resources, and then shuffled and split the resulting data into four data sets for model development.
The largest portion of samples was assigned to a training set which was used to fine-tune the German language model for distinguishing antisemitic from not antisemitic content. Put simply, in this step the AI model is repeatedly shown text segments with their corresponding label (antisemitic or not antisemitic). Through this process, the model learns to weigh the relevance of certain text features – and ultimately to predict the correct label for a given input. In predicting labels, the neural model relies not only on word-level features, but incorporates the entire semantic context of an input segment.
Besides the training set, we distributed the remaining samples into smaller validation, evaluation and test sets. As indicated by their names, these data sets are used during the development and testing of the model, or for a final evaluation of the AI classifier.
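The shuffle-and-split step can be sketched like this. The 70/10/10/10 proportions are an illustrative assumption; the report only states the size of the held-back evaluation data:

```python
import random

def make_splits(samples, fractions=(0.7, 0.1, 0.1, 0.1), seed=0):
    """Shuffle and partition samples into train/validation/evaluation/test.

    The proportions are illustrative, not the project's actual split sizes.
    """
    rng = random.Random(seed)
    data = samples[:]
    rng.shuffle(data)
    splits, start = {}, 0
    for name, frac in zip(("train", "validation", "evaluation", "test"),
                          fractions):
        end = start + int(frac * len(data))
        splits[name] = data[start:end]
        start = end
    splits["test"] += data[start:]  # remainder goes to the last split
    return splits

splits = make_splits(list(range(1000)))
print({k: len(v) for k, v in splits.items()})
# {'train': 700, 'validation': 100, 'evaluation': 100, 'test': 100}
```

Shuffling before splitting is important so that each split reflects the same mix of random and keyword-sampled messages.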
Step 4: model evaluation
To evaluate the AI model, we held back a total of 700 annotated text segments. 400 of these were drawn randomly from our Telegram resource, and 300 were selected by keyword search. 220 (31.4%) of the evaluation samples were labelled as antisemitic by our annotators.
The developed model has a detection rate (or recall) of 86.4% of these antisemitic samples, at a precision of 82.3%. Together, this produces an accuracy of 89.9% on the evaluation set (macro-averaged F1 of 88.4 points).
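The published figures are internally consistent: they correspond to a confusion matrix of roughly 190 true positives, 41 false positives, 30 false negatives and 439 true negatives. This matrix is our reconstruction from the reported numbers, not stated in the report. The following sketch recomputes the metrics from it:

```python
# Reconstructed confusion matrix (assumption: inferred from the reported
# 700 evaluation samples, 220 positives, recall 86.4% and precision 82.3%).
tp, fp, fn, tn = 190, 41, 30, 439

recall = tp / (tp + fn)                      # 190/220
precision = tp / (tp + fp)                   # 190/231
accuracy = (tp + tn) / (tp + fp + fn + tn)   # 629/700

# Macro-averaged F1: mean of the per-class F1 scores
f1_pos = 2 * tp / (2 * tp + fp + fn)
f1_neg = 2 * tn / (2 * tn + fn + fp)
macro_f1 = (f1_pos + f1_neg) / 2

print(round(recall * 100, 1))     # 86.4
print(round(precision * 100, 1))  # 82.3
print(round(accuracy * 100, 1))   # 89.9
print(round(macro_f1 * 100, 1))   # 88.4
```

Macro-averaging matters here because the classes are imbalanced (31.4% positive): it weights the antisemitic and not antisemitic classes equally rather than by frequency.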
Strengths and weaknesses of the AI model
A manual inspection of the test data revealed that one of the most common error modes of our AI model is the incorrect classification of messages containing conspiracy theories related to topics like the New World Order (NWO), Kabbalah and Satanism. Here the classifier often assigns the label antisemitic even when this does not correspond to the definition of antisemitism detailed in our annotation guidelines.
A second error mode – very typical for a neural model like this – is the incorrect classification of extremely rare content that was not represented in the training data. The neologism Coronacaust, for example, is clearly antisemitic according to our annotation guidelines, but was incorrectly labelled as not antisemitic by the model. To address this limitation, the AI could be adjusted with so-called adversarial samples in future projects.
The antisemitism and calls for violence in Hildmann’s groups – as in many other places on the platform – can hardly be denied.
Using a machine-learning approach developed in partnership with Rewire, CeMAS was able to automatically capture thousands of antisemitic messages in tens of thousands of comments posted in Hildmann’s groups.
Telegram could also use such methods to identify channels and groups that host calls for violence and group-based misanthropy, and then review them. Its systems would need to detect when successor channels are created following bans.
Telegram does react. However, it only does so when the pressure from outside is great enough and the attention on the hateful rhetoric being spread on the platform is enough to risk the Telegram app being removed from the app stores of Google and Apple.
In the past, this threat has led Telegram to at least partly abandon its usual strategy for government requests – ignoring them – and to take a public stance and even engage in measures such as content moderation. For a look at the company’s past and how Pavel Durov’s team has dealt with demands from state actors, see a detailed guest post by Darren Loucaides.
Who Runs Telegram?
Between Ideology and Pragmatism as a Core Element of Corporate Strategy.
With a user base of 700 million, Telegram has an enormous responsibility as a global messaging service that also functions as a social network. Every platform that approaches Telegram’s scale can be expected to establish measures to curb problematic content and developments on its platform—be it spam, copyright infringements, or dangerous calls to violence. In numerous jurisdictions, accusations have been leveled at Telegram over the years for not living up to this responsibility.
Telegram espouses free speech and eschews what it calls censorship by governments and authorities. This appears to be borne out by a relatively light approach to moderation. Whether due to being thinly staffed, for ideological reasons, or both, Telegram tends to be slow to respond to accusations and to take down content that its own terms of service define as illegal. Telegram has also been confronted by the highest courts in several jurisdictions; on more than one occasion, it has not taken action to curb illegal content until governments and supreme courts threatened bans.
This chapter will provide an overview of how Telegram has dealt with requests from authorities in selected jurisdictions. It will then look at why Telegram’s response is the way it is, and whether this is likely to change. To understand this last point, we have to understand what Telegram is really like as an organisation—something that is not, as will become apparent, particularly easy to understand.
The team
Telegram was launched in 2013 by Pavel Durov alongside his brother Nikolai, an award-winning mathematician and computer scientist. In 2006, Durov had cofounded VKontakte, a Facebook lookalike catering predominantly to Russian-speaking markets. When he lost control of the company, he claimed that he had been forced out by Kremlin allies. Since then, Durov has portrayed himself as an opponent of authoritarian governments and a defender of digital privacy and security. He describes himself as a libertarian who passionately believes in free speech and in skirting censorship. Telegram’s founding story revolves around the idea of a messaging service that ensures both free speech and security from state surveillance and oppression. But Durov’s position vis-à-vis the authorities starts to look more ambivalent when examined up close. It seems, in reality, to depend on a case-by-case determination.
Durov, 38, was born in St Petersburg, Russia’s second city. Almost since launching Telegram, Durov has portrayed himself as an exile from Russia, forced to leave his home country under pressure from the Kremlin. But sources say he frequented Telegram’s former offices in St Petersburg until at least 2017, and that there was nothing preventing Durov coming and going from Russia, for example to visit family. Durov and Telegram are now officially based in Dubai, and Durov appears to have good relations with the autocratic rulers of the United Arab Emirates.
Durov hasn’t made a public appearance for several years, and his last interview to western media was in 2017. Durov favours communicating via his public channel on Telegram. Partly as a result, the running of Telegram is for all intents and purposes a black box. Staff members never talk to journalists, except for occasional comments by the vice-president and the company spokesperson. Little is known about the core team of developers, though all are believed to be Russian speakers. Telegram states that there are currently about 60 employees. It is not clear how many employees are dedicated to moderating content.
Patchy record
Telegram’s record in tackling illegal content is patchy. While its rival Meta has come under greater public scrutiny from western media and legislators for moderation failings, Telegram has also been pressured by the authorities in multiple jurisdictions. The most high-profile recent examples are that of the German government threatening to ban Telegram over extreme content and terrorist plotting on the app, and Brazil’s supreme court suspending the app for not complying with requests, both of which are discussed below.
In Telegram’s terms of service, the company commits to taking down illegal pornography and calls to violence. Telegram also purports to prohibit copyrighted content and doxxing. But when it comes to responding to requests from authorities about dangerous content on the app, the way Telegram acts varies wildly. In terms of the company’s response to users reporting illegal content – which can be done by clicking on an offending post – Telegram is generally slow to take action according to researchers and civil groups tracking extremist activity on the platform.
In 2015, Telegram came under international pressure for ISIS’s use of the app, particularly in the wake of the Paris terror attacks. Until then, Telegram’s actions to limit Islamist terrorist activity on the platform were limited, but after the attacks Telegram banned 78 Islamic State channels and created an “ISIS Watch” bot account to record ongoing deletions. The seriousness the company tried to portray in the wake of the attacks was arguably undermined by Durov himself, however, when he posted a picture of his face photoshopped onto an image of a gun-wielding Islamist terrorist via his VK profile; this image is still the background image for Durov’s YouTube channel.
In 2017, the Indonesian government threatened to block access to Telegram due to its alleged use by Islamist terrorists. In response, Telegram belatedly pledged to establish moderation protocols to prevent this. The app was nevertheless banned in Indonesia – Telegram’s promises were too little, too late. In August that year, Durov flew to Indonesia and met with government ministers to secure the lifting of the ban. He appeared in a public press conference with the minister for communications and information technology, and apologised for Telegram’s failure to respond to government requests, while pledging to stamp out illegal content. It was a rare example of Durov appearing in person to show cooperation with authorities; it hasn’t occurred since.
By 2018, Telegram was still being criticised as a harbour for ISIS terrorists—UK prime minister Theresa May dedicated a section of her speech at Davos that year to criticising Telegram on that count, as well as for harbouring “criminals” and “paedophiles”. Telegram did begin to take greater steps towards cooperating with authorities and international law enforcement over ISIS’s use of the platform, and began coordinating with Europol.
The same year, Russia’s internet and telecoms regulator Roskomnadzor banned Telegram in the country for failing to cooperate in supposed cases of terrorism and extremism. Despite the ban, Telegram’s team ensured that the app remained in service for millions of Russians and the attempt to ban it was largely a failure; it seems the company has not expended the same efforts in other jurisdictions where Telegram was banned, for example in Iran, for reasons unknown. In 2020 Telegram was unbanned in Russia, with Roskomnadzor commending “the readiness expressed by Telegram’s founder to combat terrorism and extremism.” It is unclear what Telegram agreed to, if anything, to secure the lifting of the ban.
Durov often hails Telegram’s use by pro-democracy activists and underlines his company’s stated commitment to not cooperate with authoritarian governments. In the 2019-20 protests in Hong Kong, Telegram is not known to have cooperated with any government requests; the company even allowed users to cloak their telephone numbers to shield them from monitoring by the authorities. In the 2020-21 protests against the Russian-backed government in Belarus, Telegram was also used widely by pro-democracy activists and is not known to have cooperated with the authorities.
During the coronavirus pandemic, Telegram cooperated with governments across the globe to help disseminate official public health information. But the platform also became a hotbed of Covid deniers, antivaxxers and conspiracy theorists. The company does not believe it should be responsible for people expressing alternative views about Covid-19. While other platforms widely removed Covid disinformation, little seems to have been done to stop it spreading on Telegram, which might be one reason why Covid-sceptic movements tended to favour the platform in several countries and use it to organise.
This is precisely what happened in Germany during the pandemic, as described elsewhere in this report. With extremist and far-right activity hotting up on Telegram through 2021, German authorities threatened the company with fines and even a potential banning for failing to take down illegal content. Telegram finally began cooperating with the authorities in 2022—Durov even reportedly appeared via video call with federal law enforcement. According to Der Spiegel, Telegram subsequently provided the federal police in Germany with personal data of users suspected of terrorism, as well as child abuse.
India is another jurisdiction where Telegram receives periodic requests from the authorities—it is also home to Telegram’s largest user base, with 100 million users. In November 2022, the company provided a high court in India with the names, phone numbers, and IP addresses of users who were accused of illegally sharing a teacher’s copyrighted course materials via Telegram.
In Brazil, ahead of the 2022 presidential election, Telegram was widely used by far-right groups supportive of the incumbent President Jair Bolsonaro, who, echoing his ally US President Trump, had refused to promise a peaceful transition of power if he lost the election. In March 2022, after failing to receive any response from Telegram to its requests, Brazil’s supreme court ordered the suspension of the app. As in the Indonesian case, Telegram did then finally react: Durov blamed the problem on email shortcomings at the company, adding via his public Telegram channel: “I am certain that once a reliable channel of communication is established, we’ll be able to efficiently process takedown requests for public channels that are illegal in Brazil.” After communications were established, the supreme court in Brazil revoked the ban. In January 2023, in the wake of Bolsonaro’s defeat in the election, a mob of violent supporters stormed the presidential palace; many of the insurrectionists reportedly organised on Telegram, which has subsequently been fined by the authorities.
As can be seen, Telegram’s response to authorities’ requests varies from country to country. There is a fundamental tension between the company’s stated defence of user privacy and security, and the need to remove illegal content on the platform as requested by authorities in order to continue to operate in different jurisdictions. Digital researchers and civil groups who study Telegram believe that the company only responds as a last resort—the cases of Brazil and Germany are strong examples of this.
Telegram’s approach analysed
To reiterate, there are two points to note here about how Telegram handles requests from the authorities with regards to extremist content on the platform. Firstly, the company claims to have a commitment to free speech and user privacy, but in reality its position seems to be based more on realpolitik—it tends to employ a case-by-case approach rather than fall back on a generalised policy. For example, Telegram is adamant that it does not cooperate with the authorities in Russia and that it has never shared user data there (although civil groups and opposition activists have expressed concerns about the lack of transparency over how it responds to requests from the Russian authorities), but in Germany, Telegram has reportedly shared user data in cases of suspected terrorism; there seems to be no overriding policy on how to respond to such requests. Secondly, the company tends to respond only under extensive legal and public pressure from the authorities; and it tends to respond more robustly in nations with a large number of Telegram users.
Why does Telegram respond so inconsistently to requests from authorities, extremist activity and the reporting of illegal content on the platform? One reason could be that the app’s role in different countries requires it to strike a delicate balance. Easy cooperation with the authorities in one country might leave it open to accusations that it will cooperate with the authorities in another—including Russia, where Telegram has an uneasy, if ambiguous, position in relation to the government. Russian activists might point to enthusiastic cooperation with the authorities in Germany as a sign that it will cooperate anywhere, for example.
A more practical possible explanation is a lack of resources. Telegram claims to have about 60 employees—a tiny staff compared to Meta’s. Even Twitter, whose staff was recently culled after Elon Musk’s takeover, has an estimated 1,300 active employees. (Caveat: these are all different kinds of messaging and social-networking platforms, though each one bears similarities to Telegram.) Telegram does have a moderation team, but digital researchers say reporting illegal content rarely receives a quick response, if any. Russian opposition activists have resorted to mass reporting to push Telegram to remove illegal or dangerous content.
Why doesn’t Telegram employ more moderation staff? One reason could be that the vastly staffed moderation apparatus of Meta’s platforms is expensive to run, and Telegram has no reliable source of financing: Telegram is a private company, earning only a small amount of ad revenue, and is free to use despite the recent launch of a Premium service. It is possible that Telegram simply does not have the resources to tackle illegal content more systematically.
However, Telegram’s core developers are regarded by ex-Telegram staff as among the most skilled in the industry. It is plausible that they could come up with technical solutions to automatically weed out more illegal content than is currently the case. Although the feasibility of this cannot be independently verified, senior Telegram staff have implied another reason why it might not implement such a solution: the company does not think it is its responsibility to police what users say or how they utilise the platform.
Indeed, Telegram’s sluggishness to comply with requests from authorities must be seen in the context of Durov’s free-speech advocacy and avowed defence of user privacy. Durov makes no secret of his devout libertarianism going back years. Although Telegram has cooperated with the authorities in several jurisdictions after coming under pressure, there is a sense from the company’s and Durov’s ethos that it should not be censoring people’s opinions, however extreme they may seem, unless they clearly break the law.
There could also be a business reason for not tackling extreme and illegal content more robustly. Cultivating Telegram’s international reputation as the platform where almost anything goes helps grow its user base across many segments of society. Ultimately, Durov may have concluded that walking this fine line is simply what works best for business.
Who Runs Telegram?
Between Ideology and Pragmatism as a Core Element of Corporate Strategy
Telegram has not been a profitable platform for founder Pavel Durov. Attempts to earn money via the messaging service, e.g. through advertising, have largely failed so far.
However, in recent months, payment features on the platform have been significantly expanded and new capabilities have been developed to support both Telegram and channel operators.
The introduction of Payments 2.0 and the expansion of Telegram’s own bots @wallet and @donate suggest that the platform will integrate payment and donation functions into the app to an even greater extent in the future. The latest extension, a function to buy and auction usernames via Telegram, points in the same direction. According to Pavel Durov, this was very successful: within a few weeks, more than 50 million US dollars had been transacted via the Telegram auction platform Fragment. In contrast to previous monetization attempts such as advertising or premium features, this marked the first real success in monetizing the previously unprofitable platform.
Telegram’s next step is to build a set of decentralized tools, including non-custodial wallets and decentralized exchanges for millions of people to securely trade and store cryptocurrencies.
With the expansion of payment options on the platform, it is to be feared that even more actors will be able to professionalize in the future. In the long term, Telegram will stabilize the conspiracy-ideological and far-right milieu, network it internationally, and enable it to raise funds via the platform to an even greater extent.
In just a few years, Telegram has become the most important platform for conspiracy ideologues and right-wing extremists. Not just in Germany but around the world, disinformation and conspiracy narratives are shared via Telegram, and funding opportunities on the platform are now being exploited. Right-wing terrorist groups also use the platform to network internationally. Since Telegram is currently working to further expand direct payment options, it is to be feared that even more actors will professionalize via the platform in the future. To counteract this development and push platform operators to take action, several approaches are promising:
Public pressure causes Telegram to act.
Telegram is concerned about its reputation and, above all, fears being removed from Google’s and Apple’s app stores. Public reporting on the platform’s misconduct, as well as political pressure, has historically often led Telegram to act, albeit unsystematically.
Even without the platform’s cooperation, investigations and prosecutions are often possible.
The arrests and raids in Germany in recent months, as well as numerous investigative inquiries, have shown that perpetrators can still be identified even without the platforms’ cooperation. This requires, however, greater awareness among (investigative) authorities of the potential threat posed by digital hate and digital violence.
Demonetization can be an important means of limiting disinformation and conspiracy ideology.
The increasing professionalization of the conspiracy-ideological and far-right scene, combined with the possibility of funding via platforms like Telegram, presents significant challenges: more resources enable more action. Since Telegram has made only rare and haphazard use of blocking and deletion (so-called “deplatforming”), measures that prevent both monetary enrichment from and the financing of disinformation and conspiracy ideology must also be considered, for example by informing advertisers and payment service providers on the platforms.
Content moderation must be transparent.
Telegram makes highly arbitrary use of blocks and restrictions, and reporting channels and content on the platform has no discernible effect. Transparency reporting and external scrutiny, including via legislation such as the Digital Services Act, must be enforced.
CeMAS
Center for Monitoring, Analysis and Strategy
CeMAS consolidates interdisciplinary expertise on the topics of conspiracy ideologies, disinformation, antisemitism, and right-wing extremism.
This report has been funded by the Alfred Landecker Foundation. The Alfred Landecker Foundation promotes and advances the development of an open, democratic, and discrimination-free society. The foundation is an incubator for democracy in the digital age. It advocates for a contemporary culture of remembrance and fights against antisemitism and racism. The foundation generates networks, spaces and knowledge by supporting, promoting, networking and professionalizing interdisciplinary projects. By building a network of globally-active partners, it makes knowledge and experience widely available and brings together diverse perspectives from academia and professional practice.