Is Telegram Safe in 2026? Privacy Analysis

Updated May 2026

Quick Answer: Telegram is safe enough for casual group conversations, community channels, and everyday social messaging — with deliberate configuration. It is not safe, by default, for sensitive communications. The core problem is architectural, not cosmetic: default chats are not end-to-end encrypted, meaning Telegram’s servers can access message content. Beyond encryption, 2024 produced a seismic shift in Telegram’s relationship with law enforcement that permanently changed the privacy calculus: a platform that fulfilled just 14 US law enforcement data requests in the first nine months of 2024 completed 900 by year-end. That’s not a privacy-focused platform behaving the same way. That’s a fundamentally different platform making fundamentally different choices. This guide quantifies exactly what changed, who is affected, and what it means for your specific use case.


Why “Is Telegram Safe?” Has a Different Answer in 2026 Than in 2023

Three events between August 2024 and November 2025 permanently altered what “safe” means in the context of Telegram:

1. Pavel Durov’s arrest and indictment (August 2024). Telegram’s CEO was detained at Paris–Le Bourget Airport on August 24, 2024, and formally indicted four days later on twelve charges including complicity in administering a platform facilitating illicit transactions by an organized group, complicity in the distribution of child sexual abuse material, and refusal to cooperate with lawful data requests. The maximum penalty for the primary charge: 10 years imprisonment and a €500,000 fine. France’s travel ban on Durov was fully lifted in November 2025, but the investigation remains ongoing.

2. A complete reversal on law enforcement cooperation (September 2024). In direct response to Durov’s arrest, Telegram amended its privacy policy in September 2024 — expanding the categories of legal requests it would honor from terrorism-only to crimes broadly. The quantitative consequence of that policy change is documented in Telegram’s own transparency reports and is the most important data point in this article.

3. Academic confirmation of structural MTProto vulnerabilities (ongoing). Research published at IEEE Security & Privacy 2022, with follow-up analysis through 2025, formally identified vulnerabilities in Telegram’s proprietary MTProto 2.0 encryption protocol — including message reordering attacks, key exchange authentication weaknesses, and implementation flaws that could theoretically enable plaintext recovery. These vulnerabilities have largely been patched, but their discovery confirmed what cryptographers had long argued: building a proprietary protocol instead of adopting industry standards like TLS is a gamble that produces unnecessary risk.

Every “Is Telegram Safe?” article published before September 2024 is now partially obsolete. Any article that doesn’t address the law enforcement data disclosure shift is telling you about a different platform than the one that exists today.

The Telegram Privacy Reality Score (TPRS): The Framework No Competitor Provides

Every competing article in this category answers “is Telegram safe?” with a paragraph. None of them produce a structured, multi-dimensional assessment that allows comparison across platforms or across time. The TPRS (Telegram Privacy Reality Score) fills that gap.

The TPRS evaluates Telegram across six dimensions, each scored 0–20, for a total of 100. A score of 100 represents theoretical maximum privacy. For reference, the estimated TPRS for Signal is 87/100; for WhatsApp, 61/100.

TPRS Dimension 1: Encryption Architecture (Score: 7/20)

The most consequential dimension. Full marks require end-to-end encryption by default for all communication types. Telegram scores poorly here — not because it lacks E2EE capability, but because it reserves that capability for a feature most users never enable.

Standard cloud chats use MTProto client-server encryption: messages are encrypted between your device and Telegram’s servers, decrypted at the server, and re-encrypted for delivery. Telegram can read cloud chat content if compelled or compromised. Secret Chats use MTProto client-client encryption — genuine E2EE. The problem: Secret Chats must be initiated manually, don’t sync across devices, and are unavailable for group chats, which is where the majority of Telegram’s communication occurs.

Scoring from the 20-point maximum: no E2EE by default, −8 points; no group E2EE at all, −5 points. The functional Secret Chat implementation preserves the remaining 7 points. Dimension score: 7/20.

TPRS Dimension 2: Metadata Minimization (Score: 8/20)

What Telegram collects, per its own privacy policy: phone number (required for account creation, cannot be changed without re-registration), IP address (stored temporarily for connection purposes, but “temporarily” is not defined with precision), session information (active sessions, device types, login timestamps), and basic feature usage analytics. What Telegram explicitly does not collect: behavioral profiles, ad targeting data, or message content for advertising purposes.

The phone number requirement is the primary metadata vulnerability. A phone number is a persistent, real-world identifier that connects your Telegram presence to your mobile carrier, your SIM registration, and potentially your physical identity. SIM-swap attacks remain one of the leading vectors for Telegram account compromise precisely because phone numbers are both required for authentication and difficult to truly anonymize.

IP address logging creates geolocation exposure. Even “temporary” IP storage means that at the moment of a valid legal request, Telegram can provide your IP address — which, combined with ISP records, identifies physical location with high precision. Dimension score: 8/20.

TPRS Dimension 3: Law Enforcement Cooperation (Score: 9/20)

This dimension has changed more dramatically than any other since 2023. The quantitative data is drawn directly from Telegram’s official transparency reports, accessible via Telegram’s dedicated transparency bot.

Before September 2024: Telegram disclosed user data (IP addresses and phone numbers) only in terrorism cases. The platform’s stated policy was effectively zero cooperation on all other categories of crime.

After September 2024: Telegram expanded its disclosure policy to cover any valid legal request related to criminal activity. The enforcement data: between January 1 and September 30, 2024, Telegram fulfilled US data requests in 14 legal cases affecting 108 users. By year-end, the full-year US totals had reached 900 cases affecting 2,253 users, implying roughly 886 cases fulfilled in the fourth quarter alone.

The cooperation now exists. The platform’s previous privacy guarantees in this dimension no longer hold. Dimension score: 9/20 — partial credit for maintaining a transparency report and for the fact that disclosed data is limited to IP and phone number, not message content.

TPRS Dimension 4: Code and Protocol Transparency (Score: 10/20)

Telegram’s client applications — Android, iOS, desktop — are open source and available for review. This is a meaningful positive: independent researchers can verify that client-side implementations behave as documented. Telegram also publishes the MTProto protocol specification in detail, enabling cryptographic analysis.

The significant negative: Telegram’s server-side code is proprietary and closed. This is the critical gap. The client can be verified; what the server does with data that passes through it cannot be independently confirmed. All cloud chat messages pass through Telegram’s servers in a form the servers can access. The trustworthiness of that server-side handling is a matter of faith in Telegram’s promises, not verifiable fact.

This is not a uniquely Telegram problem — most messaging platforms have proprietary servers. But it’s a meaningful constraint for anyone evaluating Telegram as a privacy tool rather than a convenience tool. Dimension score: 10/20.

TPRS Dimension 5: Infrastructure Accountability (Score: 6/20)

This is the dimension that receives the least attention in competing articles and carries some of the most serious implications.

An investigation published by iStories (an independent Russian investigative outlet) and the Organized Crime and Corruption Reporting Project found that key components of Telegram’s network infrastructure — including IP address ranges and router-level access at certain data centers — were operated by companies with ownership links to individuals who maintained long-standing business relationships with Russian state institutions, including the Federal Security Service (FSB).

The specific figure identified: Vladimir Vedeneev, a network engineer whose company maintains Telegram’s networking equipment and assigns thousands of its IP addresses, with no public profile despite this extraordinary level of access to infrastructure carrying billions of messages.

Telegram disputes characterizations of Russian state influence and maintains its servers are distributed globally with no access by Russian authorities. What is not disputed: Telegram was founded by Russians who previously built Russia’s largest social network (VK), left Russia under state pressure over censorship demands, and has maintained opacity about its infrastructure governance that no major Western platform would be permitted to maintain.

The MTProto protocol specification also contains an unencrypted element — the auth_key_id — attached to the beginning of every encrypted message. This header is not encrypted and is visible to anyone with network access, enabling traffic pattern analysis even when message content itself is encrypted. Dimension score: 6/20.

TPRS Dimension 6: Account Security Defaults (Score: 8/20)

Two-factor authentication is available but disabled by default. The vast majority of Telegram’s approximately 950 million monthly active users have never enabled it. Phone number-based authentication (one-time SMS codes) is vulnerable to SIM-swap attacks — a well-documented compromise vector that has been used to take over high-profile accounts across multiple platforms.

Telegram has partially addressed this: if 2FA is enabled, an attacker who successfully SIM-swaps must still know the password. If 2FA is not enabled, a SIM-swap produces complete account access. Given the default-off status, the practical protection rate is low.
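The interaction between SIM-swapping and the 2FA password can be reduced to a small decision model. This sketch is illustrative only (it is not Telegram's actual authentication code, and the function name is mine); it encodes the logic described above: an SMS code alone suffices unless a 2FA password gates the login.

```python
# Illustrative decision model of the SIM-swap scenario described above.
# Not Telegram's real authentication logic; names are hypothetical.

def account_compromised(attacker_controls_sms: bool,
                        two_factor_enabled: bool,
                        attacker_knows_password: bool) -> bool:
    """A SIM-swap yields the SMS login code; 2FA adds a password gate."""
    if not attacker_controls_sms:
        return False
    if two_factor_enabled:
        # SIM-swap alone is insufficient; the password is still required.
        return attacker_knows_password
    # Default configuration (2FA off): the SMS code alone grants access.
    return True

# Default-off 2FA: a successful SIM-swap produces full account access.
print(account_compromised(True, False, False))  # True
# 2FA enabled: the attacker still needs the cloud password.
print(account_compromised(True, True, False))   # False
```

The model makes the default-off problem concrete: the only branch most users are actually on is the one where `two_factor_enabled` is false.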

Active session management — viewing and terminating sessions across devices — is functional and accessible. Self-destructing messages work as documented. The “People Nearby” feature, which researchers demonstrated could reveal user location within tens of meters, was removed from the platform in response to criticism. Dimension score: 8/20.

Complete TPRS Summary

| Dimension | Score | Max | Notes |
|---|---|---|---|
| Encryption Architecture | 7 | 20 | No default E2EE; no group E2EE; Secret Chats functional but opt-in |
| Metadata Minimization | 8 | 20 | Phone number required; IP logged; no behavioral profiling |
| Law Enforcement Cooperation | 9 | 20 | Post-September 2024 policy: broad disclosure on valid legal requests |
| Code/Protocol Transparency | 10 | 20 | Client open source; server closed source |
| Infrastructure Accountability | 6 | 20 | Russian infrastructure connections; MTProto auth_key_id unencrypted |
| Account Security Defaults | 8 | 20 | 2FA off by default; phone number as primary ID |
| Total TPRS | 48 | 100 | "Conditionally Safe" |

For comparison: Signal scores approximately 87/100 on this framework (full E2EE by default including groups, minimal metadata, strong law enforcement resistance due to technical design, open-source protocol). WhatsApp scores approximately 61/100 (default E2EE including groups, but extensive Meta metadata collection and behavioral profiling). Telegram’s 48/100 reflects genuine security capabilities combined with structural defaults and policy choices that undermine the privacy brand.
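The aggregation behind the TPRS is simple addition over six 0–20 dimensions. A minimal sketch of that aggregation follows; the dimension names and scores are the ones published in this article, while the function and its validation are illustrative:

```python
# TPRS aggregation as described above: six dimensions, each 0-20,
# summed to a 0-100 total. Scores are this article's published values.
TPRS_DIMENSIONS = {
    "Encryption Architecture": 7,
    "Metadata Minimization": 8,
    "Law Enforcement Cooperation": 9,
    "Code/Protocol Transparency": 10,
    "Infrastructure Accountability": 6,
    "Account Security Defaults": 8,
}

def tprs_total(dimensions: dict) -> int:
    """Sum dimension scores after validating each is in the 0-20 range."""
    for name, score in dimensions.items():
        if not 0 <= score <= 20:
            raise ValueError(f"{name}: score {score} outside 0-20")
    return sum(dimensions.values())

telegram_score = tprs_total(TPRS_DIMENSIONS)
print(telegram_score)  # 48, matching the summary table
```

Expressing the framework this way also makes cross-platform comparison mechanical: substitute a platform's six dimension scores and the same function yields its total.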


MTProto 2.0: What the Academic Research Actually Found

Telegram’s encryption is built on MTProto — a proprietary protocol developed by Nikolai Durov (Pavel’s brother) rather than an established standard like TLS or the Signal Protocol. The cryptographic community has viewed this choice skeptically since the protocol’s inception, and the peer-reviewed research confirms those concerns were not merely theoretical.

The most rigorous published analysis is the work by Martin Albrecht, Lenka Mareková, Kenneth Paterson, and Igors Stepanovs — researchers from Royal Holloway, University of London, and ETH Zürich — published at the IEEE Symposium on Security and Privacy 2022 and extended in subsequent analysis. Their findings, disclosed to Telegram’s development team in April 2021 before public release in July 2021:

Vulnerability 1 — Message reordering. An attacker with network position between a client and Telegram’s servers could reorder messages sent from the client to the server. In isolation, this attack doesn’t recover message content, but it demonstrates that MTProto’s design didn’t achieve the cryptographic security properties it should have.

Vulnerability 2 — Plaintext recovery (theoretical, mitigated by coincidence). The researchers identified that three official Telegram clients — Android, iOS, and Desktop — contained implementation code that, under carefully constructed attack conditions, could permit recovery of some plaintext from encrypted messages. An attacker would need to send millions of carefully crafted messages to the target, making the attack impractical. The researchers noted: “Luckily, it is almost impossible to carry out in practice. In particular, it is mostly mitigated by the coincidence that certain metadata in Telegram is chosen randomly and kept secret.” The phrase “mitigated by coincidence” is not a phrase that should appear in security documentation. It means the protection is accidental rather than designed.

Vulnerability 3 — Key exchange authentication weakness. The most serious finding: researchers were able to chain the message-reordering attack with an attack against the key exchange protocol implementation on Telegram’s servers, breaking the authentication properties of the key exchange. This would allow a man-in-the-middle attack under certain conditions. This vulnerability required access to Telegram’s server infrastructure to execute — a high bar — but its existence demonstrated that the key exchange’s security guarantees were not mathematically sound as implemented.

Telegram patched all disclosed vulnerabilities following responsible disclosure. The researchers subsequently confirmed that a “slight variant” of MTProto can provide security comparable to TLS — but only under “unusual assumptions about the building blocks of the protocol” that don’t apply to standard TLS. That qualification matters: it means MTProto’s security proof requires more exotic cryptographic assumptions than TLS requires, which means the security guarantee is weaker in a formal sense.

The unresolved concern is not the patched vulnerabilities — it’s the architecture that produced them. MTProto is a non-standard protocol without the decades of peer review that TLS and the Signal Protocol have accumulated. Every implementation of a custom protocol by third-party Telegram clients (which are permitted under Telegram’s open-source policy) is a fresh opportunity for implementation errors, because the protocol’s design “puts significant burden on developers who implement it,” per the researchers’ own assessment.

What this means practically: For the overwhelming majority of Telegram’s approximately 950 million monthly active users, the MTProto vulnerabilities discovered and patched in 2021 are not an active threat. The more relevant threat model for ordinary users is not sophisticated cryptographic attack — it’s the default lack of E2EE, the law enforcement data disclosure policy, and account-level attacks like SIM-swapping.

The Encryption Gap: What “Encrypted” Actually Means on Telegram

Telegram’s marketing describes the platform as offering “heavily encrypted” messaging. This is technically accurate and contextually misleading at the same time — a distinction that matters enormously for users making security decisions based on it.

Cloud Chats: What You Get By Default

Every regular Telegram chat — personal messages, group chats, channel interactions — uses client-server encryption. Your device encrypts the message before sending it to Telegram’s servers. Telegram’s servers decrypt it to process delivery, store it in the cloud (enabling multi-device sync), and re-encrypt it for delivery to the recipient’s device.

The practical implication: Telegram can read your default chat messages. Not necessarily in an automated way, not necessarily routinely, but structurally: the keys to decrypt cloud chat messages are held by Telegram, not exclusively by you and your recipient. This is precisely what “not end-to-end encrypted” means.

This design choice is deliberate and produces a real benefit: cloud chats sync across all your devices instantly. You can switch from your phone to your laptop to a tablet and see the same conversation history without any setup. Signal, which is E2EE by default, cannot match this seamlessness: because messages live only on your devices rather than on a server, a newly linked device does not automatically receive your full history.

The tradeoff is a reasonable engineering decision for a product prioritizing convenience. The problem is that Telegram’s positioning as a “privacy-focused” and “secure” platform causes users to assume their messages are more protected than they are. Research consistently confirms this: a substantial proportion of Telegram users in high-risk environments (journalists, activists) use cloud chats for sensitive communications while believing they are protected by E2EE. They are not.

Secret Chats: What End-to-End Encryption Actually Looks Like

Secret Chats use client-client MTProto encryption — genuine E2EE. The encryption keys are generated on your device and your recipient’s device; they are not sent to Telegram’s servers. Telegram cannot read Secret Chat messages in transit or at rest. Secret Chats support self-destructing message timers, protection against message forwarding, and deletion that removes messages from both devices simultaneously.

The limitations: Secret Chats are device-specific and do not sync across your other devices. Starting a Secret Chat on your phone creates a conversation that does not appear on your laptop. If you reinstall the app or switch phones, the Secret Chat history is gone. Group chats cannot use Secret Chats — there is no group E2EE option in Telegram at any setting.
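The cloud chat versus Secret Chat distinction reduces to one question: who holds a decryption key? The following toy model (not real cryptography, not MTProto; the function names and party labels are mine) makes the structural difference explicit:

```python
# Conceptual toy model of key possession per chat type.
# Not real cryptography and not MTProto; purely illustrative.

def keyholders(chat_type: str) -> set:
    """Return which parties can decrypt messages for a given chat type."""
    if chat_type == "cloud":
        # Client-server encryption: the server decrypts, stores,
        # and re-encrypts every message, so it is a keyholder.
        return {"sender_device", "telegram_servers", "recipient_device"}
    if chat_type == "secret":
        # Client-client (E2E): keys are generated on the two devices
        # and never sent upstream.
        return {"sender_device", "recipient_device"}
    raise ValueError(f"unknown chat type: {chat_type}")

def provider_can_read(chat_type: str) -> bool:
    return "telegram_servers" in keyholders(chat_type)

print(provider_can_read("cloud"), provider_can_read("secret"))  # True False
```

Everything else in this section, including multi-device sync and the law enforcement disclosure question, follows from which set the server belongs to.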

The Encryption Comparison Matrix

| Feature | Telegram Cloud Chat | Telegram Secret Chat | Signal | WhatsApp |
|---|---|---|---|---|
| E2EE by default | ❌ No | ✅ Yes | ✅ Yes | ✅ Yes |
| Group E2EE | ❌ No | ❌ N/A | ✅ Yes | ✅ Yes |
| Multi-device sync | ✅ Yes | ❌ No | ✅ Yes (limited) | ✅ Yes |
| Protocol | MTProto 2.0 | MTProto 2.0 (E2E) | Signal Protocol | Signal Protocol |
| Protocol audited? | ✅ Partially | ✅ Partially | ✅ Extensively | ✅ (same protocol) |
| Server-side code open? | ❌ No | ❌ No | ❌ No | ❌ No |
| Message storage location | Telegram cloud servers | Device only | Device only | Backup optional |
| Can provider read messages? | ✅ Yes (technically) | ❌ No | ❌ No | ❌ No |
| Disappearing messages | ✅ Optional | ✅ Optional | ✅ Optional | ✅ Optional |

The Electronic Frontier Foundation’s Surveillance Self-Defense guide explicitly recommends Signal for users in high-risk situations and notes that Telegram’s default settings do not provide the same level of protection as messaging apps that use E2EE by default.


The September 2024 Inflection Point: The Data Story Nobody Has Assembled

This section presents data drawn directly from Telegram’s own transparency reporting — accessible through Telegram’s official transparency bot — and contextualizes it in a structured comparison that no competing article has published. It is the most consequential section of this guide for anyone who used Telegram’s pre-2024 reputation as a law enforcement non-cooperator to make privacy decisions.

The Policy Before and After

For most of its operating history, Telegram maintained a policy of disclosing user data — specifically IP addresses and phone numbers — only in response to legal requests related to terrorism. The platform repeatedly cited this policy as evidence of its commitment to user privacy. Pavel Durov publicly characterized Telegram as providing zero user data to governments, and this positioning attracted millions of users specifically seeking to communicate outside law enforcement visibility.

In September 2024, one month after Durov’s arrest in France, Telegram amended its privacy policy. The amendment: Telegram will now disclose IP addresses and phone numbers in response to valid legal requests related to criminal activity broadly — not limited to terrorism. This is a categorical expansion. “Criminal activity” encompasses the full spectrum of legal systems’ definitions of crime across every jurisdiction where Telegram’s transparency bot reports compliance.

The Quantitative Before-and-After: The Exclusive Data Table

The following data is constructed from Telegram’s official transparency reports. Because Telegram’s transparency bot provides region-specific data accessible only with a Telegram account, and because Telegram does not publish a consolidated global report in standard format, this comparative table does not exist anywhere else in consolidated form.

United States Law Enforcement Data Disclosures — Telegram Transparency Data:

| Period | Fulfilled Requests | Affected Users | Policy Status |
|---|---|---|---|
| January–September 2024 | 14 | 108 | Terrorism-only policy |
| October–December 2024 (post-Durov arrest) | Spike to year-end total of 900 | Total year-end 2,253 | Broad criminal cooperation policy |
| Full Year 2024 | 900 | 2,253 | Transition year |
| Implied Q4 2024 alone | ~886 | ~2,145 | New policy operational |

The Q4 2024 acceleration is the data point. In the roughly 90 days following the policy change, Telegram fulfilled about 63 times as many US law enforcement requests as it had in the preceding nine months.
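The implied Q4 figures are simple subtractions from Telegram's two published US totals (January–September and full-year 2024). A minimal sketch reproducing the derivation, with variable names of my own choosing:

```python
# Deriving the implied Q4 2024 figures from Telegram's published
# US transparency totals (Jan-Sep and full-year 2024).
jan_sep_requests, jan_sep_users = 14, 108
full_year_requests, full_year_users = 900, 2_253

q4_requests = full_year_requests - jan_sep_requests  # implied Q4 cases
q4_users = full_year_users - jan_sep_users           # implied Q4 users
acceleration = q4_requests / jan_sep_requests        # Q4 vs prior 9 months

print(q4_requests, q4_users, round(acceleration))  # 886 2145 63
```

The ~64× figure quoted elsewhere in this guide compares the full-year total (900) against the January–September total (14); the ~63× figure here isolates Q4 alone.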

Global Snapshot — 2024 Full Year Transparency Data:

| Country | 2024 Fulfilled Requests | 2024 Users Affected | Notable Context |
|---|---|---|---|
| United States | 900 | 2,253 | 64× increase vs. Jan–Sep 2024 |
| India | 14,641 | 23,535 | Largest disclosed volume globally |
| United Kingdom | 142 | 293 | Sharp increase from near-zero |
| Brazil | Ongoing (75 Q1, 63 Q2, 65 Q3) | Not fully disclosed | Pre-existing cooperation |
| European states | Variable | Variable | Generally increased post-September |

India’s figure is striking: 14,641 fulfilled requests affecting 23,535 users in a single year suggests systematic law enforcement use of Telegram’s compliance mechanism at a scale that reflects institutional incorporation into investigative workflows — not exceptional cases.

What This Data Means for Privacy Analysis

Three conclusions are supported by this data:

1. The “Telegram doesn’t cooperate with governments” narrative is false as of September 2024. The previous reputation was accurate for a specific historical period with a specific policy. That policy no longer exists. Articles citing Telegram’s historical non-cooperation as a current privacy feature are misleading readers.

2. The disclosed data — IP addresses and phone numbers — is sufficient for high-consequence identification. An IP address combined with an ISP subpoena identifies a physical internet connection and, in most configurations, a household or workplace. A phone number combined with carrier records identifies a real-world person, their address, and their account history. For ordinary users, this is unlikely to be consequential. For activists, journalists, or dissidents operating in jurisdictions where political activity is criminalized, a phone number and IP address disclosure can be life-threatening.

3. The policy change is permanent, not temporary. Durov’s travel ban was lifted in November 2025; the investigation continues; but Telegram has not rolled back its September 2024 policy expansion. The platform that emerged from the Durov arrest is structurally different from the one that predated it.

The Cybercrime Community’s Response

One measure of how significantly Telegram’s privacy profile changed: cybercriminal communities that had explicitly selected Telegram for its law enforcement resistance publicly announced their intent to migrate to alternative platforms immediately following the September 2024 policy change. Dark web forums and threat actor channels documented discussions about switching to alternatives. This response — from people whose professional survival depended on accurate assessment of the platform’s actual cooperation posture — is itself a data point about how consequentially the policy changed.

Infrastructure Accountability: The Question Nobody Is Asking

Every major messaging platform faces questions about government access. What makes Telegram’s infrastructure situation distinctive is a set of documented findings that go beyond typical government-pressure concerns.

An investigative report by iStories — an independent Russian investigative outlet operating under significant constraints — and corroborated by reporting from the Organized Crime and Corruption Reporting Project, identified that key components of Telegram’s network infrastructure were maintained by entities with documented connections to individuals who have long-standing business relationships with Russian state institutions.

The central figure documented: Vladimir Vedeneev, described as a network engineer with no public profile who controls the company that maintains Telegram’s networking equipment and assigns thousands of its IP addresses. The investigation identified that Vedeneev’s company occupied a position in Telegram’s infrastructure that would conventionally be held by a major infrastructure vendor with transparent ownership, public contracts, and auditable relationships. Instead, it was held by a company with opaque ownership and documented ties to individuals connected to Russian state structures.

Telegram’s response is that it maintains no connection to Russian state intelligence and that its servers are distributed globally with no access provided to Russian authorities. This may be true at the application layer — Telegram may genuinely provide no user data to Russian intelligence. The infrastructure concern operates at a different layer: if the routing infrastructure itself is controlled by entities with FSB relationships, message metadata (who communicates with whom, at what times, from which IP ranges) could theoretically be accessible at the network level regardless of application-layer policies.

The MTProto protocol’s auth_key_id — an unencrypted element attached to the beginning of each encrypted message — is relevant here. This header is not encrypted and is visible to anyone with access to the network layer. Across billions of messages, this metadata enables communication graph analysis: mapping who talks to whom, at what frequency, from which locations. This analysis is possible without ever decrypting message content.
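The kind of communication-graph analysis an unencrypted per-message identifier enables can be sketched in a few lines. This is an illustration of the general technique, not a working MTProto parser; the packet tuples and key IDs below are invented for the example:

```python
# Sketch of traffic-pattern analysis against an unencrypted per-message
# identifier: an observer on the network path sees (key_id, timestamp,
# size) for each packet and can profile accounts without decrypting
# anything. Data below is invented for illustration.
from collections import Counter

observed_packets = [
    # (auth_key_id prefix, unix timestamp, payload bytes) as seen on the wire
    ("a1f3", 1_700_000_000, 512),
    ("a1f3", 1_700_000_042, 1024),
    ("b7e9", 1_700_000_050, 256),
    ("a1f3", 1_700_000_100, 768),
]

# Message frequency per account-level key, recovered with no decryption:
messages_per_key = Counter(key_id for key_id, _, _ in observed_packets)
print(messages_per_key.most_common(1))  # [('a1f3', 3)]
```

At the scale of billions of messages, the same counting and timing correlation yields the who-talks-to-whom graph described above, which is why an unencrypted header is a metadata liability even when payloads are secure.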

This concern falls into the category of “not proven, not disprovable.” No independent audit of Telegram’s infrastructure governance has been conducted and published. Given the scale of Telegram’s user base and the sensitivity of communications it carries, the absence of such an audit is itself a notable accountability gap.

What Telegram Actually Collects: The Complete Data Inventory

Telegram’s privacy policy describes its data collection more precisely than most major platforms. This section presents that data in structured form, because “Telegram doesn’t sell your data” — a statement that appears in most competing articles — addresses only one dimension of a more complete data landscape.

Data Telegram Explicitly Collects

Account data:

  • Phone number (required, permanent account identifier)
  • Name and username (required)
  • Profile photo (if uploaded)
  • Biography (optional)
  • Contact list (optional, if you grant access)

Technical data:

  • IP address (logged at connection time; retention period defined as “temporary” without specific duration commitment)
  • Device type and operating system
  • Active sessions: all devices currently logged in, with location approximations and last activity timestamps
  • Login timestamps and authentication history

Usage data:

  • Basic feature usage analytics (described as anonymized)
  • Message delivery status (sent/received/read indicators)

Communication metadata:

  • For cloud chats: who you communicate with, when, how frequently, and at what message length (metadata that persists on Telegram’s servers even if individual messages are deleted by the user)

Data Telegram Explicitly Does Not Collect

  • No behavioral profiles. Telegram does not profile users for behavioral advertising.
  • No ad targeting data. Telegram’s advertising model (Sponsored Messages in public channels) targets by topic/channel category, not by individual user behavior.
  • No cross-platform tracking. Telegram does not embed tracking pixels or cross-site tracking mechanisms.
  • No message content for advertising. Cloud chat message content is not used for ad targeting.

The Gap Between “What Telegram Says” and “What’s Verifiable”

Telegram’s server-side code is not open source. The commitments above — particularly regarding retention periods for IP addresses and metadata — cannot be independently verified. They represent Telegram’s stated policy, not an auditable technical fact. This is the same gap that exists with most major platforms; it is worth naming explicitly because Telegram’s privacy brand creates an expectation of verifiability that its server architecture doesn’t support.

NIST’s Privacy Framework, specifically its “Communicate-P” and “Protect-P” functions, provides a reference standard for what an organization with genuine privacy commitment demonstrates: not just privacy policies, but verifiable technical controls, independent audits, and transparent governance. Telegram meets some of these criteria and falls short on others, specifically third-party infrastructure auditing and server-side code transparency.

The Durov Legal Timeline: February 2024 to May 2026

Understanding where Telegram's leadership and legal situation stand in May 2026 is relevant to any long-term platform decision.

| Date | Event | Privacy Implication |
|---|---|---|
| February 2024 | French judicial investigation opened (JUNALCO — Paris Cybercrime Unit) | Investigation predates arrest; Telegram's non-cooperation was documented prior to any public event |
| August 24, 2024 | Pavel Durov arrested at Le Bourget Airport, Paris | First major enforcement action against a major messaging platform CEO for moderation failure |
| August 28, 2024 | Durov indicted on 12 charges; €5M bail posted; banned from leaving France | Formal legal proceedings began; initial charges documented |
| September 2024 | Telegram amends privacy policy to expand law enforcement cooperation | Direct causal link to Durov arrest; most consequential change for user privacy |
| December 2024 | Durov acknowledges "growing criminal presence on the platform" during questioning; pledges content oversight | First public acknowledgment of platform's criminal use problem |
| March 2025 | Investigating judge allows Durov to temporarily leave France | Partial travel restrictions relaxed; investigation continues |
| July 2025 | Durov permitted to leave France for up to two weeks at a time | Further relaxation; legal proceedings ongoing |
| November 13, 2025 | France fully lifts travel ban on Durov | Travel restrictions ended; formal investigation remains open |
| May 2026 | Investigation ongoing; no trial date set | Legal uncertainty persists; Telegram's legal exposure not resolved |

The investigation remaining open in May 2026 means Telegram continues to operate under legal pressure that directly influenced the September 2024 policy change. Whether that policy would revert under different circumstances is unknowable, but the incentive structure that produced it has not been removed.

One financial note relevant to platform stability: Durov announced in late 2024 that Telegram had achieved profitability for the first time in its history, with total revenue exceeding $1 billion. This is materially positive for platform continuity — a profitable Telegram is less likely to face existential pressure than one burning through reserves.

The Russian Infrastructure Question: Signals Intelligence Risk Assessment

The most sensitive element of Telegram’s situation — and the one that most directly affects users in specific threat environments — is the Russian state intelligence dimension.

Telegram was founded by Pavel Durov after he lost control of VK (Russia’s largest social network) to Kremlin-aligned investors following his refusal to hand over user data for Ukrainian protesters. The narrative of Telegram’s founding is explicitly one of escape from Russian state control. This origin story has been central to Telegram’s privacy brand.

The complicating factor: the infrastructure findings from iStories and OCCRP suggest that some of the networks carrying Telegram’s traffic may be managed by entities with FSB connections, regardless of Telegram’s application-layer policies. If accurate, this means Russian signals intelligence may have access to metadata patterns (communication graphs, timing data, IP correlations) that Telegram’s own privacy policy doesn’t address — because it operates below the application layer.

For most of Telegram’s 950 million users, this concern is not operationally relevant. Russian intelligence services are not monitoring ordinary conversations about family plans, sports, or entertainment. But for specific populations — Ukrainian activists and journalists, Russian dissidents and anti-war organizers, Belarusian opposition figures, anyone whose communications would be of interest to Russian state intelligence — this infrastructure uncertainty represents a serious threat model consideration that has no analog in Signal, WhatsApp, or iMessage.

CISA’s guidance on secure communications consistently emphasizes that platform selection for sensitive communications should account for the full threat model, including state-level adversaries, not only commercial data collection or standard law enforcement cooperation. Telegram’s infrastructure uncertainty is precisely the kind of risk factor this guidance addresses.

The User Risk Segmentation Matrix: The Framework That Changes the Answer

Every article in this category answers “is Telegram safe?” with a single verdict. That’s the wrong frame for the question. Telegram’s safety profile is radically different depending on who’s using it, for what purpose, and in which threat environment. The User Risk Segmentation Matrix maps six user profiles to specific risk levels, threat models, and actionable guidance.

This is the framework that transforms a generic “yes, with caveats” answer into something practitioners — journalists, HR directors, security teams, activists — can actually act on.
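The six profiles that follow can be encoded as a simple lookup table. The sketch below is illustrative only; the profile keys, field names, and one-line recommendations are this guide's own shorthand, not any standard taxonomy or official tooling.

```python
# Illustrative encoding of the six-profile User Risk Segmentation Matrix.
# Profile keys and risk labels follow this guide's own taxonomy; nothing
# here is an official Telegram or standards-body classification.
RISK_MATRIX = {
    "casual_user":      {"privacy_risk": "LOW",
                         "recommendation": "Telegram OK with 2FA and a passcode lock"},
    "business":         {"privacy_risk": "MODERATE",
                         "recommendation": "Coordination only; no regulated content in cloud chats"},
    "journalist":       {"privacy_risk": "HIGH",
                         "recommendation": "Not for source contact; use Signal"},
    "activist":         {"privacy_risk": "VERY HIGH",
                         "recommendation": "Inappropriate in high-threat environments; use Signal"},
    "parent":           {"privacy_risk": "MODERATE",
                         "recommendation": "Restrict group invites and phone-number visibility"},
    "illegal_activity": {"privacy_risk": "CRITICAL",
                         "recommendation": "IP and phone number are disclosable under legal process"},
}

def advise(profile: str) -> str:
    """One-line verdict for a given profile."""
    entry = RISK_MATRIX[profile]
    return f"{entry['privacy_risk']}: {entry['recommendation']}"

print(advise("journalist"))  # HIGH: Not for source contact; use Signal
```

The point of the data-first shape is that the verdict is a function of the profile, not a property of the platform, which is the core argument of this section.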

Profile 1: The Casual User (Family, Friends, Community Groups)

Threat model: Data breach, account takeover, spam/scam exposure. No state adversary concern; no legal exposure concern.

Risk level: LOW for privacy, MODERATE for account security.

For someone using Telegram to stay in contact with family, follow public channels, participate in hobby communities, or receive news — the default cloud chat encryption is adequate. Telegram’s server-side encryption protects against passive network interception. The platform does not profile users for advertising. Ordinary conversational content is not at meaningful risk from Telegram’s law enforcement cooperation policy — the 900 US requests fulfilled in 2024 targeted suspected criminals, not ordinary users.

The primary risks at this level are account-level: SIM-swap attacks can compromise accounts without 2FA; phishing links in Telegram groups are common; Telegram bots can be weaponized to harvest credentials. These are addressable through configuration (see below).

Verdict for this profile: Telegram is appropriate. Configure 2FA; enable a passcode lock; be skeptical of unsolicited links.


Profile 2: The Business Professional (Work Communications, Client Data)

Threat model: Corporate espionage, data breach of professional communications, regulatory compliance for data handling.

Risk level: MODERATE to HIGH, depending on content.

Business communications frequently contain commercially sensitive information: deal terms, personnel matters, financial projections, client data. Storing this content in Telegram’s cloud servers — accessible to Telegram and legally requestable by authorities — creates compliance exposure for organizations in regulated industries (healthcare, finance, legal) and commercial sensitivity exposure for any competitive business.

For general work coordination (scheduling, logistics, team updates), Telegram’s risk profile is manageable. For content that would be protected under attorney-client privilege, HIPAA, GDPR, or similar frameworks — Telegram cloud chats are structurally inappropriate as the primary communication channel.

Verdict for this profile: Acceptable for casual work coordination; inappropriate for sensitive commercial or regulated content. Use Secret Chats for any communication you’d hesitate to see on a server outside your organization’s control.


Profile 3: The Journalist or Researcher

Threat model: Source exposure, source identification from metadata, law enforcement requests triggered by journalistic contact with sensitive subjects.

Risk level: HIGH for source protection; MODERATE for journalist’s own safety depending on jurisdiction.

The law enforcement data disclosed by Telegram is IP addresses and phone numbers — exactly the metadata needed to identify a source. A journalist who communicates with a confidential source via Telegram cloud chat has created a communication record on Telegram’s servers that includes the source’s IP address, phone number, and timestamp data. A valid legal request to Telegram could produce that record.

Secret Chats address the E2EE gap but not the metadata gap. A Secret Chat still connects two phone numbers at a specific time from specific IP addresses — and that metadata is accessible to Telegram and potentially requestable by law enforcement.
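The metadata gap is concrete enough to demonstrate. The toy sketch below uses invented log entries (not real Telegram data or any real schema) to show how bare connection records, with no message content at all, can suggest that two parties are talking.

```python
from datetime import datetime, timedelta

# Hypothetical server-side connection log: no message content, only who
# connected, from where, and when. All entries are invented for illustration.
log = [
    {"phone": "+15550100", "ip": "203.0.113.7",  "ts": datetime(2026, 5, 1, 21, 14, 2)},
    {"phone": "+15550199", "ip": "198.51.100.4", "ts": datetime(2026, 5, 1, 21, 14, 9)},
    {"phone": "+15550123", "ip": "192.0.2.55",   "ts": datetime(2026, 5, 2, 9, 3, 41)},
]

def co_active(entries, window=timedelta(seconds=30)):
    """Pairs of accounts active within the same short window — enough to
    suggest a conversation without reading a single message."""
    pairs = []
    for i, a in enumerate(entries):
        for b in entries[i + 1:]:
            if abs(a["ts"] - b["ts"]) <= window:
                pairs.append((a["phone"], b["phone"]))
    return pairs

print(co_active(log))  # [('+15550100', '+15550199')]
```

This is why E2EE alone does not protect a source: the correlation above works identically whether the underlying messages were Secret Chats or cloud chats.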

For source protection in investigative journalism, the EFF’s Surveillance Self-Defense guide and established press freedom organizations, including the Freedom of the Press Foundation, recommend Signal (ideally on a dedicated device isolated from other apps) for source communication. Telegram is not appropriate as the primary secure communication channel for source-sensitive journalism.

Verdict for this profile: Telegram public channels are appropriate for distributing published work. Telegram is inappropriate for communications with confidential sources regardless of configuration.


Profile 4: The Activist, Dissident, or Political Organizer

Threat model: State surveillance, location exposure, group membership identification, law enforcement request from hostile jurisdiction.

Risk level: VERY HIGH in authoritarian contexts; MODERATE in democratic contexts.

This is the profile where the infrastructure accountability question becomes operationally critical. For activists operating in jurisdictions where Russian state intelligence would have interest in their activities — Ukraine, Belarus, Russia, former Soviet states — the potential network-level access to communication metadata documented in the OCCRP/iStories reporting represents a specific, serious threat that no application-layer policy can address.

For activists in democratic contexts facing domestic surveillance concerns, the Telegram transparency data clarifies the threat: law enforcement can obtain IP addresses and phone numbers with valid legal process. Secret Chats protect message content but not the existence of communication.

Verdict for this profile: Telegram is inappropriate for sensitive organizing in high-threat environments. Signal, with disappearing messages enabled and a phone number not tied to real identity (registration with a VoIP or temporary number), provides substantially stronger protection. For extreme high-risk situations, consult a digital safety specialist; Access Now’s Digital Security Helpline provides free assistance to journalists and activists.


Profile 5: The Parent Monitoring Children’s Use

Threat model: Exposure to inappropriate content, contact with predators, scams targeting young users.

Risk level: MODERATE to HIGH, primarily from platform openness rather than Telegram’s own policies.

Telegram’s large group and channel infrastructure — and its historically lax content moderation — created an ecosystem where harmful content, predatory contact, and scam operations coexisted with legitimate communities. Post-Durov arrest, Telegram has increased content moderation and joined the Internet Watch Foundation (which maintains lists of known child sexual abuse material for detection purposes). The moderation posture has improved; it has not been transformed.

Children using Telegram’s default settings — phone number visible to contacts, approximate location potentially inferrable from IP address ranges, exposure to large public groups — face more ambient risk than on more tightly moderated platforms.

Verdict for this profile: Parental configuration (restricting who can add children to groups, disabling People Nearby, enabling privacy settings that limit who can see phone number) is essential. Consider whether Telegram’s open group ecosystem is appropriate for younger users at all.


Profile 6: The Person Engaged in Illegal Activity

Threat model: Law enforcement identification and prosecution.

Risk level: CRITICAL. The “Telegram is safe from law enforcement” myth is definitively and permanently dead.

The data is clear: the platform that in the first nine months of 2024 fulfilled 14 US law enforcement requests affecting 108 users fulfilled 900 requests affecting 2,253 users by year-end. Indian authorities received data on 23,535 users in 2024 alone. Criminal groups that selected Telegram specifically for its previous non-cooperation posture were, by their own public acknowledgment, migrating off the platform following the September 2024 policy change.

Telegram can and does provide IP addresses and phone numbers — sufficient for law enforcement identification — in response to valid legal requests. The platform is no longer a viable operational security choice for criminal activity. Including this profile in a public privacy guide is not an endorsement of illegal activity; it’s an accurate representation of a threat model that affects how the platform is perceived and how its previous reputation was constructed.

Telegram Configuration: The Settings That Actually Matter

If you’ve determined that Telegram is appropriate for your use case (most users), these are the configurations that meaningfully improve your security posture. Settings that don’t appear on this list produce marginal improvements.

Step 1: Enable Two-Step Verification — This Is Non-Negotiable

Settings → Privacy and Security → Two-Step Verification → Set Additional Password

Without 2FA, anyone who SIM-swaps your phone number — a well-documented attack requiring only social engineering of your carrier — obtains complete access to your Telegram account including all cloud chat history. With 2FA enabled, they still need the password you set. Set a strong, unique password; add a recovery email.

Step 2: Configure Who Sees Your Phone Number

Settings → Privacy and Security → Phone Number → change “Who can see my phone number” to Nobody and “Who can find me by my number” to My Contacts

Your phone number is your most sensitive identifier on Telegram. Making it visible to everyone makes you trivially linkable to your real-world identity.

Step 3: Set Up Active Session Monitoring

Settings → Privacy and Security → Active Sessions → review all listed sessions

Any session from a device or location you don’t recognize should be terminated immediately. This is particularly important if you use Telegram on shared or work devices.

Step 4: Enable Passcode Lock

Settings → Privacy and Security → Passcode Lock

Prevents access to your Telegram account if someone gains physical access to your unlocked device.

Step 5: Use Secret Chats for Sensitive Conversations

Start a new conversation → tap contact name → “Start Secret Chat”

Remember: Secret Chats don’t sync across devices and can’t be used for groups. They are the only Telegram conversation type that provides genuine E2EE.

Step 6: Review Linked Accounts and Bots

Periodically review which third-party apps and bots have been granted access to your account. Revoke access for any you don’t recognize or actively use.
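The six steps above reduce to a short checklist. The sketch below encodes it as plain data; the keys paraphrase Telegram's settings menu labels and are not programmatic API identifiers.

```python
# Hardening checklist distilled from Steps 1-6. Keys paraphrase Telegram's
# settings-menu labels; they are not Telegram API field names.
HARDENING_CHECKLIST = [
    ("two_step_verification",      "Set additional password + recovery email"),
    ("phone_number_visibility",    "Who can see my number: Nobody"),
    ("active_sessions",            "Terminate unrecognized devices"),
    ("passcode_lock",              "Enabled on every device"),
    ("secret_chats_for_sensitive", "Manually initiated per conversation"),
    ("bot_and_app_access",         "Revoke unused third-party access"),
]

def audit(completed: set) -> list:
    """Return the checklist items not yet completed."""
    return [name for name, _ in HARDENING_CHECKLIST if name not in completed]

# Example: a user who has only done Steps 1 and 4 so far.
print(audit({"two_step_verification", "passcode_lock"}))
```

Running the audit for that example user flags the remaining four items, with phone-number visibility at the top, which matches the priority order of the steps above.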

What Configuration Cannot Fix

No configuration option enables E2EE for group chats. No configuration changes Telegram’s law enforcement cooperation policy. No configuration removes your phone number as the account identifier. These are architectural properties, not settings.

Telegram vs. Signal vs. WhatsApp: The Structured Comparison

The “which is safest” comparison generates more heat than light in most articles. This structured assessment uses the TPRS dimensions from Part 1, applied consistently across platforms.

Encryption Architecture

Signal: E2EE by default for all messages, calls, and groups. Uses the Signal Protocol — the most independently audited messaging encryption protocol in existence. No cloud storage of messages. Complete E2EE for groups. Architecture score: 20/20.

WhatsApp: E2EE by default for all messages, calls, and groups using the Signal Protocol. Cloud backup encryption is optional (Google Drive/iCloud backups were historically unencrypted; end-to-end encrypted backups are now available but require manual enabling). E2EE for groups. Owned by Meta. Architecture score: 15/20 — full marks on encryption defaults; penalized for the unencrypted-backup default and for Meta’s data integration.

Telegram: Client-server encryption by default; E2EE only in Secret Chats (manual, no groups). Architecture score: 7/20.
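The architectural difference is a matter of key custody, not cipher strength. The toy model below (XOR as a stand-in cipher with invented keys; nothing here resembles MTProto or the Signal Protocol) shows why a server that holds the decryption key can always read cloud-chat content, while a server in an E2EE design has nothing useful to hand over.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # Toy cipher for illustration only — NOT real cryptography.
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"meet at noon"
key = secrets.token_bytes(len(msg))
ciphertext = xor(msg, key)

# Cloud-chat model: the server stores the key alongside the ciphertext,
# so it can decrypt on demand (multi-device sync, search, legal requests).
server_side = {"ciphertext": ciphertext, "key": key}
assert xor(server_side["ciphertext"], server_side["key"]) == msg  # server can read it

# E2EE model: only the two endpoints ever hold the key; the server stores
# ciphertext alone and cannot decrypt regardless of what it is compelled to do.
e2ee_server_side = {"ciphertext": ciphertext}
print("key" in e2ee_server_side)  # False
```

The same distinction explains the scores above: Telegram's Secret Chats move the key off the server, but its default cloud chats keep it in the server-side model.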

Metadata Collection

Signal: Collects the absolute minimum: the phone number used to register (required for SMS verification), and the date the account was created. No message metadata, no contact graph, no usage analytics that leave the device. In a 2016 federal grand jury subpoena in the Eastern District of Virginia, and in subsequent legal proceedings, Signal was served with a legal order and could provide only the account creation date and the date of last connection — demonstrating that minimal metadata collection is technically achievable and legally tested. Metadata score: 19/20.

WhatsApp: Requires phone number; Meta has access to contact graph, usage patterns, device information, and behavioral signals used for Meta’s cross-platform advertising and content personalization systems. Does not read message content (E2EE prevents this), but collects substantial non-content metadata. GDPR enforcement actions against Meta/WhatsApp in Europe have resulted in significant fines. Metadata score: 9/20.

Telegram: Phone number required; IP address logged; session data collected; usage analytics gathered. Less invasive than WhatsApp’s Meta-integration; more invasive than Signal. Metadata score: 8/20.

Law Enforcement Cooperation Track Record

Signal: By design, has almost nothing to provide. The technical architecture means Signal cannot produce message content, contact graphs, or communication history regardless of legal request. Legal demands return: account creation date, last connection date. No IP address stored long-term. Cooperation score: 18/20 — highest possible given that some minimal data technically exists.

WhatsApp: Provides metadata in response to valid legal requests: account registration data, IP address, usage data. Does not provide message content (E2EE prevents this). Complies with legal requests from any jurisdiction with valid process. Cooperation score: 12/20 — more cooperative than Signal by technical necessity; better than Telegram on message content protection.

Telegram: Pre-September 2024: among the least cooperative of major platforms. Post-September 2024: now provides IP addresses and phone numbers in response to valid criminal legal requests from any jurisdiction. Cooperation score: 9/20.

Overall Platform Comparison Summary

Dimension | Signal | WhatsApp | Telegram
Encryption Architecture | 20/20 | 15/20 | 7/20
Metadata Minimization | 19/20 | 9/20 | 8/20
Law Enforcement Cooperation | 18/20 | 12/20 | 9/20
Code/Protocol Transparency | 18/20 | 11/20 | 10/20
Infrastructure Accountability | 17/20 | 10/20 | 6/20
Account Security Defaults | 15/20 | 14/20 | 8/20
TPRS Total | 87/100 | 71/100 | 48/100

Signal’s privacy advantage over Telegram is not marginal. It’s architectural. Signal was designed from inception to minimize what the platform can know about its users. Telegram was designed to maximize convenience and feature richness, with privacy as a secondary consideration expressed through opt-in features.

WhatsApp’s score is dragged down by Meta’s metadata collection and advertising integration, despite having stronger default encryption than Telegram. The tradeoff: WhatsApp’s 2 billion users provide network effects that neither Signal nor Telegram can match for reaching ordinary contacts who haven’t made deliberate messaging app choices.


Frequently Asked Questions

Is Telegram end-to-end encrypted?

Partially and optionally. Default cloud chats are encrypted between your device and Telegram’s servers — Telegram can access cloud chat content. Secret Chats use genuine end-to-end encryption, but must be initiated manually, don’t sync across devices, and are unavailable for group conversations. This distinction is the most important fact about Telegram’s security, and it’s one that a significant proportion of Telegram’s users don’t know.

Did Telegram change its privacy policy in 2024?

Yes, materially. In September 2024, following founder Pavel Durov’s arrest in France, Telegram expanded its law enforcement data disclosure policy from terrorism-only cases to criminal activity broadly. The immediate quantitative effect: fulfilled US law enforcement requests went from 14 in January–September 2024 to 900 for the full year, with the vast majority concentrated in Q4 2024.

Can the government read my Telegram messages?

It depends. For Secret Chats: no — E2EE prevents Telegram from decrypting them, so there is nothing to provide to authorities. For cloud chats: technically yes — Telegram can decrypt cloud chat messages and could be compelled to produce them under valid legal orders. Additionally, governments can obtain IP addresses and phone numbers through Telegram’s current cooperation policy, enabling identification even without message content.

Is Telegram safer than WhatsApp?

For privacy from Meta’s advertising infrastructure: yes — Telegram doesn’t build behavioral profiles or use your data for ad targeting. For encryption of your messages: no — WhatsApp uses the Signal Protocol for E2EE by default on all messages including groups; Telegram doesn’t encrypt groups at all and only provides E2EE through opt-in Secret Chats.

Can I make Telegram more secure?

Yes, meaningfully. Enable two-step verification — this alone addresses the most common account takeover vector (SIM swap). Restrict phone number visibility. Use Secret Chats for sensitive conversations. These steps improve your security profile substantially. They do not change the fundamental architecture: cloud chats remain accessible to Telegram; the law enforcement cooperation policy remains in place.

Is Signal better than Telegram for privacy?

Substantially, for users who prioritize privacy. Signal provides E2EE by default for all message types including groups, collects minimal metadata (only registration date and last connection date have been verified through legal proceedings to be what Signal can actually produce), and has an open-source codebase with an independently audited protocol. The tradeoff: Signal has fewer features — no channels, smaller group limits, no cloud sync — and a smaller user base requiring deliberate adoption by your contacts.

What happened to Pavel Durov?

Durov was arrested at Paris–Le Bourget Airport on August 24, 2024, indicted on twelve charges including complicity in facilitating criminal transactions and child sexual exploitation through Telegram’s moderation failures, posted €5 million bail, and was initially banned from leaving France. In November 2025, France fully lifted his travel ban. The formal investigation remains open as of May 2026; no trial date has been set.

Is Telegram connected to Russia?

Telegram was founded by Russian citizens and the first version of the application was developed while its founders were in Russia. Pavel Durov left Russia in 2014 after refusing to hand over data to Russian authorities, and Telegram is currently headquartered in Dubai with global server infrastructure. The complicating factor: investigative reporting identified that key network infrastructure components were managed by entities with documented connections to individuals who have long-standing relationships with Russian state institutions. Telegram disputes characterizations of Russian state influence. Independent verification of infrastructure governance is not available given Telegram’s closed server-side architecture.

Should I use Telegram for work communications?

For casual work coordination: acceptable with proper configuration. For sensitive business communications — deals, personnel matters, client data, legally privileged information — cloud chats are structurally inappropriate. Secret Chats provide E2EE for one-on-one conversations but cannot be used for group work communications. Enterprise organizations should evaluate Signal or purpose-built secure communications platforms (Wire for Business, AWS Wickr) for sensitive content.

What does Telegram store that law enforcement could request?

Based on Telegram’s own privacy policy and disclosed transparency data: IP addresses logged at connection time, phone numbers (required for account registration), account creation date, last connection information, and session data (what devices have been logged in, when, with location approximation). Telegram explicitly does not store — and cannot produce — decrypted Secret Chat message content. Cloud chat content is technically accessible to Telegram and could be subject to legal requests, though there is no documented instance of cloud chat content being produced in response to a law enforcement request.


Final Verdict: Is Telegram Safe in 2026?

The honest answer has four parts, one for each distinct audience:

For the vast majority of users — casual messaging, communities, channels: Telegram is safe enough, with one essential configuration step: enable two-step verification before doing anything else. The platform doesn’t profile you for advertising, the server-side encryption is adequate for ordinary conversations, and your content isn’t at meaningful risk from law enforcement unless you’re suspected of criminal activity.

For business and professional use: Telegram is appropriate for logistics and coordination, not for sensitive content. If the conversation would be embarrassing or legally problematic on a server you don’t control, use Secret Chat or a different platform.

For journalists, activists, dissidents, and high-risk users: Telegram is the wrong tool. The infrastructure accountability concerns, the post-2024 law enforcement cooperation policy, and the fundamental absence of default E2EE make it structurally unsuitable for the threat models these users face. Signal — properly configured, with a phone number not tied to real identity — is the correct choice. The EFF’s Surveillance Self-Defense guide provides specific, operationally tested guidance for high-risk contexts.

For anyone who believed the pre-2024 “Telegram doesn’t cooperate with governments” narrative: That narrative is gone. The platform that emerged from the Durov arrest is cooperative on criminal matters, provides IP addresses and phone numbers in response to valid legal requests across multiple jurisdictions, and has permanently changed its moderation posture. Telegram’s current privacy profile is honest, transparently disclosed in its privacy policy, and meaningfully different from what it was three years ago. Decisions made on the old narrative should be revisited.

The TPRS score of 48/100 captures exactly this: a platform with real security capabilities (Secret Chats, good account security features, no advertising surveillance) that falls substantially short of its privacy brand because of architectural defaults and a law enforcement policy that was fundamentally transformed in 2024. Neither “Telegram is a privacy nightmare” nor “Telegram is secure” is an accurate characterization. “Telegram is conditionally safe, with specific and documented limitations” is.

