Is “KAX17” performing de-anonymization Attacks against Tor Users?

Hashtag: #KAX17

Two years ago, in December 2019, I first wrote about a particular and unusual malicious actor on the tor network. This blog post is about how that actor expanded their visibility into the tor network during the last two years after their removal by the tor directory authorities in October 2019, and why this particular actor is more concerning than the usual malicious tor relay group.

The threat landscape on the tor network motivates the second part, in which we outline a design and proof of concept implementation that helps tor users defend themselves and significantly reduce their risk of using malicious tor relays without requiring the identification of malicious relays, a problem that has become impractical to tackle.

Major Tor Network Threat Actors

Actor “BTCMITM20” Profile

  • sophistication: amateur level but persistent and large scale
  • operated relay types: exit relays
  • (known) concurrently running relays peak: >350 relays
  • (known) advertised bandwidth capacity peak: 40 Gbit/s
  • (known) exit probability peak: 27%
  • primary motivation: financial profit (by replacing bitcoin addresses in tor exit traffic)
  • defenses: easy; HSTS preloading for website operators; on tor clients: ensure HTTPS is used properly.

past blog posts about this actor:

Actor “KAX17” Profile

  • sophistication: non-amateur level and persistent
  • uses large amounts of servers across many (>50) autonomous systems (including non-cheap cloud hosters like Microsoft)
  • operated relay types: mainly non-exit relays (entry guards and middle relays) and to a lesser extent tor exit relays
  • (known) concurrently running relays peak: >900 relays
  • (known) advertised bandwidth capacity peak: 155 Gbit/s
  • (known) probability to use KAX17 as first hop (guard) peak: 16%
  • (known) probability to use KAX17 as second hop (middle) peak: 35%
  • motivation: unknown; plausible: Sybil attack; collection of tor client and/or onion service IP addresses; deanonymization of tor users and/or onion services

past blog post about KAX17:

We consider it less likely that KAX17 and BTCMITM20 are the same actor, but due to some minor overlap we have not yet ruled out the possibility of some limited form of collaboration between these actors. The remainder of this blog post is about KAX17 only.

What visibility into the tor network did KAX17 have during the past 3 years?

Figure 1: Guard, middle and exit probability by KAX17's relays between 2019–01–01 and the removal event on 2021–11–08. Graph by nusenu (raw data source: Tor Project/onionoo)

After I reported the exit relays (at the time I did not know they were part of KAX17) they got removed in October 2020, but I do not believe that halted their exit operations completely: coincidentally, a new large no-name exit relay group was born the day after their removal. That new group is not included in figure 1 because it cannot be attributed to KAX17 using the same strong indicator.

To provide a worst-case snapshot, on 2020–09–08 KAX17's overall tor network visibility would allow them to de-anonymize tor users with the following probabilities:

  • first hop probability (guard): 10.34%
  • second hop probability (middle): 24.33%
  • last hop probability (exit): 4.6%

As middle and exit relays are changed frequently, the likelihood of using KAX17's relays increases with tor usage over time. We have no evidence that they are actually performing de-anonymization attacks, but they are in a position to do so, and the fact that someone runs such a large fraction of the network with relays “doing things” that ordinary relays cannot do (intentionally vague) is enough to ring all kinds of alarm bells.
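To put the snapshot numbers in perspective: for a classic end-to-end correlation attack the adversary needs to control both the entry guard and the exit of the same circuit. Assuming those positions are chosen independently (a simplification that ignores bandwidth weighting and tor's path restrictions), a rough single-circuit estimate looks like this:

```python
# Rough single-circuit estimate of end-to-end correlation risk,
# using the 2020-09-08 snapshot probabilities quoted above.
# Simplifying assumption: guard and exit are picked independently.
p_guard = 0.1034  # probability the first hop (guard) is a KAX17 relay
p_exit = 0.046    # probability the last hop (exit) is a KAX17 relay

p_end_to_end = p_guard * p_exit
print(f"{p_end_to_end:.2%}")  # ~0.48% per circuit
```

The per-circuit number looks small, but since clients build many circuits over time, and guards rotate only slowly, the cumulative exposure grows with usage, which is exactly the point the paragraph above makes.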

In the course of 2020, large amounts of suspicious non-exit relays joined the network and were reported to the Tor Project, but since reported relays were no longer getting removed, I sent the reports to the public tor-talk mailing list as the group's capacity continued to increase (2020–08–20, 2020–09–22).

The unexpected hint towards a better understanding of the mystery

Eventually we found strong indicators that the suspicious non-exit relays reported throughout 2020:

  • are in fact operated by a single entity and
  • all of them are actually part of KAX17.

What is special about KAX17?

KAX17's involvement in tor-relays policy discussions

Self-defense: Helping tor users help themselves

In the past I have always been reluctant to make tor client configuration changes that affect path selection because it makes a tor client theoretically stand out, but in the light of the tor network’s threat landscape I consider it a reasonable (for some threat model even a necessary) act of self-defense to stop using untrusted relays for certain path positions to reduce the risk of de-anonymization and other types of attacks — even if that is a non-default configuration.

To achieve that goal tor clients would need to:

(1) configure trusted operators or learn about them via so-called trust anchors

(2) automatically enumerate all relays of trusted operators

(3) automatically configure the tor client to use only trusted relays for certain positions like entry guard and/or exit relay
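Step (3) ultimately maps to standard tor configuration options. A hand-written sketch of what the automatically generated configuration could look like (the fingerprints below are placeholders, not real relays):

```
# torrc fragment (sketch): restrict path positions to trusted relays
# AAAA.../BBBB... are placeholder relay fingerprints
EntryNodes AAAA...
ExitNodes BBBB...
# fail closed instead of falling back to untrusted relays
StrictNodes 1
```

`EntryNodes`, `ExitNodes` and `StrictNodes` are existing torrc options; the design's contribution is generating and maintaining these lists automatically from operator-level trust.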

The design allows tor users to assign trust at the operator level and extends that trust to all the relays an operator manages. This should ensure scalability and be less fragile to changes, such as when new relays get added or replaced.

Non-spoofable operator identifiers

To prove which relays she operates, an operator (let's call her Alice) would:

  1. add proof:uri-rsa ciissversion:2
    to her relay’s ContactInfo
  2. publish the relay’s fingerprint at the IANA registered well-known URI:
    (or create a DNS TXT record).
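As a sketch of the verification side, a client can fetch the proof file from the operator's domain and check that it lists the relay's fingerprint. The well-known path below is my reading of the ContactInfo Information Sharing Specification (CIISS) and should be checked against the spec; a hardened implementation would also perform this fetch over tor itself:

```python
import urllib.request

# Well-known path per the CIISS uri-rsa proof method (treat as an
# assumption; consult the specification for the authoritative path).
WELL_KNOWN_PATH = "/.well-known/tor-relay/rsa-fingerprint.txt"

def parse_proof(text: str) -> set[str]:
    """Parse a proof file: one relay fingerprint per line,
    lines starting with '#' are comments."""
    fingerprints = set()
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            fingerprints.add(line.upper())
    return fingerprints

def domain_proves_relay(domain: str, fingerprint: str) -> bool:
    """Fetch the proof file over HTTPS and check whether it lists
    the given relay fingerprint."""
    url = f"https://{domain}{WELL_KNOWN_PATH}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        text = resp.read().decode("utf-8", "replace")
    return fingerprint.upper() in parse_proof(text)
```

Because the proof lives on a domain the operator controls, a third party cannot claim someone else's relays without also controlling that domain, which is what makes the identifier non-spoofable.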

In the past few months the proven domain has already been widely adopted by most large exit operators of the tor relay community, and currently over 50% of the tor network’s exit capacity is covered (more is better):

Figure 2: >50% of the tor network’s exit capacity has proven their domain according to the CIISS specification. Source: nusenu (an interactive version of the graph can be found at OrNetStats)

This provides tor users with non-spoofable, automatically verifiable operator identifiers they can assign trust to. It is important to stress that proven domains are not implicitly trusted: malicious groups can also prove their domain. It is only the first step, an identifier that users can choose to trust. The adoption of proven operator domains for guard relays is significantly lower (~10% guard probability); until that fraction increases, users could configure a trusted relay as a bridge to reduce their chance of using malicious guards.

Trusting operator domains
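As an illustration, the user-maintained trust list could be as simple as a plain text file with one operator domain per line (the domains below are placeholders, not an endorsement of any real operator):

```
# trusted-operators.txt (sketch)
# one proven operator domain per line; '#' starts a comment
example-operator-a.org
example-operator-b.net
```

Which domains belong in this file is entirely the user's decision; the design only makes the mapping from a chosen domain to that operator's relays automatic and verifiable.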

Proof of concept implementation

It is implemented as a python script that talks to the local tor client via its control port/socket, reads a local file with a list of trusted operator domains, verifies relay/domain proofs (via tor), and configures the tor client to use only exit relays run by trusted operators. The list of trusted operators is defined by the user.
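A simplified sketch of that approach using the stem control-port library might look like the following. This is not the actual proof of concept: the contact-matching heuristic, trust list and control-port details are illustrative assumptions, and real proof verification (as described above) is omitted for brevity:

```python
TRUSTED_DOMAINS = {"example-operator-a.org"}  # placeholder trust list

def contact_matches(contact: str, domains: set[str]) -> bool:
    """Heuristic: does the relay's ContactInfo claim one of the trusted
    domains via a 'url:' field? (The claim still needs proof verification.)"""
    return any(f"url:{d}" in contact for d in domains)

def restrict_exits(control_port: int, domains: set[str]) -> list[str]:
    """Collect exits whose ContactInfo claims a trusted domain and pin
    the local tor client to them via ExitNodes/StrictNodes."""
    from stem.control import Controller  # third-party: pip install stem

    with Controller.from_port(port=control_port) as ctl:
        ctl.authenticate()
        trusted = []
        # note: full server descriptors may require "UseMicrodescriptors 0"
        for desc in ctl.get_server_descriptors():
            contact = desc.contact or b""
            if isinstance(contact, bytes):
                contact = contact.decode("utf-8", "replace")
            if desc.exit_policy.is_exiting_allowed() and contact_matches(contact, domains):
                trusted.append(desc.fingerprint)
        if trusted:
            ctl.set_conf("ExitNodes", ",".join(trusted))
            ctl.set_conf("StrictNodes", "1")
        return trusted
```

A user would run something like `restrict_exits(9051, TRUSTED_DOMAINS)` against a tor client with its ControlPort enabled; keeping the trust decision in a user-edited domain list is what makes the scheme scale across relay churn.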

We would like to turn this into a production-ready implementation and might have an update on that within the next few months.


  • KAX17 has been running relays in all positions of a tor circuit (guard, middle and exit) across many autonomous systems putting them in a position to de-anonymize some tor users.
  • Their actions and motives are not well understood.
  • We found strong indicators that a KAX17 linked email address got involved in tor-relays mailing list discussions related to fighting malicious relays.
  • Detecting and removing malicious tor relays from the network has become an impractical problem to solve.
  • We presented a design and proof of concept implementation towards better self-defense options for tor clients to reduce their risk from malicious relays without requiring their detection.
  • Most of the tor network’s exit capacity (>50%) supports that design already. More guard relays adopting the proven domain are needed (currently at around 10%).



Figure 3: Running KAX17 relays and their advertised bandwidth since 2019–01–01. Graph by nusenu
Figure 4: Running KAX17 middle-only relays show a monthly pattern. Starting on the first day of each month. Graph by nusenu
Figure 5: Running KAX17 guard relays do not show the same monthly pattern (also because the guard flag has some requirements that relays regularly disappearing do not meet). Graph by nusenu


