
This interesting paper was referenced in the latest Spiral newsletter and it really got me thinking. It's a little dense, so I'm doing my best to summarize here:
If you use Tor to obscure your activity on the internet, you are already leaking some information simply by using Tor. You might assume that your anonymity set is all the people who use Tor, or more specifically all the people who used Tor at the same time as you. One common way to measure your anonymity is in bits of entropy.
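The paper calls this set-size style of measurement the "entropist" approach. As a minimal sketch (with made-up numbers, purely for illustration), here is how entropy over an anonymity set is typically computed, and why side information that skews the distribution deflates it even when the nominal set size stays the same:

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy in bits of a probability distribution over suspects."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform anonymity set of N equally likely users gives log2(N) bits.
uniform = [1 / 1024] * 1024
print(shannon_entropy_bits(uniform))  # 10.0 bits, i.e. "1 in 1024"

# If an attacker's side information makes one user far more likely,
# the entropy collapses even though 1024 users are still "in the set".
skewed = [0.9] + [0.1 / 1023] * 1023
print(shannon_entropy_bits(skewed))  # well under 2 bits
```

This is exactly the kind of number the paper argues against leaning on: it says nothing about how much work the attacker had to do to obtain that skewed distribution in the first place.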
However, the author of this paper argues that this is a misunderstanding of the situation. Not all Tor entry nodes are benevolent actors. What if you connect to a malicious entry node? And what if whoever runs that node also controls the exit node you end up using? In that case it is trivial to track your activities.
Another example: if you used a specific coordinator to do coinjoins, you might think your anonymity set is all coinjoins arranged by that coordinator. However, if the legal names of the people who run the coordinator are known, and they happen to keep logs of the coordinator at their place of business or residence, it might not be that difficult for an attacker to raid them and seize those logs. In that case, your anonymity set is significantly smaller than you thought.
The author of the paper argues that we need better ways to talk about anonymity, ways that account for the amount of effort an attacker must exert to reduce it. In the case of Tor, an attacker must exert quite a lot of effort if you only ever use trusted entry nodes run by a group of your friends. But if you connect to entry nodes at random, the attacker may need to exert much less effort.
[The adversary] strives to reduce his uncertainty about linking senders and recipients, but the measure of his ability to succeed is not the size of those sets or probability distributions on them. Rather it is the resources it will take him to place himself in a position to learn those linking relations.
[The system of measuring anonymity that focuses on the size of the set of known senders or receivers or utxos in a coinjoin--called the entropist approach] will yield system designs that might be useful in controlled settings like elections, where we can control or predict the number and nonidentity of participants and where anonymity within expected-size sets is useful. But the entropist approach is not appropriate for general communication on large diversely shared networks like the internet.
Anonymity is really hard. Reminds me yet again how impressive it is that we still don't know who Satoshi is.
@kruw
Another example: if you used a specific coordinator to do coinjoins, you might think your anonymity set is all coinjoins arranged by that coordinator. However, if the legal names of the people who run the coordinator are known and they happen to keep logs of the coordinator at their place of business or residence, it might not be that difficult for an attacker to raid them and seize such logs. In which case your anonymity set is significantly less than you thought.
Clients are designed not to share any data with coinjoin coordinators. However, an operator could keep logs of metadata (timing of registrations/deregistrations), which is why you should avoid failing too many rounds.
reply
This is helpful, thanks. I had it in my mind that a coordinator could do some amount of work to deanonymize things if they had a complete list of all the transactions and where they entered from.
In my mind it is kind of like the Tor entry node situation. If a coordinator knows the address you came from, and that info is cross-referenced with some gov't KYC list, wouldn't that allow them to figure out a fair bit about you?
Also, I should note I was primarily thinking of the Samourai 5-UTXO round model, not so much the WabiSabi model.
reply
Is your anonymity set what you think it is?
Almost certainly not :(
reply
Reminds me yet again how impressive it is that we still don't know who Satoshi is.
truuuuu-lyyyy
reply

Privacy Is Boring, Until You Lose It

Privacy doesn’t sell. It doesn’t trend. It doesn’t give you dopamine hits or push notifications. Compared to sleek new features or viral posts, it feels outdated, like locking your door in a neighborhood where no one’s been robbed. Until you are.
Then it gets real.
[...]
reply
You are absolutely right in what you say; we often think we are safe until our carelessness catches up with us.
reply
I like to take care of my safety and that of my SATS! This was an interesting read! Thanks for sharing ⚡
reply
feels like most people confuse plausible deniability with real anonymity. tor, coinjoins, whatever, it’s all just delaying correlation unless you actually control the edges. if your adversary has time, money, and legal reach, your “anon set” is probably just a list they’re waiting to cross reference later.
reply
Great post. Most people confuse "obscurity" with true anonymity. If your threat model includes nation-states or well-resourced actors, your "anonymity set" can collapse fast. Trusting entry nodes or central coordinators is a gamble—entropy doesn't equal safety.
reply