When qualifying privacy recommendations with context, I think we should go further than describing threat models: we should acknowledge different types of privacy. “Privacy” means different things to different people, and even a single person may use the word “privacy” differently depending on their situation. Understanding a user’s unique situation(s), including their threat models, can inform which approach we select. How do we choose between reducing a footprint’s spread and reducing its size?
I think this is an excellent thought, and even if the insight can ultimately be accommodated within the framework of threat models, it's useful as an architectonic principle. Privacy to me is a cluster concept, covering concerns as varied as state surveillance, confidentiality in therapy and being able to sit on the loo in peace!
That said, I think the central distinction of the piece is stated in terms that could be more helpful:
I highlight two main approaches to privacy: “tracking reduction” and “tracking evasion”.
Approach, I fear, is the wrong term, and too general as well. TR and TE seem to be general privacy strategies. Strategy is also a term that resists exact definition, but a helpful starting point might be that a privacy strategy consists of privacy objectives, the ways they can be achieved, and the resources employed. Since the ways of undermining privacy are quite similar (the internet is a mostly open platform full of hyperconnected, often untrustworthy agents) and the means (computer software and hardware) are similar, it seems most prudent to me to make the distinction primarily a matter of privacy objectives and only secondarily of the other factors.
My second concern is how 'data' is employed in the definitions of TR and TE. Reading the main text, it seems to me that the point of tracking evasion is not leaking less data as such, but reducing the range of inferences that can be drawn from it, especially avoiding deanonymization. That risk isn't some monotonically decreasing function of how much data is collected.
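To make the non-monotonicity concrete, here is a toy sketch (entirely made-up data, just for illustration) measuring what fraction of people in a dataset are singled out by a given set of leaked fields. Two narrow quasi-identifiers can deanonymize everyone, while a larger set of broad fields identifies no one:

```python
from collections import Counter

# Hypothetical records: "more fields leaked" does not imply "more identifiable".
people = [
    {"zip": "90210", "birth_year": 1984, "color": "blue",  "browser": "Firefox", "os": "Linux"},
    {"zip": "90210", "birth_year": 1991, "color": "blue",  "browser": "Firefox", "os": "Linux"},
    {"zip": "10001", "birth_year": 1984, "color": "green", "browser": "Firefox", "os": "Linux"},
    {"zip": "10001", "birth_year": 1991, "color": "green", "browser": "Firefox", "os": "Linux"},
]

def unique_fraction(records, fields):
    """Fraction of records whose combined values on `fields` are unique
    within the dataset, i.e. the share of people singled out."""
    keys = [tuple(r[f] for f in fields) for r in records]
    counts = Counter(keys)
    return sum(1 for k in keys if counts[k] == 1) / len(records)

# Two narrow quasi-identifiers single out every record...
print(unique_fraction(people, ["zip", "birth_year"]))       # 1.0
# ...while three broader leaked fields single out nobody.
print(unique_fraction(people, ["color", "browser", "os"]))  # 0.0
```

The point of the sketch is only that identifiability depends on which data leaks and how it combines, not on the raw quantity collected.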
A downstream consequence of this is that the distinction between wants and needs, as in the passage
In other words, TR falls closer to “wants” on the (somewhat contrived) “wants versus needs” spectrum
mostly loses its force. I block adverts online mostly to avoid malware, which is a low-probability threat but nevertheless a pain to deal with, since I need access to a computer to get my needs met. Very casual means suffice to accomplish this, but it's not a mere want out of synchrony with my needs.