Personal privacy is a fairly new concept. Most people used to live in tight-knit communities, constantly enmeshed in each other's lives. The notion that privacy is an important part of personal security is even newer, and often contested, while the need for public security -- walls which must be guarded, doors which must be kept locked -- is undisputed. Even anti-state anarchists concede the existence of violent enemies and monsters.
Rich people can afford their own high walls and closed doors. Privacy has long been a luxury, and it's still often treated that way: a disposable asset, a nice-to-have, not essential. Reinforcing that attitude is the fact that it's surprisingly easy, even instinctive, for human beings to live in a small community -- anything below Dunbar's Number -- with very little privacy. Even I, a card-carrying semi-misanthropic introvert, have done that for months at a stretch and found it unexpectedly, disconcertingly natural.
And so when technological security is treated as a trade-off between public security and privacy, as it almost always is these days, the primacy of the former is accepted. Consider the constant demands for "golden key" back doors so that governments can access encrypted phones which are "going dark." Opponents of such back doors focus on the fact that any such system would inevitably be vulnerable to bad actors -- hackers, stalkers, "evil maids." Few dare suggest that, even if a perfect magical golden key with no vulnerabilities existed, one which could only be used by government officials within their official remit, the question of whether it should be implemented would still be morally complex.
Consider license plate readers that soon enough will probably track the locations of most cars in California in near-real-time with remarkable precision. Consider how the Golden State Killer was identified, by trawling through public genetic data to look for family matches; as FiveThirtyEight puts it, "you can't opt out of sharing your data, even if you didn't opt in" -- not any more. Which would be basically fine, as long as we can guarantee hackers don't get their hands on that data, right? Public security -- catching criminals, preventing terror attacks -- is far more important than personal privacy. Right?
Consider too corporate security, which, like public security, is inevitably assumed to be far more important than personal privacy. Until recently, Signal, the world's premier private messaging app, used a technical trick known as "domain fronting" on Google and Amazon web services to provide access in countries which had tried to ban it. Then, this month, Google disabled domain fronting and Amazon threatened to terminate Signal's AWS account, because the privacy of vulnerable populations is not important to them. Consider Facebook's countless subtle assaults on personal privacy, in the name of connecting people -- which happens to be how Facebook becomes ever stronger and more inescapable -- while it maintains much stronger controls for its own employees and data.
But even strict corporate secrecy just reinforces the notion that privacy is a luxury for the rich and powerful, an inessential. It wouldn't make that much difference if Amazon or Facebook or Google or even Apple were to open up their books and their roadmaps. Similarly, it won't make that much difference if ordinary people have to give up their privacy in the name of public security, right? Living in communities where everyone knows one another's business is natural, and arguably healthier than the disjoint dysfunction of, say, an apartment building whose dozens of inhabitants don't even know each other's names. Public security is essential; privacy is nice-to-have.
...Except this dichotomy between "personal privacy" and "public security," all too often promulgated by people who should know better, is completely false, a classic motte-and-bailey argument in bad faith. When we talk about "personal privacy" in the context of phone data, or license plate readers, or genetic data, or encrypted messaging, we're not talking about anything even remotely like our instinctive human understanding of "privacy," that of a luxury for the rich, inessential for people in healthy close-knit communities. Instead we're talking about the collection and use of personal data at scale; governments and corporations accumulating massive amounts of highly personal information from billions of people.
This accumulation of data is, in and of itself, not a "personal privacy" issue, but a massive public security problem.
At least three problems, in fact. One is that the lack of privacy has a chilling effect on dissidence and original thought. Private spaces are the experimental petri dishes for societies. If you know your every move can be watched and your every communication can be monitored, such that private spaces effectively don't exist, you're much less likely to experiment with anything edgy or controversial; and in this era of cameras everywhere, facial recognition, gait recognition, license plate readers, Stingrays, etc., your every move can indeed be watched.
If you don't like the ethos of your tiny community, you can move to another one whose ethos you do like, but it's a whole lot harder to change nation-states. Remember when marijuana and homosexuality were illegal in the West? (As they still are, in many places.) Would that have changed if ubiquitous surveillance and at-scale enforcement of those laws had been possible, back then? Are we so certain that all of our laws are perfect and just today, and that we will respond to new technologies by immediately regulating them with farsighted wisdom? I'm not. I'm anything but.
A second problem is that privacy eradication for the masses, coupled with privacy for the rich, will, as always, help to perpetuate status-quo laws, standards, and establishments, and encourage parasitism, corruption, and crony capitalism. Cardinal Richelieu famously said, "If one would give me six lines written by the hand of the most honest man, I would find something in them to have him hanged." Imagine how much easier it gets if the establishment has access to everything any dissident has ever said and done, while maintaining its own privacy. How long before "anti-terrorism" privacy eradication becomes "selective enforcement of unjust laws" becomes "de facto 'oppo research' unleashed on anyone who challenges the status quo"?
A third problem is that technology keeps getting better and better at manipulating the public based on their private data. Do you think ads are bad now? Once AIs start optimizing the advertising → behavior → data feedback loop, you may well like the ads you see, probably on a primal, mammalian, limbic level. Proponents argue that this is obviously better than disliking them. But the propaganda → behavior → data loop is no different from advertising → behavior → data, and no less subject to "optimization."
When accumulated private data can be used to manipulate public opinion on a massive scale, privacy is no longer a personal luxury. When the rich establishment can use asymmetric privacy to discredit dissidents while remaining opaque themselves, privacy is no longer a personal luxury. When constant surveillance, or the threat thereof, systematically chills and dissuades people from experimenting with new ideas and expressing contentious thoughts, privacy is no longer a personal luxury. And that, I fear, is the world we may live in soon enough, if we don't already.