Privacy I - Why we need privacy
National security, individual safety, the protection of liberal societies, and democratic safeguards
POSTED ON: 15.11.2022
Errol Drummond
Exclusive content by Zero Knowledge Natives
Privacy, a dangerous necessity
Privacy has been a hot talking point in crypto for a while, with many projects raising vast amounts of funding by explaining how they will achieve privacy; the underlying assumption is that privacy is good. But have you ever stopped to ask whether you are actually convinced that this is true?
If you had to convince somebody, could you argue for the benefits of privacy? Could you name its dangers? Could you say what philosophical standing we should take when creating regulation? If you want to gain a new perspective on this topic, this article series explores why we need privacy for our safety, what the dangers of privacy are, what safeguards we need to protect against those dangers, and what philosophical basis we could adopt.
In truth there is a huge amount to discuss and think about regarding privacy; the ideas presented here are only a starting point.
Why we need privacy
There are many reasons why privacy is needed; we will explore examples related to national security, individual safety, the protection of liberal societies, and democratic safeguards. These lines of reasoning, and the failures they describe, help elucidate how a lapse in privacy puts us all in danger.
National security
Our adoption of privacy measures is somewhat like our adoption of public health practices - you’re in more danger if others don’t care for their hygiene or get vaccinated, regardless of all the precautions you have taken. Privacy lapses in your society that have nothing to do with you can allow attackers to compromise your security. This manifests itself in many ways; the first is in the realm of national security.
As an example, let’s explore the case of Strava, an app that lets you track your running routes and share your progress with others. What’s the danger to national security here? Analysts realised that the public heatmaps of running routes released by Strava could be used to identify secret military bases. Taking it further, it was shown that individual users could then be tracked to the next secret base or missile site they were rotated to. Similar data was used to identify patrol routes on the Turkish-Syrian border. Our public data can be used in ways we never imagined, and actors aligned against us will make use of it if we don’t protect ourselves.
The second example is that of Grindr, a dating app primarily for gay people that takes your email address and your exact location, so that it can tell other users how far away you are in metres. Horny social media or national security issue? Well, it’s both. In fact, the danger of Grindr is a negative externality of exactly what makes it appealing: on Grindr, people have some of their most explicit conversations ever, and share some of their most compromising photos ever.
Grindr was sold to a Chinese company in 2016, meaning the Chinese government could most likely access any saved data if they wished. Do you think any of this data could be used to exploit people? Do you think the specific location data could be used on identified military personnel to track them? These issues abound.
The final example is China’s use of a network of fake LinkedIn accounts that reached out to current and former government employees from the UK, US, Germany, France, and elsewhere, with offers of trips to China and introductions to wealthy contacts. In reality this was attempted espionage. One of the most chilling aspects is that some of these people actively advertised holding security clearances because it would help them in their job search. Public data is a powerful tool for uncovering secret information, or even for recruiting inside spies.
Imagine that any of these people had agreed to go to China and, once there, were shown extremely private data that could be used to manipulate them: perhaps their explicit chats or photos from Grindr, or, in the context of crypto, payment records that helped a foreign intelligence service conclude (and find evidence of) an affair or other blackmail-worthy leverage. What if any of these people had family living in China?
The examples presented here are largely not crypto specific, but they illustrate how data shared for good reasons can be used against us. Not only is the same true for crypto, where any public sharing of information could be used to put ourselves or our nation in danger, but non-private blockchains would hand these bad actors another huge dataset on top of all the vulnerabilities already out there. And financial data is extremely personal; it reveals a lot about you that cannot be seen from other public data.
Protecting liberal democracies & individual safety
One of the core tenets of liberal societies is freedom of expression. If all Web3 actions are publicly visible, you will limit what you do, knowing you may be treated differently for it or simply not wanting to risk embarrassment. But you should be entitled to act as you please within the bounds of the law; for that, we need privacy.
Let’s turn to a very sinister example. Jacobus Lambertus Lentz was the Dutch inspector of Population Registries. He loved statistics. Two months prior to the Nazi invasion of the Netherlands, he proposed a personal identity system wherein all citizens would be required to carry an ID card. His idea was rejected on the basis that it was against Dutch democratic traditions. A few months later, when he proposed the idea to the Nazis, they didn’t say no. A ‘J’ was stamped on the card of every Jew; can you think of a simpler way for a Nazi soldier to identify Jews in person?
At the same time, the chief of the General Statistics Office of France made it clear to the Nazis that they had no idea how many Jews were living in France, or who they were. France also lacked the Netherlands’ extensive data-collection infrastructure, making any such data gathering extremely tricky. The 1941 census attempted to find the Jews within the population. Little did the Nazis know that René Carmille, the man in charge of the census, was a member of the French resistance. The census helped the resistance generate 20,000 fake identities and identify people who might join their cause. The data on who was Jewish was never even tabulated; Carmille never intended to hand such information to the Nazis. He was eventually discovered in 1944 and sent to Dachau, where he died.
How did the accessibility and use of personal data affect the oppression of Jews? In the Netherlands, there were an estimated 140 thousand Jews; 107 thousand of them were deported, and 102 thousand of those were murdered. That is a terrifying 73%, the highest rate in occupied Europe. Of an estimated 300 to 350 thousand Jews in France, 85 thousand were deported and 82 thousand killed - a murder rate of roughly 25%.
It doesn’t take much imagination to realise how dangerous our Web3 data would be in the hands of an oppressive state. And that means not just foreign states; our own democracies survive only as long as we look after them. Any of our wobbles, our experiments with more autocratic government, could harden into something permanent. We should do everything we can to avoid building support structures for such regimes.
Democratic safety
We come now to what is arguably the worst danger: the manipulation of our democracies. This could be done by any group, be it a foreign government or a powerful company. The danger can manifest in many ways, such as funding extreme organisations to help them sow discord and polarise public discussion. But the form we will focus on here is swaying elections through targeted disinformation campaigns, in which the exact story shown to each of us is the one judged most likely to sway us, based on an individual model of us.
It is important to note that you do not need to sway many people; you can flip an election by succeeding with only a couple of percent of the population. Elections that are successfully interfered with also make further interference more likely to succeed. For example, if a minister accepts bribes from a foreign government, then the more power that minister gets, the more sway they have to help other such ministers rise to power too. And once rights are taken away, getting them back is far, far harder than losing them was. With fewer rights comes less leverage to contest bad actors.
Let’s turn to one of the poster children for this: AggregateIQ (AIQ, related to Cambridge Analytica) and Brexit. You’re probably aware that the UK’s referendum to leave the European Union passed by the narrow margin of 52% to 48%. AIQ used troves of data on British citizens, sourced from Facebook, to create individual profiles, determine what information might sway each individual, and then deliver that information as targeted ads that only that person would see. These ads could present two completely opposing stories to different people, regardless of their truthfulness or mutual consistency.
Cambridge Analytica (CA) first got 270,000 Facebook users to fill in psychometric tests to determine their personality types; these users were paid between $1 and $2. CA then used all of Facebook’s data on these individuals to find relationships between their personality types and how they interacted with content on the platform. CA also managed to retrieve data on the Facebook friends of all these users, and from those friends’ interactions with content it projected their personality types too. These models and data were combined with public data on others from data brokers. A psychometric test filled out by a relatively small group, combined with public data, was used to build models through which CA might be able to sway you too - your privacy is not enough, we need privacy *for all* to ensure our safety.
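To make the mechanism concrete, here is a deliberately minimal sketch of that kind of pipeline: fit a simple linear model from "likes" data to survey-measured personality scores, then project scores for people who never took the test. Everything here (the synthetic data, the "openness" trait, the least-squares model) is illustrative only; the real models were proprietary and far more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Users who consented to the psychometric survey: a binary matrix of
# which pages each user "liked", plus their measured openness scores.
n_surveyed, n_items = 1000, 50
likes = rng.integers(0, 2, size=(n_surveyed, n_items)).astype(float)
true_weights = rng.normal(size=n_items)                      # hidden ground truth
openness = likes @ true_weights + rng.normal(scale=0.5, size=n_surveyed)

# Learn the likes -> personality relationship by ordinary least squares.
weights, *_ = np.linalg.lstsq(likes, openness, rcond=None)

# Project a personality score for someone who never took the survey,
# using only their visible interaction data.
new_user_likes = rng.integers(0, 2, size=n_items).astype(float)
predicted = new_user_likes @ weights
print(f"predicted openness score: {predicted:.2f}")
```

The point of the sketch is the asymmetry: only a small group ever consented to the survey, yet the resulting model can be applied to anyone whose interaction data is visible.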
This data and these models were used to send targeted ads related to Trump’s election and the Brexit referendum in the UK. Regardless of your position on these topics, surely you agree that we should not allow citizens to be manipulated in this manner. Public debate is essential for these public decisions, but presenting completely different stories to each individual produces a different perceived reality for each of us; how can we work together to make joint decisions if we all see something completely different?
Creating safety
The more data that is out there, the better these models can be made. And the more types of data that reveal aspects of our personalities, the more effective they become. Combining social media activity with location data, and then adding in all of your financial activity, can breed something far more powerful than any single source alone.
It might seem like these models can only get better - but that’s not true. These models become less useful as the data they are built on ages. So what can we do to ensure safety?
Of course the primary answer is the need for privacy-first blockchains such as Aleo; but that is not enough, and there are other things we should think about. The first is awareness of these dangers and of how you might be susceptible, so that you can recognise when you or your friends begin to fall into any of these traps. The second, which will be the focus of the next article, is the need to strengthen our institutions and regulations.
In 2020 Britain’s Intelligence and Security Committee’s report into Russian interference in the Brexit referendum concluded that
“The written evidence provided to us appeared to suggest that HMG [Her Majesty’s Government] had not seen or sought evidence of successful interference in UK democratic processes or any activity that has had a material impact on an election, for example influencing results.”
The British government didn’t even try to check whether there had been interference. If our institutions aren’t trying to protect themselves, and if we have no mechanism through which to force such an investigation, then we can do nothing to ensure we continue to live in safe and free nations; nothing, at least, other than naive hope. In the realm of crypto, we are going to need regulation to ensure that bad actors don’t abuse privacy so much that it does more harm than good.
Further reading
Stay tuned for the next article, and if you are interested in developing our democracies to give people a say - so that they could, for example, force a general election or an investigation into election meddling - check out Dynamic Democracy.
Many of the examples presented here come from *Privacy is Power* by Carissa Véliz; it’s worth a read if you’re interested in a wider picture of the topic of privacy.