Why Privacy Failed - ZK Part 2

April 18, 2023


Written by: @luffistotle

Designing Better Privacy

I ask myself this question quite frequently: why has privacy failed to date? Both as a concept and as a practical matter, we see a general lack of urgency. When you talk to somebody, few, if any, reject the idea of privacy, and yet adoption of privacy-preserving measures remains largely untapped. By this, I mean across all of the web, not just Web 3.

While privacy is a more nebulous concept, in Web 3 we can generally distill the space into two verticals: private blockchains (largely financial) and private data (personal data ownership). There are Monero zealots out there, but for the most part privacy projects are abject failures (in terms of adoption) despite this widely expressed desire for privacy.

Why is there so much religious imagery in Zee Prime writing?

The most common rebuttal to privacy stans comes in the following form:

“If you’re not doing anything wrong, why do you care?”

It is a perfect strawman argument, and frankly, one that is hard to disarm. It comes off as powerful and cleverly shifts the Overton window to a default of little privacy, removing the necessary question of whether institutions should have such access in the first place.

And while industry lobbyists push for consumer protections and more, adoption in the data space remains stagnant unless users are prompted with no other option.

While some options exist that provide immediate marginal gains in privacy, their adoption remains relatively stunted. Furthermore, we can look towards better privacy-preserving products such as Signal, the encrypted messaging platform, which has flatlined at around 40 million users over the last couple of years, or the Brave browser, which has also seen growth flatline.

The most well-adopted measure to date is likely the VPN market. Estimated at around $44 billion in 2022 and growing at 17% a year, this would seem like considerable uptake, and yet the average buyer makes the leap to paying for a VPN only partially for its privacy-preserving characteristics, and more for its commercial utility (such as watching Netflix shows by changing countries).

Give the people what they want

Perhaps one of the most interesting, and often overlooked, aspects of this is the lack of competitive alternative narratives against the strawman argument above. Privacy maximalists often espouse the ideals of libertarian freedom, but aside from the abstract idea of Privacy™, they fail to provide a compelling counter-narrative.

Consequently, any added friction, particularly in the face of, say, governmental authority, is simply not worth it to the majority, especially if they are in fact doing nothing wrong. While people can easily get behind the idea of having some personal privacy, they do not have a strong desire to be subversive in the eyes of their government.

But this is not the only added friction. In fact, to push this point further, I believe that privacy is in many ways at odds with good experiences. This is not just an online phenomenon. Think about the times you experienced some kind of luxury, white-glove service: the experience is optimized around the forfeiture of personal information.

What juices do you like most in the morning? How do you like your room temperature when you head to bed? What type of cooking do you like? How do you like your steak done?

This forfeiture of personal information enhances the experience and increases relevance. The more data revealed, the greater the experience (the closer to your desired optimum). I posit that because the feedback loop is more tactile, people don’t view this through the same lens as, say, a digital experience. Ultimately this hits on a more important point: grasping the relative concerns about digital privacy/sovereignty is not just less tactile, it’s also much less intuitive. How do we share relevant information without destroying privacy elsewhere?

Human beings happen to be rather bad at thinking in a Bayesian way. In fact, the average person probably does not even know what Bayesian means. Understanding the implications of privacy-focused decisions in your online presence is an exercise in understanding fat tails, which adds another level of complexity on top of thinking probabilistically. And while this is not an advocacy piece for Bayesian reasoning, most privacy maximalists focus on exactly these tail events, while neglecting to realize that their ramifications simply do not resonate with the average person - the person needed for mass adoption.

Understanding these issues across data privacy is also highly technical. As BEW from Yunt Capital pointed out to me, those typically focused on privacy are usually well above average in their technical understanding (think 2 standard deviations) and also financially secure. What does this mean?

Privacy is a higher-order good. Kind of.

For the average person struggling to fulfill their basic needs, this “product” gets pushed down the priority list. I notice in this industry we often forget to step out of our bubble and ground ourselves in the context of “the normie”. How would they look at the products and solutions you are building? The pejorative normie is the ultimate customer/user base, the ultimate bag buyer, for your products/assets, but for them to buy in, it must make sense to them. This does not mean privacy is expensive to have - that would be a half-truth. Certain forms of privacy are quite inexpensive.

So this is not the whole story.

Where privacy sits in the hierarchy of needs is highly dependent on additional context. What kind of state do you live in? In the West it is a higher-order good: freedom of expression has generally been high and wrongful institutional persecution relatively low, and as a result, the need for private thought can be relatively undervalued. For someone living through a socialist regime, by contrast, privacy is a matter of life or death depending on the subversiveness of the thought.

To simplify: external factors dictate this positioning. More specifically, how one evaluates the threat of violence from individuals vs. institutions will dictate the relative importance. And while history has taught us we should likely fear institutions much more than the individual, we hope that the directional arrow of societal progress will push us towards a baseline from which we can shift our focus towards the individual. But this is what makes the conversation around privacy so murky - it’s a difficult web to untangle.

Flipping back to online experiences, allowing data collection on yourself (the posts you like, your search and browsing history, etc.) improves curation and allows for the delivery of more relevant ads (things you might actually buy). The alternative, when we think about it, is a bit of spray and pray.

Given that our starting point is a high amount of data already being out there, we are forced to reconcile that a change towards privacy is actually a worsening of the user experience.

So what are the alternatives?

Is this really the answer?

New methods of content curation that could provide a similar level of experience are needed in the face of this potential trade-off.

More often than not, the big unlock described in Web 3 to solve this is owning your data. This is basically the idea that instead of Google owning all the data it collected on you, you own it and permission who sees and uses it. And while this idea makes the libertarians in the room feel warm and fuzzy, the reality is that the added friction (active management of their data, potentially less relevant content) is not overcome by the relative economic gain these users would see (fees for access to their data). Even if such a solution were widely adopted, it is easy to imagine 90%+ of users defaulting to the maximum monetization setting, and at the end of the day they (the big corporate boogie men) end up with the same information. The relative order of operations is important here.

In Web 3, public financial transactions extend this beyond behavioural and characteristic data. This adds a level of social anxiety previous systems did not have. As a society, we already struggle with our relationship with wealth, self-worth and societal status. Going back to our normies, it is clear they need a solution to this.

While private blockchains exist, it’s pretty clear the entire structure of existing society will not let them fly, and some sort of balance is needed. While people can dream anarchist fantasies about Monero and co., the harsh reality is that these will probably never be widely adopted and will continue to be excluded from any real mainstream services as long as today’s governance systems continue to exist. This is because they carry externalities that are (mostly) universally accepted as intolerable (enabling bad guys to do bad things).

Do you honestly think the idea of governments is going anywhere?

Their existence lies on the extreme end of the privacy spectrum and is at odds with the continued existence of functioning governments. This is not explicit support for a highly paternalistic state dictating terms. It is an acknowledgement that there is some level of behaviour we can deem universally unacceptable in any functioning, organized society. As a result, widely adopted privacy solutions MUST be able to work around this. How we moderate this base level of acceptable behaviour is a question for later. For now, we can agree that stopping money launderers, thieves, and fraudsters is a good thing, and we should hope to be able to do this in any advanced society.

This is not “compliant” privacy, this is privacy for the good guys. It is the establishment of society.

So as we seek to construct a model for what a good privacy solution looks like we can see we must balance three things:

  1. A desire to minimize social anxiety
  2. A desire to be in general alignment with one’s government/society
  3. A UX that is not inherently worse than the alternative

What could come from this one day is something far more interesting. If people felt they were not being watched, how would their behaviour change? If we create a sufficient level of comfortable privacy, can we increase the relevance of data? Failures of imagination plague privacy solutions: privacy is not the product, it is a feature.

Modern Solutions to Modern Problems

The problem we identified, then, is that privacy solutions are often at odds with both the consumer experience (and at this point, the expectation of that experience) and the overarching desire to live in an organized system (to be a good citizen within one’s own view of a governmental, universalist model).

What paths do we have, then, to solve this? Some people might hate this, but what it really means is the possibility and inevitability of compliant, rules-based privacy solutions. There is obviously variance in the degree of rule creation, but as we discussed earlier, some level is needed. Thus it is time to reframe the privacy narrative. While these methods are currently used for compliance with the paternalistic state, the same systems can also bring our own organization to the badlands. These same tools of compliance allow us to create laws for our decentralized society about what behaviour is and is not acceptable.

In Part 1 of our ZK series, we mentioned private computation as one of the key use cases of the technology. This concept is fairly easy to grasp; the most famous illustration is perhaps the Ali Baba cave. Thinking about our model of privacy, we can see how these characteristics of ZK can help resolve the paradoxical conditions we set out in designing a good (commercially viable) privacy solution.
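To make the idea of proving a statement without revealing the underlying secret more concrete, here is a minimal Python sketch of another classic construction, the Schnorr identification protocol, over a toy group. The parameters and values are illustrative assumptions on my part (not anything from Part 1); real systems use large groups and non-interactive proofs. The prover convinces the verifier that it knows the secret x behind the public value y without ever sending x.

```python
# A minimal sketch of an interactive zero-knowledge-style proof: the Schnorr
# identification protocol over a tiny toy group. Parameters are chosen purely
# for illustration; real systems use large groups and non-interactive proofs.
import secrets

p, q, g = 23, 11, 2            # g = 2 generates a subgroup of order q = 11 mod p = 23

# Prover's secret x and the corresponding public value y = g^x mod p.
x = 7
y = pow(g, x, p)

# 1. Prover commits to a random nonce r.
r = secrets.randbelow(q)
t = pow(g, r, p)

# 2. Verifier issues a random challenge c.
c = secrets.randbelow(q)

# 3. Prover responds with s = r + c*x (mod q); s alone reveals nothing about x.
s = (r + c * x) % q

# 4. Verifier checks g^s == t * y^c (mod p): the prover knows x without showing it.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted")
```

The random challenge is what keeps a prover who does not actually know x from passing, the same intuition as the repeated left-or-right choice in the cave story.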

These observations can help us posit that pseudonymity is a great middle ground, where we receive the purported benefits, while also achieving the fundamental objectives around social anxiety and a broader ability to prevent intolerable behaviour (who doesn't want to stop the bad guys when we can?). And to do this, zero-knowledge proofs will sit at the centre of all of it.

As a result, we can explore what a solution in this area might look like.

Silent Protocol ticks all the boxes of what we think a successful, massively adopted privacy protocol will look like. It facilitates an enhanced level of pseudonymity, while also offering the ability to be preventative in the face of intolerable behaviour, such as stopping money launderers and thieves. The team at Silent has an extremely clever design that allows users to tap into any application on mainnet in a private, pseudonymous way.

A simplistic way to view this: if Tornado Cash was BTC, Silent Protocol is Ethereum, bringing the ability to use applications on mainnet such as Aave. The unified front end offers access to mainnet dapps in a single location. The clever design means new dapps can be integrated on short timelines (a day or two), creating a hub of useful applications that taps into existing pools of liquidity without the need to batch transactions, unlike other privacy solutions offering similar feature sets (such as Aztec).

The recent “counter hack” by Oasis (the Maker front end) in accordance with law enforcement is the perfect exemplification of how things are likely to continue to develop. And while many cry “but muh decentralized permissionless protocols!!!”, in general it is a step in the right direction. To tie this back to our earlier points, normies do not want to interact with things where their assets can be stolen and gone forever with zero recourse. This is simply a non-starter, and until there are robust tools for preventing such events, their participation will remain minimal. Naturally, the question becomes: how do you manage the level of compliance you are providing (i.e. how do you check the state’s desire for unending reach)? Fortunately for permissionless systems, we can construct governance systems to manage these requests and, as a result, police the behaviours we deem universally unacceptable.

Radical Solutions to Modern Problems

Rooted deeply in founding crypto and cypherpunk culture, there is an alternative path. While I have spoken confidently on the reasons why I think privacy solutions have failed in the past and what I surmise the future to look like, I cannot discuss the subject comprehensively without acknowledging the other path before us.

Perhaps more relevant is to address the overarching philosophical question of whether the state has the right to intercede in the first place.

Darkfi is a layer one leveraging ZK, multiparty computation, and homomorphic encryption to bring the highest levels of anonymity, both in usage and in engineering. For the uninitiated, homomorphic encryption is basically an encryption technique that allows computation/operations on encrypted data. And while this is relatively new - especially in practical application - it will undoubtedly be an interesting experimental field where new primitives are formed. Historical cypherpunk narratives around privacy have focused on the negative, for example freedom “from” coercion and violence. Darkfi has instead chosen to focus on the creation of new worlds and primitives in this environment, including societies with their own sets of moral guidelines - choosing optimistic narratives.
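To illustrate what “computation on encrypted data” means, below is a toy Python sketch using the multiplicative homomorphism of textbook (unpadded) RSA. This is an assumption-laden toy of my choosing, not how Darkfi or any production scheme works, but the core idea is the same: a third party can operate on ciphertexts while only the key holder can decrypt the result.

```python
# A toy demonstration of the homomorphic property using textbook (unpadded) RSA:
# multiplying two ciphertexts yields a ciphertext of the product of the plaintexts.
# Tiny parameters for illustration only; real schemes (Paillier, BFV, CKKS) differ.

p, q = 61, 53
n = p * q                              # modulus, 3233
e = 17                                 # public exponent
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent (modular inverse of e)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
ca, cb = encrypt(a), encrypt(b)

# Operate on the encrypted values without ever decrypting them...
c_product = (ca * cb) % n

# ...and the decrypted result is the product of the original plaintexts.
assert decrypt(c_product) == a * b     # 42
print(decrypt(c_product))
```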

To go a step further, we ultimately need both systems of privacy running in parallel, as a method to ensure the creation of novel, heterodox ideas and to offer an escape hatch from institutional violence. These are often the areas where we can expect to reap the benefits of the non-linearity of complex systems. Such environments are necessary for the continued innovation we depend on. This is not to say innovation does not happen in the other stream as well; rather, this is a means of ensuring the collective “we” maintains exposure to both styles of innovation - radical and iterative. While a compliant solution focuses on reducing the risk of individual violence, the non-compliant stream ensures there exists an alternative in the event of institutional violence. It is the escape hatch for the intolerable institutional tail event.

Zero Knowledge Privacy

While they are not the most loved ideas within the crypto-native community, I am optimistic that rules-based privacy solutions will actually lead to increases in the aggregate privacy being experienced, increased sovereignty, and, simultaneously, a reduction in untenable behavior. ZK technology undoubtedly sits at the heart of this optimism, leveraging its privacy-preserving characteristics to unshackle us from the confines of the paradoxical structural dilemma between privacy and information, revealing information only when necessary to reduce bad behavior and bring order.

At present, we struggle to find ways to straddle the edge of the red dotted box and the rest of the idea space, as doing so opens up many more vectors of consideration. Most of the market thus ends up bifurcated across the boundaries of what effectively amounts to all or nothing on the privacy front. This all-or-nothing mentality misses the key value proposition compliant solutions might have. However, one must distinguish between society and state: one can make a solution compliant with the moral standards of their society that is not compliant with the state, and vice versa. Perhaps a good early example of this is the way the Silk Road imposed certain moral rules/standards despite being a largely lawless, agorist platform. The value here is being able to set moral standards of conduct.

Furthermore, by protecting and controlling one’s data we can move beyond the current models of contextualization on the web and create privacy. Current methods construct a model of you from the data you forfeit while browsing, searching, and offering up personal information on the web. Through this, the Web 2 giants are able to construct a relatively clear picture of you, often with precise data points. Instead, leveraging our newfound fire-making tools, we can shift towards the zk-query space.

In this space, with you controlling this sensitive information, the same context seekers construct a different image of you - probabilistic rather than deterministic. An image based on queries, not precise information. They are left to fill in the space you do not reveal, creating the same context of who you are without ever knowing exactly who you truly are. And while ever-narrower range-bound queries can enhance the resolution of the boundary line at a given point, infinitely many broader queries would be needed to get the whole picture of everything about you. The infinite possibilities of the shape of you are what make this so powerful, while the ability to keep a check on these universal beliefs makes the aggregate experience better.
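As a toy illustration of that shift, the Python sketch below contrasts handing over raw data with answering yes/no queries about it. The profile fields and predicates are hypothetical examples, and a real zk-query system would back each answer with a zero-knowledge proof rather than trusted code.

```python
# A toy sketch of query-based disclosure: the context seeker only ever sees
# answers to yes/no predicates, never the raw profile. The profile fields and
# queries are hypothetical; a real zk-query system would back each answer with
# a zero-knowledge proof instead of trusting this code.

profile = {"age": 34, "country": "CA", "monthly_spend": 250}   # raw data stays local

def answer(predicate) -> bool:
    """Answer a yes/no question about the profile without revealing raw values."""
    return bool(predicate(profile))

# The querier learns only booleans...
print(answer(lambda p: p["age"] >= 18))             # True
print(answer(lambda p: p["monthly_spend"] > 100))   # True
print(answer(lambda p: p["country"] == "US"))       # False

# ...so its picture of the user is a set of constraints (a probabilistic image),
# not the exact data points a Web 2 tracker would hold.
```

Each answered query only narrows the set of profiles you might plausibly have; it never collapses into the precise data points a tracker holds today.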

Conclusion

So no, privacy has not failed. Imagination has failed to create captivating, marketable products. Privacy is not the product, but a feature improving a more extensible offering. The zero-knowledge engineering space opens up new realms of privacy-enhancing product design. Truly private financial platforms fail because they enable unacceptable behavior, while private data solutions fail to create products that facilitate data capture in a competitive way.

In Web 2 and the early days of crypto, these streams of development basically diverged. The cypherpunk movement can trace its roots to these fears of institutional violence. Perhaps as a knee-jerk reaction to the events unfolding around them IRL, the extreme of ungovernable environments was born. While these environments paved the way for demonstrating what is possible with this technology, we have seen through the current phase of crypto that the streams of compliance and noncompliance have existed in parallel.

Much like the wild west, first explored and broadly lawless before settlers pushed ever further west along the Oregon Trail and brought increasing order over the chaos, I believe zero-knowledge tech leads back towards a convergence of the streams. What was once the fringe is now being settled, and order must be restored for the masses to begin their inhabitation. This idea of compliant solutions elicits an almost allergic response from the crypto-native crowd, but I do believe these kinds of products will have a better impact than most give them credit for. What they really bring is laws to the recently settled.

As I said previously, I believe these kinds of products will actually increase the aggregate level of privacy experienced by everyone without becoming “narcware”. Ameen’s Privacy Pools development is a testament to this. But this extends beyond just financial use cases: queryable zk data solutions can similarly keep unacceptable forms of data out whilst increasing privacy.

Order to the chaos. Chaos to order.

And while these settlers institutionalize the once wild lands of the frontier, the pioneers will continue to push forward through new paradigms such as dark engineering.

Getting more privacy AND stopping the bad guys? That's something worth believing in to me, but it is only the start. If you have some ideas - DMs are open.

Resources

https://www.brookings.edu/research/why-protecting-privacy-is-a-losing-game-today-and-how-to-change-the-game/

https://www.nytimes.com/2021/09/16/technology/digital-privacy.html

https://www.brown.edu/Departments/Joukowsky_Institute/courses/13things/7121.html

https://fs.blog/the-panopticon-effect/

https://plato.stanford.edu/entries/citizenship/#UnivVsDiffConcCiti

https://robkhenderson.substack.com/p/status-symbols-and-the-struggle-for
