
Privacy in the Age of Big Data: Orwellian vs. Kafkaesque

Daniel J. Solove | CityReads | 2022-07-13

340.CityReads│Privacy in the Age of Big Data: Orwellian vs. Kafkaesque


The privacy problems are not just Orwellian, but Kafkaesque.

Daniel J. Solove, 2007. "'I've Got Nothing to Hide' and Other Misunderstandings of Privacy." San Diego Law Review, Vol. 44, p. 745; GWU Law School Public Law Research Paper No. 289. Available at SSRN: https://ssrn.com/abstract=998565
Daniel J. Solove, 2011. Nothing to Hide: The False Tradeoff Between Privacy and Security. Yale University Press.

Source: https://www.danielsolove.com/nothing-to-hide/


I translated Smart Enough Cities into Chinese, and the translation was published last year. I was then invited by The Thinker (a weekly magazine run by CITIC Press Group) to write an essay, "From the Eyes of the Street to the Eyes of the City: How Have Smart Cities Changed Anonymity?" (reprinted with permission in this issue of CityReads; see the second article), to sort out the preliminary thoughts on cities, technology, and anonymity that I formed in the course of translating and learning. What follows is a clear, sharp, and inspiring discussion of privacy by Daniel J. Solove, a law professor and privacy law expert at George Washington University and one of the scholars who influenced the writing of Smart Enough Cities.
 
Many commentators have used the metaphor of George Orwell's 1984 to describe the problems created by the collection and use of personal data. The Orwell metaphor, which focuses on the harms of surveillance (such as inhibition and social control), is apt for describing law enforcement's monitoring of citizens. By observing the most intimate details of everyone's lives and punishing even the slightest dissent, Big Brother controls society's behavior. Following Orwell's influence, people typically conceive of privacy through the "secrecy paradigm": the idea that privacy is invaded when one's secrets are observed or exposed, leading people to self-censor (via "chilling effects") or suffer the consequences.
 
But the secrecy paradigm fails to explain the harms of someone's bike-share trips or WeChat Moments likes being collected, aggregated, and analyzed. Orwellian privacy hardly encompasses the other privacy issues that have sprung up in the era of big data. In Nothing to Hide: The False Tradeoff Between Privacy and Security, Daniel J. Solove identifies a distinctly Kafkaesque privacy problem and proposes a typology of privacy, borrowing the theme of Franz Kafka's 1925 novel The Trial.


The book's protagonist, Josef K., wakes up on his thirtieth birthday to find two men in his room declaring that he is under arrest. He is given no indication of what he has done or what agency is arresting him. The novel depicts Josef's unsuccessful attempts to uncover the identity of the mysterious court and what data it possesses about him. He is killed by the court's agents on his thirty-first birthday without ever having learned their true nature. "The Trial captures an individual's sense of helplessness, frustration, and vulnerability when a large bureaucratic organization has control over a vast dossier of details about one's life," Solove explains. Solove likens much of today's collection and use of data to the themes of The Trial. Just like Josef, people today have little knowledge of or control over what personal data is collected, who holds it, and how it is exploited.
 
Franz Kafka's The Trial depicts a bureaucracy with inscrutable purposes that uses people's information to make important decisions about them, yet denies them the ability to participate in how that information is used. The problems captured by the Kafka metaphor are of a different sort from those caused by surveillance. They often do not result in inhibition or chilling. Instead, they are problems of information processing—the storage, use, or analysis of data—rather than of information collection. Legal and policy solutions have focused too much on the nexus of problems under the Orwell metaphor—those of surveillance—and have not adequately addressed the Kafka problems—those of information processing. These problems affect the power relationships between people and the institutions of the modern state. They not only frustrate the individual by creating a sense of helplessness and powerlessness, but also affect social structure by altering the kind of relationships people have with the institutions that make important decisions about their lives.
 
A Pluralistic Conception of Privacy

What exactly is "privacy"? How valuable is privacy and how do we assess its value? How do we weigh privacy against countervailing values?
 
Solove argues that privacy is not reducible to a singular essence; it is a plurality of different things that do not share one element in common but that nevertheless bear a resemblance to each other.
 
Solove develops a taxonomy of privacy—a way of mapping out the manifold types of problems and harms that constitute privacy violations.
 
Table: A taxonomy of privacy by Daniel J. Solove

Information Collection: Surveillance; Interrogation
Information Processing: Aggregation; Identification; Insecurity; Secondary Use; Exclusion
Information Dissemination: Breach of Confidentiality; Disclosure; Exposure; Increased Accessibility; Blackmail; Appropriation; Distortion
Invasion: Intrusion; Decisional Interference
 
The taxonomy has four general categories of privacy problems with sixteen different subcategories. The first general category is information collection, which involves the ways that data is gathered about people. The subcategories, surveillance and interrogation, represent the two primary problematic ways of gathering information. A privacy problem occurs when an activity by a person, business, or government entity creates harm by disrupting valuable activities of others. These harms need not be physical or emotional; they can occur by chilling socially beneficial behavior (for example, free speech and association) or by leading to power imbalances that adversely affect social structure (for example, excessive executive power).
 
The second general category is information processing. This involves the storing, analysis, and manipulation of data. Information processing can cause a number of problems, and Solove includes five subcategories in his taxonomy. For example, the problem he labels insecurity increases people's vulnerability to potential abuse of their information. The problem he calls exclusion involves people's inability to access their data or to have any say in how it is used.
 
Information dissemination is the third general category. Disseminating information involves the ways in which it is transferred—or threatened to be transferred—to others. Solove identifies seven different information dissemination problems.

Finally, the last category involves invasions. Invasions are direct interferences with the individual, such as intruding into her life or regulating the kinds of decisions she can make about her life.
 
The Social Value of Privacy

Many theories of privacy view it as an individual right. For example, Thomas Emerson declares that privacy "is based upon premises of individualism, that the society exists to promote the worth and the dignity of the individual. . . . The right of privacy . . . is essentially the right not to participate in the collective life—the right to shut out the community."
 
The value of protecting the individual is a social one. Society involves a great deal of friction, and we are constantly clashing with each other. Part of what makes a society a good place in which to live is the extent to which it allows people freedom from the intrusiveness of others. A society without privacy protection would be suffocating, and it might not be a place in which most would want to live.
 
Privacy constitutes a society's attempt to promote rules of behavior, decorum, and civility. Society protects privacy as a means of enforcing a kind of order in the community. Privacy, then, is not the trumpeting of the individual against society's interests, but the protection of the individual based on society's own norms and values. Privacy is not simply a way to extricate individuals from social control, as it is itself a form of social control that emerges from a society's norms. It is not an external restraint on society, but is in fact an internal dimension of society. Therefore, privacy has a social value. Even when it protects the individual, it does so for the sake of society. It thus should not be weighed as an individual right against the greater social good. Privacy issues involve balancing societal interests on both sides of the scale.
 
There is a social value in protecting against each problem, and that value differs depending upon the nature of each problem.
 
The Problem with the "Nothing to Hide" Argument
 
When discussing whether government surveillance and data mining pose a threat to privacy, many people respond that they have nothing to hide. This argument permeates the popular discourse about privacy and security issues. In Britain, for example, the government has installed millions of public surveillance cameras in cities and towns, which are watched by officials via closed circuit television. In a campaign slogan for the program, the government declares: "If you've got nothing to hide, you've got nothing to fear."
 
The "nothing to hide" argument and its variants are quite prevalent in popular discourse about privacy. Data security expert Bruce Schneier calls it the "most common retort against privacy advocates". Legal scholar Geoffrey Stone refers to it as an "all-too-common refrain". The nothing to hide argument is one of the primary arguments made when balancing privacy against security. In its most compelling form, it is an argument that the privacy interest is generally minimal to trivial, thus making the balance against security concerns a foreordained victory for security. Sometimes the nothing to hide argument is posed as a question: "If you have nothing to hide, then what do you have to fear?" Others ask: "If you aren't doing anything wrong, then what do you have to hide?"
 
The reasoning of this argument is that when it comes to government surveillance or use of personal data, there is no privacy violation if a person has nothing sensitive, embarrassing, or illegal to conceal. Criminals involved in illicit activities have something to fear, but for the vast majority of people, their activities are not illegal or embarrassing.
 
But the problem with the nothing to hide argument is the underlying assumption that privacy is about hiding bad things. Agreeing with this assumption concedes far too much ground and leads to an unproductive discussion of information people would likely want or not want to hide. As Bruce Schneier aptly notes, the nothing to hide argument stems from a faulty "premise that privacy is about hiding a wrong".
 
The deeper problem with the nothing to hide argument is that it myopically views privacy as a form of concealment or secrecy. But understanding privacy as a plurality of related problems demonstrates that concealment of bad things is just one among many problems caused by government programs such as NSA surveillance and data mining.
 
The privacy problems are not just Orwellian, but Kafkaesque. The surveillance programs are problematic even if no information people want to hide is uncovered. In The Trial, the problem is not inhibited behavior, but rather a suffocating powerlessness and vulnerability created by the court system’s use of personal data and its exclusion of the protagonist from having any knowledge or participation in the process.
 
One such harm, for example, aggregation, emerges from the combination of small bits of seemingly innocuous data. By combining pieces of information we might not care to conceal, the government can glean information about us that we might really want to conceal. Part of the allure of data mining for the government is its ability to reveal a lot about our personalities and activities through sophisticated analysis of data. Therefore, without greater transparency in data mining, it is hard to claim that a data mining program will not reveal information people might want to hide, as we do not know precisely what is revealed. Moreover, data mining aims to be predictive of behavior, striving to prognosticate about our future actions. People who match certain profiles are deemed likely to engage in a similar pattern of behavior. It is quite difficult to refute actions that one has not yet done. Having nothing to hide will not always dispel predictions of future activity.
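The aggregation harm can be made concrete with a toy sketch. Everything below is hypothetical and purely illustrative: the point is only that records which are innocuous in isolation can support a sensitive inference once they are joined by identity.

```python
# Two hypothetical datasets, each harmless on its own: nobody hides
# where they bought coffee or which bike-share stop they used.
pharmacy_visits = [
    {"name": "Alice", "stop": "Oak St Pharmacy", "time": "08:10"},
    {"name": "Bob", "stop": "Main St Cafe", "time": "08:15"},
]
clinic_trips = [
    {"name": "Alice", "stop": "Elm St Oncology Clinic", "time": "09:00"},
]

def aggregate(*datasets):
    """Group every record across all datasets by person, building a profile."""
    profiles = {}
    for dataset in datasets:
        for record in dataset:
            profiles.setdefault(record["name"], []).append(record["stop"])
    return profiles

profiles = aggregate(pharmacy_visits, clinic_trips)
# Each record alone reveals little; the combined profile suggests a
# health condition Alice may well want to keep private.
print(profiles["Alice"])  # ['Oak St Pharmacy', 'Elm St Oncology Clinic']
```

Real data mining systems join far more sources by far weaker identifiers than a shared name, but the mechanism is the same: the inference lives in the combination, not in any single record, which is why "I have nothing to hide" misjudges what aggregation can expose.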
 
Exclusion is the problem caused when people are prevented from knowing how their information is being used, as well as barred from accessing and correcting errors in that data. This kind of information processing, which denies people knowledge of and involvement in the handling of their data, resembles in some ways a due process problem. The issue is not whether the information gathered is something people want to hide, but rather the power and the structure of government.
 
A related problem involves "secondary use": the use of data obtained for one purpose for a different, unrelated purpose without the person's consent. The government has said little about how long the data will be stored, how it will be used, and what it could be used for in the future. The potential future uses of any piece of personal information are vast, and without limits on or accountability for how that information is used, it is hard for people to assess the dangers of the data being in the government's control.
 
Therefore, the problem with the nothing to hide argument is that it focuses on just one or two particular kinds of privacy problems—the disclosure of personal information or surveillance—and not others. The nothing to hide argument represents a singular and narrow way of conceiving of privacy, for it forces the debate to focus on its narrow understanding of privacy. It assumes a particular view about what privacy entails, and it sets the terms for debate in a manner that is often unproductive.
 
The key misunderstanding is that the nothing to hide argument views privacy in a particular way—as a form of secrecy, as the right to hide things. But there are many other types of harm involved beyond exposing one's secrets to the government. In many instances, privacy is threatened not by singular egregious acts, but by a slow series of relatively minor acts which gradually begin to add up.
 
When balancing privacy against security, privacy harms are often characterized as injuries to the individual, while the interest in security is characterized in broader societal terms. The security interest should not be weighed in its totality against the privacy interest. Rather, what should be weighed is the marginal limitation on the effectiveness of a government information gathering or data mining program imposed by judicial oversight and minimization procedures.

Related CityReads

47.CityReads│Cities and Ideas: Bigger Is Better?
73.CityReads│What Technologies Can Do for the Future of Cities?
77.CityReads│Four Keys to the City
94.CityReads│History of Tomorrow: Who Will Become the Homo Deus?
97.CityReads│Alone Together
123.CityReads│How to Escape the Progress Traps?
127.CityReads│Everybody Lies: How the Internet Reveals Who We Are
147.CityReads│Can Cities Help Us Hack Formal Power Systems?
177.CityReads│New Vocabulary to Understand the Urbanization Process
180.CityReads│Castells on The Rise Of Social Network Sites
190.CityReads│San Francisco Bay Area: Beyond the Tech and Prosperity
223.CityReads│Do Cities Become Obsolete Under Globalization & ICTs?
250.CityReads│This Book Will Change How You View Globalization
257.CityReads│6 Books on Global Cultural Understanding
258.CityReads│Why Checking Likes Is the New Smoking?
265.CityReads│The Paradox of Growth
267.CityReads│Rise of the Platform Society
295.CityReads│How the Innovation Complex Has Changed Our Cities?
296.CityReads│Platform Capitalism's Hidden Abode
311.CityReads│8 Books to Read in Uncertain Times
313.CityReads│6 Insightful Books on Smart Cities
319.CityReads│A World without Work
320.CityReads│The Smart Enough City
324.CityReads│How Should City Use Data for Public Good?
325.CityReads│6 Books on Future Cities
327.CityReads│Social Media as the Hype Machine
(Click the title or enter our WeChat menu and reply number
CityReads Notes On Cities

"CityReads", a subscription account on WeChat, 

posts our notes on city reads weekly. 

Please follow us by searching "CityReads" 
