Hold the Ethics: Surveillance, Data Mining and the Destruction of Personal Privacy

Tyler Huggins

"Why, tell me, why?

I wanna reach you with this binary mind

Cause if I do I'm sure that we'll be complete" -Ra Ra Riot

 

A mid-2000s Klondike bar campaign poses this question: "What would you do for a Klondike bar?" The answers run a predictably banal gamut: a fatuous act of self-imposed embarrassment, a singular performance of suspended inhibitions. The commercial follows this template: I'll offer a glimpse of my private self (or, temporarily deactivate my defense mechanisms and self-consciousness) for a Klondike bar, but I'm unwilling to concede anything more penetrating (i.e., exploitation of my failing relationship, my arachnophobia, my mounting debts) for public display.

 

Each actor willingly offers a tiny slice of their private consciousness for the ice cream treat, relinquishing very little for very little. Today, the Klondike exchange represents a nominal sacrifice. The exchange of very little for very little has bloated to colossal and disturbingly so-real-it's-surreal proportions. We no longer sacrifice a superfluous glimpse into our personal lives for a measly Klondike; we offer everything and all for the newest and trendiest anything, sometimes without knowledge of the transaction. The rupture between our personal and public selves hemorrhages toward complete elimination of personal privacy. If the commodification of our private lives (or, private data) is the currency for today's transactions, are we receiving appropriate compensation for the keys to our diary?

 

Futurism --> Transhumanism

 

As technological fervor took root in Western culture during the late 19th and early 20th centuries, the future of the human population underwent a physical and philosophical morph. Industries adapted to welcome the means of mechanized production, commodity output accelerated, and labor became displaced by the quicker, cheaper assembly line. Typical of periods of industrialization, the populace lacked the terminology to comprehend what Vaclav Smil referred to as "The Age of Synergy" (or, as Hollywood might say: "When Science Met Production!"). As the population shifted to accommodate internal combustion engines and radios, a cultural movement formed in Italy, dubbed Futurism by its progenitors.

 

In homage to the new era, Futurists walked "in step with the progress of the machine, of aircraft, of industry, of trade, of the sciences, of electricity." (The Futurist Manifesto, Filippo Marinetti). Futurism embraced the speed and violence of mechanical energy, and the youth and vitality it proffers (unsurprisingly, many futurists rallied around fascism). As the Futurists' devotion to industrialization adjoined and eventually meshed with the prevailing Western culture, human life became inextricably linked with technology.

 

Futurism re-entered the cultural canon thanks largely to the work of Alvin Toffler. Toffler observed the dawning of the super-industrial society and unceremoniously hitched the new epoch of industrialization with overstimulation in his seminal piece, Future Shock. Industrialization, to Toffler, signified "too much change in too short a period of time." This phenomenon continues to resonate with humanity; apps, updates and innovations define our day-to-day. The consequence of too much change in too short a period of time? Information overload, said Toffler. Information overload registers particularly familiar in the 21st century, where every day is a world's expo.

 

As Futurism waned from avant-garde to normality and, subsequently, irrelevance, transhumanism gained traction.

 

A quick summary on transhumanism for the unacquainted:

 

Nietzsche's Übermensch plays a significant role in transhumanism, as does J.B.S. Haldane, geneticist and author of the monograph Daedalus; or, Science and the Future, in which Haldane posited that new inventions would first be considered "blasphemous" and "perverse." Transhumanists don't salivate over unsexy cars and trains like the Futurists did; it's artificial intelligence that makes their toes curl like venetian blinds.

 

Contemporary transhumanists you may recognize:

 

Raymond Kurzweil (probably the most well-known transhumanist; more on him later); Hans Moravec (a robot philosopher, which is to say a philosopher of the evolution of robots, not an actual robot); and anyone affiliated with the NGO World Transhumanist Association, now known as Humanity+. Humanity+'s definition of transhumanism: "The study of the ramifications, promises, and potential dangers of technologies that will enable us to overcome fundamental human limitations, and the related study of the ethical matters involved in developing and using such technologies." Humanity+ also produces a transhumanist magazine, H+.

 

Futurism <--> Transhumanism

 

In the 21st century, the term futurist (and futurism with it) receives postmodernism's highest honor of hackneyed trendiness, where the once-emergent aesthetic is scrubbed clean of its origins and tidied up for creatives in black V-necks to throw on their LinkedIn resumes. Futurism and Transhumanism co-exist in the modern epoch, but Transhumanism enjoys serious academic cachet, while futurism exists chiefly as a neo-corporate buzzword. Kurzweil, one of the more renowned academics associated with Transhumanism (he's a director of engineering at Google, a prolific inventor and a regular futurist/transhumanist mouthpiece), is rabidly enthusiastic (despite his monotone delivery) about the evolutionary transition into transhuman culture, and acts as one of the most clamorous harbingers of the tipping point into super intelligence. This watershed shift in the not-so-distant future is known as the Singularity.

 

Enter the Singularity, or How I Learned to Stop Worrying and Love the Cyborg

 

When the Singularity will tip the scales and change everything remains completely unpredictable, despite what your technophile roommate preaches as he jailbreaks your iPhone. Some say 40 years from now, others say 30, and the more authoritative offer a range between five and 100. Paul Allen speculates the Singularity won't occur within this century (this seems important, somehow). When the Singularity will take place is indeterminable, but trust your local technophile: it will occur.

 

What is the Singularity? The technological singularity is the moment human technology creates a super intelligence, a tipping point at which human intelligence makes way for artificial intelligence. For some, this marks the beginning of the book of Revelation; for others, the dawning of a new species and humankind's crowning achievement.

 

Return to Haldane's quote (full-length version): "The chemical or physical inventor is always a Prometheus. There is no great invention, from fire to flying, which has not been hailed as an insult to some god. But if every physical and chemical invention is a blasphemy, every biological invention is a perversion." As with the hindsight condemnation of a great invention as blasphemous or perverse, there's a tendency to deny and repress imminent creation, whether by the creator or the prescient. In response to the imminent Singularity, a sizable portion of the population will (to borrow a loathsome phrase) call shenanigans and resume reading Blue Like Jazz, but the evidence offered in favor of the Singularity is staggering.

 

Raymond Kurzweil, who has already begun the countdown to the Singularity with red X's on his Google calendar, famously extrapolates to the exact moment of the Singularity through an adaptation of Moore's law, the Law of Accelerating Returns, which demonstrates an exponential rise of technological progress. From the man himself: "We won’t experience 100 years of progress in the 21st century—it will be more like 20,000 years of progress (at today’s rate). The 'returns,' such as chip speed and cost-effectiveness, also increase exponentially. There’s even exponential growth in the rate of exponential growth. Within a few decades, machine intelligence will surpass human intelligence, leading to The Singularity." Kurzweil estimates 2045 as the year of the Singularity, and according to his calculations, we're right on track.
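Kurzweil's claim is, at bottom, a statement about compounding. A minimal sketch of what "accelerating returns" looks like as code; the two-year doubling period and the starting value here are illustrative assumptions, not Kurzweil's actual figures:

```python
# Toy model of exponential technological growth, Moore's-law style:
# a capability that doubles on a fixed schedule (here, every two years).
# The parameters are illustrative, not Kurzweil's published numbers.

def capability(years, doubling_period=2.0, start=1.0):
    """Capability after `years` of growth that doubles every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

# A century of progress at this rate is 2**50 times the starting point,
# roughly a quadrillion-fold increase.
print(capability(100))
```

The point of the sketch is the shape of the curve, not the numbers: any process that doubles on a fixed schedule dwarfs linear extrapolation, which is the intuition behind "20,000 years of progress" in one century.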

As humanity endeavors ever closer to "The Singularity," a negative correlation emerges. Each invention, each technological advance, indirectly drives a loss of the collective private consciousness.

 

Shut Up and Take My Privacy!

 

How humanity lost the private space isn't a tale of struggle or resistance. We gave it willingly, blithely and with no thought to consequence.

 

During the nascent George W. Bush era, Vice President Cheney admitted to the public that American military intelligence would employ "dark side" methods (a.k.a. illegal and despicable methods of torture) to battle terrorism. Which raises the question: why admit to torture at all? If Cheney felt comfortable advocating torture (which the Bush administration played ignorant of for at least five years afterward), what other inhumane perpetrations are left unsaid? To quote Slavoj Žižek: "Here we enter the domain of secret operations, of what power does without ever admitting it."

 

A few years prior (1999), Sun Microsystems' then-CEO Scott McNealy infamously stated: "You have zero privacy. Get over it." Like Cheney's call for America to back torture, McNealy's was a call for Internet users to abandon the fight for Internet anonymity. His candor is simultaneously refreshing and threatening, but his dismissive attitude toward personal privacy is suspect. Was McNealy attempting to manipulate the public into lowering their defenses against corporate and government invasiveness, or is privacy actually nonexistent? Was it a proclamation or a certainty?

 

"Privacy, after all, encompasses much more than just control over a data trail, or even a set of data. It encompasses ideas of bodily and social autonomy, of self-determination, and of the ability to create zones of intimacy and inclusion that define and shape our relationships with each other. Control over personal information is a key aspect of some of these ideas of privacy, and is alien to none of them." -Michael Froomkin

 

 

Digital privacy has been in corporate crosshairs since the web's inception. In 1994, the Washington Post outed America Online for selling subscribers' (around 1 million at the time) personal information to direct marketers without consent. Before and after America Online's gaffe, users suffered invasive incursions into their personal privacy. Beholden to avarice, the data economy was too profitable for anyone to question the ethics of privacy destruction. And so, the beneficiaries of dataveillance set about strip-mining Internet user privacy par tous les moyens nécessaires. Subsequently, user data is stored and sold to the highest bidder, be it corporation or government. Users received nothing in compensation.

 

"Ah!" McNealy and his contemporaries cry, "That's where you're wrong. Users have gained much in exchange for their privacy, they have the Internet. They have Facebook and Skype; open source software and the ability to quick-design infographics and minimalist renditions of kitsch Hollywood films. The user is in power!"

 

At what cost? What is our privacy worth?

 

The America Online scandal is now status quo. Every digital service and platform subsists on user data, and marketing agencies encourage data mining to drive e-consumption. For our part, we depend on these companies to sate our desire to connect and create. And so, we turn over our personal data to the faceless digi-corporations in exchange for programs that extract pertinent personal info, commoditize it and sell it back to us with some varnish and packaging from Amazon or eBay. The result? This is us losing ourselves.

 

Not-So-Anonymous

 

As 9/11 jingoism ebbed and the Bush administration initiated the maligned war on terror, the Patriot Act placed the future of citizens' personal privacy at the mercy of a particularly invasive government. The act has withstood excoriation from civil rights activists and leftist media for the past 11-plus years, and in spite of its spurious title and First and Fourth Amendment infringements, the Act metastasized into a reliable trump card for government security branches (specifically the NSA and the Dept. of Homeland Security) to flash after direct attacks on American privacy.

 

The ensuing invasions of personal privacy in the name of the Act are well-catalogued (NSLs and their gag orders, circumventing the Fourth Amendment through sneak and peeks), and the transgressions are only accumulating as hyper-surveillance normalizes: the Dept. of Homeland Security's biometric database, the National Security Agency's multibillion-dollar data center at Camp Williams, Utah, and so on. Balance, the founding principle of America's representative republic and safeguard against totalitarianism, is slipping, shifting impassively in favor of the government and its cronies. While the argument for government transparency rages on, our government (thanks to the Act's room for maneuvering, among other succeeding clauses) is methodically amassing the facilities and power to render the populace completely transparent. And, conversely, through layers of bureaucracy and private contracts, our government becomes more opaque.

 

If, like Scott McNealy assures us, privacy is a figment of the past, how is it that our government is so inscrutable?

 

Criticism of government intrusions into our personal spheres foments in the digital margins, on blogs, niche news outlets and Revelation-obsessed podcasts. The mainstream response remains lethargic, reacting to the intrusions with a "nothing-to-hide" mentality of indifference. This mentality pervades the public, yet the significance of the statement lies in its contradiction. If you truly have nothing to hide, why make the claim?

 

When public opinion is indifferent to personal invasion, we find ourselves on shaky footing. "I have nothing to hide" betrays much more than our desire for secrecy; it's a disempowering concession. By saying "I have nothing to hide," you condone a culture of investigating the innocent (Go ahead and search me, I have nothing to hide!). Search and seizure laws exist to protect the culture of innocent until proven guilty (You can't search me, I remain innocent!), and the growing trope of "I have nothing to hide" erodes our assumed innocence, ushering in the new era of surveillance: all are guilty.

 

From Personal Privacy in an Information Society: The Report of the Privacy Protection Study Commission, as transmitted to Jimmy Carter in 1977:

           

The balance to be struck is an old one; it reflects the tension between individual liberty and social order. The sovereign needs information to maintain order; the individual needs to be able to protect his independence and autonomy should the sovereign overreach. The peculiarly American notions of legally limited government and the protections in the Bill of Rights provide broad theoretical standards for reaching a workable balance. But the world has a way of disrupting the particular balance struck in past generations; the theory may remain unaltered but circumstances change, requiring a reworking of the mechanisms which maintained the balance in the past.

 

 

The emerging information technology requires a serious "reworking of the mechanisms which maintained the balance," and the government and corporations are speaking for the individual, causing the balance to slip into the precarious realm of "unwarranted intrusions by government [and corporations] which, in John Adams' mind, provided the spark that ignited revolution."

 

Patriot Acts and Social Widgets

 

After the Boston bombings, surveys asked if Americans are willing to give up more of their civil liberties to ensure safety. The answer? "No."

 

American citizens operate largely in hindsight. We lambaste torture practices after having offered our blessing to the Bush administration to use "dark side" measures. Of course, there were no weapons of mass destruction. Of course, the U.S. profits from occupying, destroying, and rebuilding countries. Such certainties surface long after the original deception, when the populace musters the courage to stand as one and declare "No!" once the damage is already done. As counter-terrorism measures are bolstered by the Boston Marathon deaths and injuries, the aggressive security measures that followed the attacks of September 11 are now non grata. In line with our torture and war-profiteering dissent, our cries against the destruction of civil liberties, especially in the form of dataveillance, come too late.

 

Like a nude photo, user data is impossible to retrieve from the digital realm. Every relinquished personal detail becomes a permanent data gene stored in the ether. Companies such as America Online and Google were quick to recognize the value of user data and cultivated applications intended to encourage users to hemorrhage their personal information. America Online quantified the data and sold it to direct marketing companies. Google used the data internally, personalizing their ad-space and making a killing off of their users’ personal inclinations. Other companies adopted similar models, and today, the data economy features thousands of companies offering digital consumer profiles for corporations or providing assessment of civilians for the government. To trade and compile user data more efficiently and fruitfully, the government-corporate barriers were pushed aside, allowing corporations and the feds to analyze Internet users with dual scrutinies. A user could feasibly purchase a mass-produced Quran one day, and be placed on the government's No-Fly list the following morning.

 

With the accumulation of consumer/civilian data in capitalist vogue, most new technologies are either devised with the data economy in mind (i.e., customer relationship management software) or manipulated to feed the economy's unslakable thirst for analyzable data (Reddit, Instagram). Technological innovations, no matter how ostensibly open for public use, will always be subject to this process.

 

It is important to note that those dealing in the buying and selling of data work through corporate or government mediums. The technology that propels us toward Singularity operates in the same realm. Even if Singularity occurs in some anonymous garage in the armpit of Kentucky, the achievement will file through corporate and government channels. Each institution will alter the intelligences to their specifications and only then will the new technologies be made available to the consumer. For corporations, surveillance delivers the consumer genotype, including buying habits and the products each consumer is apt to purchase. Consequently, corporations now enjoy unprecedented brand-consumer intimacy. Federales use surveillance to "keep the peace." Of course, peacekeeping is simultaneously a mediating and nefarious process, where maintaining public routine is a function of political agenda. And the dominant agenda, keeping the throne, bears little oversight.

 

Each power structure is immeasurably dependent on public disinterest in anonymity and this common interest has encouraged campaigns in favor of privacy destruction on both sides. The government's campaign acquired considerable territory during the post-9/11 security measures extravaganza, where America happily forked over personal rights to eliminate the faceless other in the war on terrorism. Corporate campaigns gained momentum as technology evolved and dazzled, trading slices of user privacy for the newest design program or social widget.

 

And Amazon Shaped Users in Its Own Image

 

The transformative power of technology is astounding. E-commerce, one of the principal beneficiaries of advancements in technology, gains considerable potency as the capacity to market and sell goods on the Web develops on the cutting edge. To better sell their wares, digital businesses use data-based technologies to strip and rebuild every consumer into a more receptive, more programmable participant. As a result of the reprogramming, when an online shopper makes their umpteenth Amazon purchase, they're already shopping more fluidly than when they began. The process of personalization, in the form of search personalization or personalized consumer profiles, catalyzes the transformation from casual consumer to guided buyer.

 

How this goes down (using Amazon as the example): Every search query, purchase and review provides Amazon with valuable data. The data, when analyzed, spits out products statistically likely to interest specific users. Occasional tweaks aside, Amazon uses the data to sculpt a working digital profile of each consumer and caters to this projection. Users can either accept the proposed digital Projection or shop in the margins of this e-commerce behemoth.
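The profile-and-cater loop described above can be caricatured in a few lines. This is a deliberately crude sketch, not Amazon's actual recommender; the categories, the catalog, and the tally-based ranking are all invented for illustration:

```python
from collections import Counter

# Toy model of the profile-and-cater loop: every interaction feeds a
# profile, and the profile decides what the user is shown next.
# Categories and catalog entries are hypothetical.

def update_profile(profile, event_category):
    """Fold one search/purchase/review event into the user's running profile."""
    profile[event_category] += 1
    return profile

def recommend(profile, catalog, k=3):
    """Rank catalog items by how strongly they match the projected profile."""
    return sorted(catalog, key=lambda item: -profile[item["category"]])[:k]

profile = Counter()
for event in ["books", "books", "electronics", "books"]:
    update_profile(profile, event)

catalog = [
    {"name": "novel", "category": "books"},
    {"name": "headphones", "category": "electronics"},
    {"name": "blender", "category": "kitchen"},
]

# The projection now orders the store: books first, then electronics.
print([item["name"] for item in recommend(profile, catalog)])
```

Note what the sketch makes plain: the storefront each user sees is a function of the profile, not of the catalog, which is the "accept the Projection or shop in the margins" choice in miniature.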

 

The Internet, in this sense, has changed drastically. Now, by using the Internet, a user agrees to an unwritten contract requiring the person to betray sensitive personal information. Like the public sphere, the digital sphere went the way of the corporation, where our enjoyment of the Internet depends largely on the whim of monopolies and corporate influence. This isn't the age of the empowered consumer; it's the age of the well-trained buyer, receptive and conditioned. In this vein, the Projection never changes. An individual's cast is taken and remains the same ad infinitum. And Amazon shaped its users in its own image.

 

A Trap of Our Own Design

 

As more desirable products crowd the digital sphere, personal data proves difficult to quantify. Granted, some users aren't interested in the retention of privacy (the "I have nothing to hide" purview). For those unaware of privacy loss or even for the moderately concerned, the digital commodity always shines brighter than the value of personal privacy. However, determining value of an ostensibly complimentary service, such as social media, is incredibly difficult. This dilemma has users a bit psychologically scrambled. They're given a free service, but they suspect (correctly) that free is a misleading descriptor (it's the same doubt associated with a free cookie from a younger or older sibling where the immediate response is suspicion: "What's in it?"). Conversely, there's no pricing sheet for our personal data and if there is, it's unavailable to the public.

 

And so we use the free service, relinquishing slices of our data (or large swaths, depending on the service in question) and suppress our doubts. Michael Froomkin refers to this phenomenon as the privacy myopia: "Consumers suffer from privacy myopia: They will sell their data too often and too cheaply. Consumer privacy myopia suggests that even Americans who place a high value on information privacy will sell their privacy bit by bit for frequent-flyer miles." In contrast, if companies attempt to take away privacy in a one-hand-swipe (Instagram, SOPA, CISPA, Facebook), ardent Web users go bonkers. Why?

 

We're more comfortable dismantling our private sphere bit by bit than letting corporations throw wrecking balls into our fortress of solitude. Psychologically, this makes sense. It's how relationships work: we relinquish bits and pieces and throw down our baggage only when we feel safe to do so. But a relationship with a digital program, an app, or an e-device? Demonstrating amor for a digital object is Hollywood's favorite form of foolishness, yet we can't help but defy our own logic. We know using an automaton as a confidant represents a personal flaw (perhaps our inability to confide in ourselves?), but we fear self-judgment or peer judgment. And automatons don't judge. Ultimately, we find that indulging in ill-advised technophilia is much more satisfying than admitting to our own need for physical comfort.

 

Questioning the motives for our digital affair won't shed any light on the phenomenon of privacy destruction. There are too many variables at play (atomization as a result of technology, the fickle nature of human relationships, social psychoses, and so on). What we can (and should) do is analyze how we are courting the machines that may one day forge with our own intelligence.

 

When a potential buyer uses Kayak, or one of many travel aggregator sites (we'll stick with Kayak as an example), the platform stores a significant amount of information about the user for later sale and personalized ads. Additionally, Kayak stores a cookie on the user's computer that indicates when the user was on the site and what flight they searched for. If a user views flights to Oaxaca and checks prices the following day, the price inevitably rises (to create the perception of less supply/more demand). The value of the tickets remains the same, but the repeated views of the ticket indicate added value. Unintentionally, the user gouges their own prices.
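The repeat-view mechanism can be illustrated with a toy model. To be clear, this is not Kayak's actual code; the route name, base fare, and five-percent bump per prior view are invented assumptions for the sketch:

```python
# Toy illustration of a repeat-view price bump: a cookie counts how
# many times a user has viewed a route, and each revisit nudges the
# displayed price upward. Route, fare, and bump rate are hypothetical.

BASE_PRICES = {"DEN->OAX": 420.0}  # invented base fare for the example

def quote(route, cookies, bump=0.05):
    """Return a displayed price that rises with each previously recorded view."""
    views = cookies.get(route, 0)
    cookies[route] = views + 1  # the site sets/updates the tracking cookie
    return round(BASE_PRICES[route] * (1 + bump) ** views, 2)

cookies = {}                      # stands in for cookies on the user's machine
print(quote("DEN->OAX", cookies))  # first look: the base fare, 420.0
print(quote("DEN->OAX", cookies))  # next day: 441.0, same seat, higher price
cookies.clear()                    # clearing cookies resets the view count
print(quote("DEN->OAX", cookies))  # back to 420.0
```

Clearing the cookie store resets the counter and the displayed price, which is why regularly deleting cookies sidesteps this particular gouge.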

 

This model succeeds on two planes. The user purchases the ticket as a result of the added value (out of fear that prices will only rise higher) or, aware of the process, the user shops impulsively, opting to avoid rising ticket prices over the benefits of looking around and comparing prices (you can circumvent the ticket-cost algorithm by clearing your cookies regularly). Behind this interface is the heart of Kayak's data model. Kayak's privacy policy on why they use cookies: "To serve you with advertising content in which we think you will be interested. As part of this customization, we may observe your behaviors on this website or on other websites. We may also get information about your browsing history from our trusted business partners." Is this not exploitative? Not only is Kayak manipulating ticket prices, they're using your data, selling your data and buying data from "trusted business partners." Is the buyer not to be trusted? Can we not decide what we want for ourselves?

 

We can, but prevailing sentiment concedes to Web personalization making our decisions for us. Return to Kayak's privacy policy, specifically the clause "To serve you with advertising content in which we think you will be interested," and a snapshot of the current consumer atmosphere emerges, one in which personalization reigns as the in-marketing tool. "We think you will be interested" evokes the transformational process that exposes our indecisiveness for exploitation. This transformation removes the user, replacing our multitudes with a simplified digital projection. As our projection, our data, is passed around, marketers personalize us into a confined space, presenting the user with a choice: conform and receive all the Internet has to offer, or refuse and miss out on the trappings of the Web. It's a precarious wedge with extensive implications for the destruction of personal privacy.

 

Neil Gaiman's Sandman contains this excellent quote: "Sometimes I suspect that we build our traps ourselves, then we back into them, pretending amazement all the while." It summarizes hindsight culture quite well, but it also speaks to humanity's flirtation with Singularity. The drive to destroy the private sphere of consciousness inextricably links Moore's law and Singularity with advances in surveillance, data mining and the systematic destruction of personal privacy. Singularity and privacy will not coexist, although the technology that propels us toward the Singularity needs privacy and its destruction to study human intelligence more acutely. As private consciousness becomes more available for examination and translation, Singularity becomes more realistic. Thus, the Singularity will occur, but only when personal privacy is compiled, analyzed and ultimately extinguished. The implications of this tradeoff represent an enormous paradigm shift in humanity, but those (engineers, computer scientists, and so on) impelling us toward the Singularity seem to shirk any moral universe. We'll revel in amazement post-Singularity that we hadn't employed moral scrutiny beforehand. But perhaps we wanted to omit scrutiny all along. I'll take Singularity for the future please, and hold the ethics.

 

Author Bio:

Tyler Huggins is a contributing writer at Highbrow Magazine.

 

Photos: Wikipedia Commons; Beverly and Pack (Flickr); DSearls (Flickr); Tom Murphy (Flickr).
