Category Archives: Surveillance

Xinjiang: A Pokemon Journey to America (Part Three)

This is the third and final post in a brief, un-academic series about my personal experience of living in China’s troubled Xinjiang region, and the censorship both online and offline that it entailed. This functions largely as a final whimsical anecdote and a conclusion. You can read the background information here, and several other anecdotes from my time in China here.

I previously wrote about having my phone service shut down for using a Virtual Private Network to circumvent the ‘Great Firewall’ and use Facebook, Skype, and other foreign apps.

Well eventually Pokemon Go was released, and several foreigners in my social circle downloaded it and started playing. Given that Pokemon Go relies on Google services to function, this was only possible by running the game through a VPN, the same kind that had gotten me shut down several months before.

In an English class one day I discussed starter Pokemon choices with some of my students. They informed me that Pokemon Go was an American conspiracy to locate military bases in China. These students still played it anyway, though.

Not eager to be an unwilling participant in a supposed clandestine mapmaking operation, but a childhood lover of Pokemon all the same, I knew I had to get back online.

A friend helped me register my passport with a different cellphone carrier from the one that had shut me down, and I finally bought a new SIM card. By that time we knew I would be leaving China within a few months anyhow, so I went for broke and kept my VPN on 24/7. I didn’t end up getting shut down a second time, though it’s possible that if I had stayed it would have happened eventually.

What was curious to me was that while playing the game, I regularly found evidence of other players active in my area, despite having to use a VPN for it to work, and reports that it wasn’t supposed to function in China at all. One day I decided to use the in-game clues (active lure modules) to find others who were playing. After an hour of wandering from pokestop to pokestop, and setting a few lures of my own to draw out other players, I ran across three young guys in front of a movie theatre. It suddenly dawned on me that my Chinese vocabulary included exactly zero Pokemon terms. In the end I simply showed them my phone and smiled. They showed me theirs and laughed, and we all spent about ten minutes trying to get to an inconveniently placed pokestop. 

I wish I could properly follow up on Pokemon Go in Xinjiang. The number of players I found evidence of was initially surprising, but it shouldn’t have been. The Chinese are notorious for their zealous adoption of mobile games, and the restrictions on Pokemon Go were relatively easy to circumvent. I even had a ten-year-old ask me to recommend a VPN service one day after class.

I later learned that at that time Pokemon Go was unplayable even with a VPN in most of China, even in major cities like Beijing and Shanghai. But it was functioning well enough in Xinjiang, one of the more sensitive and closely-controlled regions. I never made sense of that.

I’ve now taken my Pokemon adventure (and the more mundane aspects of my life) out of China. But there are certain remnants of the surveillance and censorship apparatus that stick with you even outside the country.

When I visited my father over Christmas, for example, he picked me up from the airport and we went straight to a restaurant for breakfast. “What’s Xinjiang like, then? Do the people there want independence like in Tibet?*” he said. My stomach twisted and I instinctively checked the restaurant to see who might have heard. Of course nobody present cared.

(* – This is an oversimplification of the Tibet situation, but this post isn’t about that)


A Chinese Christian friend of mine related a similar experience she had: after years of fantasizing about boldly professing her religion, when she finally moved to America she simply could not feel comfortable praying without drawing the blinds first. Similarly, my girlfriend has physically recoiled once or twice when I spoke the name of a well-known Chinese dissident out loud in our thin-walled apartment. Every time she’s caught herself and said aloud “Oh, right. Nobody cares here.”

China is not Oceania; there is not really anything like thoughtcrime. But there are speechcrimes. And when certain things are spoken, especially in a full voice, you know in your stomach that those words could get someone in trouble if the speaker isn’t careful.

Before I moved to Xinjiang I had it in my mind that I might like to study Western China when I eventually return to school to pursue a Masters in Anthropology. But now I’m no longer certain that I can: as alluded to already, I met a wonderful woman in Xinjiang. We’ve been together for more than a year now, and we have since moved to the US so she can attend a graduate program. While we will certainly return to Xinjiang in the future, the continuing presence of her family there, as well as my girlfriend’s Chinese passport, makes me ever-conscious of the Chinese government’s attitude toward those who are critical. Even though I am against extremism of all kinds, and believe that independence would run against the interests of those living in Xinjiang, the caveats I would attach to those positions are likely unacceptable to the regime.

And so, perhaps even what I’ve written here is too much to say.

If you have questions or requests for clarification please don’t hesitate to comment below. And as a good friend regularly says: “Every day’s a school day,” so if you’d like to suggest a correction, or a resource or if you otherwise take issue with something I’ve said, please don’t hesitate to comment either. If there is interest, I would love to contribute to Socionocular again.

Xinjiang: Several Anecdotes (Part Two)

This is my second post to Socionocular on the topic of Xinjiang, a far-western region of China where I lived for 18 months. For a very basic overview on Xinjiang and China’s internet and social media landscape, please see here.

What follows are several anecdotes from my year and a half in China regarding the topic of internet (and other) censorship and the atmosphere of distrust and paranoia it fosters.

In my first year in Ürümqi I regularly attended a weekly English club with a close network of Chinese professionals. We ate in a private dining room at a small mom-and-pop restaurant, and it was an intimate-enough group that sometimes conversations turned political. There were a small few who spoke without hesitation, but without fail somebody would get up to close the door before having their say.

I remember it was during one of these conversations that a relative newcomer to the group asked me out of the blue if I ever called home to Canada.

“No,” I told him, “I just use the internet.”

“Good,” he replied, “Someone might listen, if you called.”

This sort of caught me by surprise, so I probed him: “What, like American spies?”

He shook his head no, but then thought about it. “Maybe them too. But I mean the Chinese police.”

There was also a belief among many of the foreigners, completely reasonable in my opinion, that our apartments might be bugged as well. A friend who attended college in China over a decade ago says he was told flat-out that the foreign student dormitory was being recorded, but not monitored. “If something happens,” he was told, “They can go back and review the recordings.”

I visited another friend’s apartment once, and it came up in conversation that he believed his place was definitely bugged. When I asked him how he knew, he explained that another foreigner had lived there before him; my friend had known the previous tenant and spent a lot of time visiting. One day a young policeman in the neighborhood warned him to be careful what he said during those visits. Despite the ominous warning, he liked the neighborhood and tried to rent an apartment nearby, but was blocked by the police. I know others who were barred from this neighborhood as well. Finally, though, after the previous tenants (his friends) moved to a new home, he contacted their landlord and inquired about the old apartment. He was given permission to move in by the same police who had told him it was impossible before. The popular opinion was that they probably approved it because they wouldn’t need to go through the trouble of re-bugging the place.

Again, it isn’t to say that everyone’s apartment is bugged. But it’s very telling to me that whenever the topic came up, the verdict was usually ‘Yeah, could be,’ and never ‘No, stop being paranoid.’

It isn’t just the foreigners who are so concerned, either. In fact, many locals are subject to more immediate, more real, and more invasive surveillance. Early in my stay in Xinjiang I made friends with a university student who offered to show me around the city. We kept in contact and enjoyed chatting from time to time. On our third or fourth meeting he confessed to me that he was a practitioner of an illegal religion. He said he was never worried about telling foreigners this, because foreigners never seemed to care. “But,” he said, gesturing to the students sitting at the table next to ours, “if I told you this in Chinese I might get in trouble.” He recommended some reading materials I should look up, and then spent some time laying out his burdens: at college many of his classmates regularly had their phones confiscated and searched. He showed me both of his phones: one for storing his religious materials, and the other kept ‘clean’ so he could hand it over to be searched without worry.

For all the presumed monitoring and censorship, most of the people I associated with largely got on with their lives without worrying about it too much. It was occasionally a topic of conversation, or a bit of a game (such as conspicuously whispering pro-China slogans into lamps), but it became a more immediate personal concern in the summer of 2015.

I only briefly mentioned the “Great Firewall” before this, partly because its reputation precedes it. What some people don’t realize, though, is that most of the time it isn’t terribly difficult to circumvent. There are multiple free Virtual Private Network (VPN) options that will allow one to access Facebook and YouTube, and most foreigners I know made use of one or two VPNs regularly. Some Chinese people I knew also made use of VPNs, but others considered them too much of a hassle for too little benefit (I was once told, “Why go through so much trouble to use the foreign internet? The Chinese internet has everything I want.”). Word started to circulate that China Mobile, one of the larger cellular carriers, was shutting down service to those running VPNs on their phones. One night at a regular foreigner hang-out, a friend told me about his experience. He had his phone service cut and went to China Mobile to have it reconnected, but was referred to the Public Security Bureau. The PSB instructed him to delete all “foreign communication apps,” including Facebook and Skype, and to submit his phone for inspection before it could be re-activated.

It was clear that the network could tell if a phone was making use of a VPN, but the shut-downs seemed random. A little less than half of my friends got shut down, and I myself continued to use my phone without issue for another six or seven months after that. At the end of January 2016, though, my phone service finally stopped without warning or explanation. At first I thought I had just run out of credit, so I sank some money into my account to put myself back in the black. This didn’t work, so I told my company’s foreigner-handler that I had been shut down. She took me to the Public Security Bureau where they asked me to hand over my phone so they could have a look. I had changed my SIM over from my iPhone to the cheap Nokia phone I had bought in my first week in China. They observed that there were no foreign apps and told me my phone would be re-activated in two weeks. The whole visit took less than fifteen minutes.

They must have figured out I was being less than honest, though, because two months later my phone still hadn’t been reactivated. Or perhaps they lost the paperwork, or some other institutional failure caused a problem. In any case, I gave up on getting my phone service back for a time, and satisfied myself with hopping from wifi hotspot to wifi hotspot around the city for a few months.

Though there are others, these anecdotes are roughly representative of my experience in China. I’ve avoided providing too much biographical info, and/or changed a few details where they are inconsequential, to guard against the unlikely situation where this post blips on the Chinese radar.

I will conclude in a third and final post to discuss one last anecdote concerning a personal vice of mine, and the lasting echo of censorship that rings even after leaving China.

Flight of the Drones: UAVs and public space

Unmanned aerial vehicles (UAVs), otherwise known as ‘drones,’ have increased in popularity over the past decades for recreational and commercial purposes. The number of drone purchases has risen dramatically and is projected to keep rising in the years to come. This is due in part to the technology being affordable, with models ranging anywhere from $20 to $1,000 depending on the size and capabilities of the quadcopter.

This technology has been showcased in providing vital aerial perspectives for photography, which real estate agencies and freelance videographers have begun to use. Other prominent groups within society have begun to incorporate drone use into their repertoires.

Companies such as Amazon seek to use drones for package delivery; law enforcement agencies have begun to see the benefits of using drones, especially when it comes to crime scene photography and search and rescue missions; and now journalists have also begun to add drones to their arsenal for better reporting of events.

The potential benefits of this technology are almost limitless but this can also be said for its potential consequences.

To begin with, it helps to understand what ‘drones’ really are. Fundamentally, the technology is just a platform with propellers; because of this it can be fitted for just about any task, and its aerial mobility proves advantageous in accomplishing the job.

Since 1911, humanity has witnessed the power and control that come with air superiority. Left unchallenged and unchecked, control of the skies allows for unhindered use of drone technology. Those with the power to do so will use it, projecting their superiority to those subjected to the drone’s presence, all while operating the device from a safe, secure location, often away from the scene.

Though it may seem that this technology is relatively new, it has been around since the late 1800s, when air balloons were used as bombs to target enemy cities. The technology really got going during World War I, when Dr. Elmer Ambrose Sperry’s work on gyroscopes allowed for more precise, targeted strikes. The technology subsided for a while and was picked up again in the 1950s, this time serving surveillance and reconnaissance roles in military operations.

As they’ve continued to develop, drones have transformed into hunter-killer devices in wartime settings and are now being operated for police surveillance in domestic areas. However, the lineage of these devices is not so clear cut: hobbyists’ remotely piloted airplanes have also contributed to the technology’s sudden rise in recreational settings.

Now, technology itself is neither good nor bad; it is typically taken as neutral. What is important is who is using the technology and for what purpose, coupled with how the intended target perceives the drone. This is where the debate and controversy lie. With the constant maneuverability and all-encompassing visual surveillance that drones are capable of, questions emerge: who really benefits from this technology when it is in use? Which groups benefit when the police use them? When corporations use them? When journalists use them?


With the emerging trend of police drones, it will be no surprise to see security rhetoric used as a tool to convince the populace that we need drone surveillance in order to feel safer and to make it easier to catch the ‘criminal.’ This is exactly what was done with the implementation of CCTV (closed-circuit television) cameras. In reality, drones may be used as a tool of control that benefits the rich and powerful and is turned on the disenfranchised, resulting in particular communities and populations being disproportionately exposed to police surveillance.

They are also a hindrance in public space, and can be interpreted as a violation of our privacy rights.

Drone technology has the capacity to violate those rights. However, this violation can become obscured by the shaping of public understanding mentioned above. This illusion of security is known as ‘security theater,’ a concept coined by Bruce Schneier to describe countermeasures that are set in place to provide the feeling of improved security while doing little or nothing to actually achieve it.

And what is surprising about all of this is that it is already happening, just on another medium: data is routinely collected on everyone. Government and corporate agencies can easily identify your behaviors and whereabouts, stored in carefully constructed profiles, all by analyzing the data from your phone or computer.

A drone’s visceral, visible presence amplifies how we perceive its impact on our privacy. The thought of being watched and the loss of control over surveillance put individuals in a state of unease. It’s right there, potentially watching you, and it may require a lot of effort to do something about it.

The drones are here and their flight has begun. However, it will be important to note who is using the devices and for what purposes. The use of such technologies is often a reflection of their users.

Guest Blog: Brandon Rodrigues

 

The Mythology of Pokémon Go: Surveillance, Big Data, and a Pretty Sweet Game

Pokémon Go is lulling the world into a humongous augmented distraction, a distraction that is covering up some pretty intense politics. It is almost as if we stepped into Ernest Cline’s Ready Player One, where distraction through virtual reality meets the war between anonymity and surveillance.

Artist: Dani Diez. You can find more of Dani’s work at www.instagram.com/mrdanidiez/

It has been well publicized that this new app, which is fueling a Pokemania (a nostalgic resurgence of interest in Pokémon every time a new rendition of the game is released), has some rather arbitrary and invasive access to your mobile phone’s data—particularly, unfettered access to your Google account and other features of your mobile device.

What is Pokémon Go? This question almost seems pointless now, given the popularity of the game, but here it is for those of you who have not tuned in to the pokemania. Pokémon was a TV show released in the late 90s, which became dream fuel for a generation of children and young adults. It featured a young boy, Ash Ketchum, who embarked on a journey to capture Pokémon in a technology known as the “pokeball,” under the direction of the Professor (a man who studies Pokémon). After a Pokémon is caught, the young boy (and the thousands of other Pokémon trainers) aspires to train it to battle other Pokémon.

Shortly after the show caught on, Nintendo released Pokémon Red and Blue for the Game Boy. These games became an absolute hit. I remember walking to school with my eyes glued to my little pixelated screen—traversing roads and dodging cars while battling with Pokémon and trading them with my schoolyard peers. The game’s slogan repeated through my cranium: “Gotta Catch ’Em All”.

Nintendo has continued to release Pokémon games for its various platforms up to the present. Each successive game ignited an obsessive and nostalgic excitement that took over the gaming community, or at least anyone who had grown up playing Pokémon Red and Blue and collecting the Pokémon cards.

Pokémon Go is a game played on a mobile smartphone; it uses geolocational data and mapping technologies to turn the phone into a lens peering into the Pokémon world. Through the interface of your mobile device, you can catch Pokémon wandering the “real” world, battle through gyms, and find items that will aid your journey. It augments the world around the user so that everything and everywhere becomes a part of the game.

Just like its predecessor, a game known as Ingress, many of the geo features in the game were set up around important places: art exhibits, cultural or historical sites, and parks. Following the maps would lead you on a productive tour of a city’s geographical culture.

I want to explore the obsessive and nostalgic excitement through a techno-socio-cultural lens. I will unpack this critique into three parts: (1) the sociology of privacy, (2) Big data and algorithmic surveillance, and (3) the culture of nostalgia and the digital sublime.

Before I continue with this post, I want to assert that Pokémon Go is not an altogether terrible, megalomaniac, Big Brother type of game. It is enabling new ways for people to engage in the social world. Check out this sociological blog post exploring just that. However, it would be silly not to apply a critical perspective to it.

Taken from Facebook Page: https://www.facebook.com/wokemon/?fref=ts

There are some restrictions I’d like to apply to my analysis: (1) Pokémon Go is not an immature or irrelevant activity; millions of people of all ages and cultural backgrounds are playing it, meaning it has a ton of significance. (2) The people playing Pokémon Go are not zombies or passive consumers; they are very intentional and unpredictable social actors who have the ability to understand their situation.

Sociology of Privacy

One thing that boggles the minds of surveillance studies scholars is how the vast population of people using social media and mobile applications does not seem to care about the invasive surveillance embedded in everything they use.

In my own interviews of Facebook users in 2014, many of my participants claimed, “I have nothing to hide”. This pervasive mentality enables big corporate and governmental entities to gain access to and control over large swaths of data. This nonchalant attitude towards surveillance cedes massive ground in the dismantling of our right to privacy. Though such an attitude is not surprising, as the entire ecosystem of social media is set up to surveil.

David Lyon, in his book “Surveillance After Snowden”, asserts that privacy is generally seen as a natural and democratic right that should be afforded to all citizens—but admits that a problem lies in the fact that informational privacy is not as valued as bodily or territorial privacy, even though information, data, and metadata are often far more revealing than bodily or territorial surveillance.

Lyon notes three important points about privacy that are all very relevant to the current epidemic of pokemania: 1) the collecting of information has now been directly connected to risk mitigation and national security, implying that we are not safe unless we are surveilled. 2) Everyone is now a target of mass surveillance, not just the criminal. 3) Data collected through mass surveillance is made to create profiles of people—these may be completely inaccurate depending on the data collected, but you will never know the difference.

I would like to add a fourth: how can the data be used to swing massive profits? The corporation Niantic, creator of Ingress and Pokémon Go, uses its privacy policies to legitimate the “sharing” (read: selling) of data with governments and third-party groups. Government surveillance is often the focus of criticism; capitalist corporations, however, are not often held accountable to ethical practices. Who is selling this data? Who is buying this data? And what is this monetized data being used for?

As Lyon asserts, privacy is not just about individual concerns—it is important socially and politically for a well-balanced democracy. Edward Snowden has been known to say, “It’s not really about surveillance, it’s about democracy”. While we continue to allow powerful groups to chip away at our privacy for entertainment, we give up our ability to criticize and challenge injustice.

Snowden reminds us that when we give up our democracy to the control room—there is zero accountability, zero transparency, and decisions are made without any democratic process.

So while we are distracted trying to catch a Snorlax at the park, we are giving away more and more of our lives to mysterious and complicated groups that want nothing but large profits and control. For a much more scathing review of this, see this blog post on surveillance and Pokémon.

Big Data and Algorithms

So what about the data? What is big data? First off, it’s all the rage right now, as data scientists, social scientists, policy makers, and business gurus scramble to understand how to use, abuse, and criticise such a thing. Big data sits at the intersection of two large disciplines—statistics and computer science. It is the collection and analysis of unthinkably large amounts of aggregated data, carried out largely by computer software and algorithms.

Boyd and Crawford (2012) offer a much more precise definition. They assert that Big Data is a “cultural, technological, and scholarly phenomenon” that can be broken into three interconnected features:

  1. Technology – Computer science, large servers, and complicated algorithms.
  2. Analysis – Using large data-sets compiled from technological techniques to create social, political, cultural and legal claims.
  3. Mythology – Widespread belief of the power of Big Data to offer a superior knowledge that carries immense predictive value.

The big problem that remains is how to find, generate, and collect all of this data. In terms of social media and video games, much of this has to do with offering a “free” service to consumers, who take on the role of the “prosumer”. The prosumer is a social actor that both produces and consumes the commodity they are “paying” for.

In terms of social media (like Facebook), while users interact with each other, they produce affective or emotional data through liking, sharing, and discussing things; this data is then collected by algorithms and fed back into the system through targeted advertisements. The user is implicated in both the production and consumption of that data.

The user is given free access to the social media platform; however, they pay for it by giving the platform a transparent window into their lives, which is then monetized and sold for large profits. People’s reactions to this form of surveillance vary: some offer scathing criticisms, others don’t give two shits, and some act a little more cautiously.

Why is this important for Pokémon Go? Because you trade your data and privacy for access to what Pokémon Go has to offer. It is incredibly clever of the think tanks at Niantic—using the nostalgic Pokemania to usher users into consenting to ridiculous surveillance techniques.

It gets worse. As Ashley Feinberg from Gawker identified, the people responsible for Niantic have some shady connections to the international intelligence community, causing some in the surveillance studies field to fear that Pokémon might just be an international intelligence conspiracy (it sounds crazy—but it makes complete sense).

David Murakami Wood coined the concept of “vanishing surveillance”. This is a phenomenon, intentional and unintentional, that allows the surveillance capacities in devices to fade into the background, resulting in users not being aware, or at least not completely aware, that they are being watched. Pokémon Go, an innocent video game that is enabling new ways of being social in public, becomes an invisible surveillance device that may have international and interpersonal consequences. And it is the Pokémon themselves that allow the surveillance to vanish from sight and mind.

A Culture of Nostalgia


So what drives people to consent to all of this? What kinds of cultural patterns allow and shape us to an almost fanatical state when a Pokémon game is released?

The first factor within the culture of Pokémon is its appeal to nostalgia. Jared Miracle, in a blog post on The Geek Anthropologist, talks about the power of nostalgia. It taps into the childhoods of an entire generation—it even moves outside the obscure boundaries of gamer culture into the larger pop cultural context. It wasn’t only geeks that played Pokémon. It was just about everyone. This might provide an explanation for why so many people are wandering around with their cell phones before them (I’ve seen them wandering around Queen’s campus today, while I was also wandering around).

However, it is not all about nostalgia. I believe that the nostalgia plays a role in a bigger process of the digital sublime and the mythologizing of the power of media.

What is a mythology? Vincent Mosco, in his book The Digital Sublime, defines myth as “stories that animate individuals and societies by providing paths to transcendence that lift people out of the banality of everyday life”. Myths are a form of reality that represents how people see the world from the perspective of everyday life.

Myths are also implicit in power. “’Myth’ is not merely an anthropological term that one might equate with human values. It is also a political term that inflects human values with ideology… Myths sustain themselves when they are embraced by power, as when legitimate figures… tell them and, in doing so, keep them alive”.

These myths, along with nostalgia for Pokémon paraphernalia, generate the digital sublime: a phenomenon that has us go head over heels for new technology. The mythologies that support it can be positive or negative.

Positive mythologies might sound a little like this: “Pokémon Go is allowing us to leave our homes and experience the world! We meet new people and we are empowered by new ways of interacting with each other. Hurrah!”.

Negative Mythologies are also important: “Pokémon Go is creating a generation of zombies. People are wasting their time catching those stupid Pokémon. They are blindly and dangerously wandering around, falling off cliffs, and invading private property. Damn those immature assholes”.

Both of these mythologies cross over each other to colour the experiences of those who play and those who watch.

We need to be careful of generating mythologies about the capacity for games to facilitate freedom, creativity, and sociality. We also need to be careful not to apply too much criticism. Such mythologies not only create a basic, overly simplistic way of understanding gaming, surveillance, and human culture; they also blind us to nuance and detail that may be important to a broader understanding.

So while some people dangerously block a highway to catch rare Pokémon, walk over cliffs because they aren’t paying attention, or disrespectfully attempt to catch Pokémon in Auschwitz, there are also people who are leaving their houses to engage with the world, using Pokémon to fight depression and other mental illnesses, and creating super cool maps of rare Pokémon spots.

Drawing things together—A Political Economy of Pokémon

Don’t be so paranoid.

Taking a techno-socio-cultural perspective allows us to engage with Pokémon Go with a nuanced understanding of its positive and negative characteristics. It is possible to look at how this media creates a complex ecosystem of social concerns, political controversies, and cultural engagements with nostalgia, mythologizing, and capitalist enterprise.

Pokémon Go is indeed enabling a ton of new ways of interacting and helping people with mental illness get out of their homes to experience the world—however, we can’t forget that it is also an advanced technology developed by those who have an interest in money and power.

Regardless of the benefits that are emerging from use of this application, there are still important questions about privacy and the collection and use of Big Data.

So Pokemon Go isn’t just enabling new ways of being social with the larger world. It is enabling new ways of engaging with issues of surveillance, neo-liberal capitalism, and social control through the least expected avenues.

As all of these problematics become more and more public, will we still trade off our freedom for entertainment?

Oligoptica: Why Surveillance Isn’t Perfect

We have all likely heard of the panopticon: an architectural design for a prison, thought up by Jeremy Bentham, that was supposed to maximize surveillance capacities so that prisoners always felt as if they were being watched, even when they weren’t. It consisted of prison cells arranged around a central guard tower that could watch every move of every prisoner, all the time. However, the guard tower is made to be opaque—so the prisoners can’t watch the guards.


In 1975, Foucault borrowed this idea to illustrate his concept of disciplinary power in one of his most famous books—Discipline and Punish. The basic idea behind Foucault’s use of the panopticon is that when people feel as if they are constantly being watched, they begin to self-discipline. The panopticon can refer to a prison, but it is meant to refer to society in general, or to many of the institutions in a society. The more people feel that they are being watched, the better they behave. This watching could be done by authorities, or even by your neighbors.

Though Foucault’s concept of disciplinary power is super important to many who study sociological theory—his example of the panopticon is overused and often misleading. It does not accurately represent the nature of surveillance in contemporary society.

The idea of the panopticon better characterizes a society of “total surveillance”. A completely, balls-to-the-walls, 1984, Big Brother-type (dys)utopia. Thankfully, there is currently no technology on earth that can allow for total surveillance.  We may be a society of ubiquitous surveillance, but not a society of total surveillance.

So how do we “move beyond the panopticon”, as so many social and cultural theorists have been calling for? There is one useful theoretical framework that builds on Foucault’s work: the concept of the oligopticon, proposed by Bruno Latour in his incredibly critical arguments in Reassembling the Social.

Latour criticizes Foucault for drawing up a total surveillance “utopia” that is made of “total paranoia and total megalomania”.


He writes,

“We, however, are not looking for utopia but for places on earth that are fully assignable. Oligoptica are just those sites since they do exactly the opposite of panoptica: they see much too little to feed the megalomania of the inspector or the paranoia of the inspected, but what they see, they see well…”

 

Latour is staunchly reminding us that something that is everything is nothing at all. The panopticon is made to be too perfect. It is made to see all. It’s something that, as academics, we can’t possibly empirically record or understand. The oligopticon, by contrast, is the existence of countless scopes meant for watching: countless surveillance devices. Only together do they see everything, and because they rarely communicate, it could hardly be called “total surveillance”.

Latour continues,

“From oligoptica, sturdy but extremely narrow views of the (connected) whole are made possible—as long as connections hold. Nothing it seems can threaten the absolutist gaze of the panoptica, and this is why they are loved so much by the sociologist who dream to occupy the center of Bentham’s prison: the tiniest bug can blind oligoptica”.

However, this does not entirely rule out the panopticon. As Kitchin and Dodge assert in their book Code/Space, the power of codes and algorithms may someday be able to unite many of the streams of the oligoptica to create a menacing panoptic machine. However, due to the unstable nature of the practice of scripting code, running code, and working hardware, it is liable to bugs, errors, and absolute mutiny. So don’t hold your breath.

The panopticon, for now, has its place—but it’s a more appropriate theme for a science fiction novel than a good work of social science or philosophy. It serves as a powerful reminder of where a ubiquitous surveillance society could lead us, but not as a very good characterization of surveillance today.

The Slender Man, Legends and Cultural Anxieties

Surveillance is being called ubiquitous by most of the leading scholars who study the social, political, and cultural ramifications of surveillance technology. A focus that I have been studying and thinking about is how surveillance is understood by everyday people living everyday lives.

I do this through the lens of Folklore, the study of everyday life. Or the study of the Folk (lay-person). This is obviously problematic—as such a term equates everyday life with peasantry. So for the remainder of this post I will use the term vernacular performance (i.e. everyday performance).

I’ve written about this work in the past. One of the ways that we demonstrate our cultural anxieties and fears is through the collective performance of legend cycles. In this case, I am speaking about the boogieman of the Internet—the Slender Man.


What is a legend?

Legends are repetitive and variant, meaning people tell them over and over again, and as they are told and spread they change form while keeping a central theme. Legends are a performance between storyteller and audience, meaning that people perform legend cycles: a teller typically recounts a story to a listener or audience. This includes digital legends. Finally, legends are not constructed by the teller, but by the community. The interaction between the storyteller and the audience constructs the story and allows it to spread. It is a collective process.

The Slender Man is a creature born of the performative interactions of a group of users on the forum Something Awful. The Slender Man is a tall, monstrous figure, one that resembles a tall man in a black suit. It has no face and extraordinarily long arms. It is sometimes depicted with many moving tentacles. All of this, and its many disproportions, give it a Lovecraftian appearance: an eldritch monstrosity.

Cultural Monsters

As Tina Marie Boyer (2013) asserts in terms of the Slender Man, “a monster is a cultural construct” (246). And as such, understanding the ‘anatomy’ of a monster sheds light on the problems people face in their day-to-day existence.


What is the anatomy of the Slender Man? I decided to do some ‘fieldwork’—exploring many of the blogs/vlogs that contributed to its legendary constitution. I found three major themes: Surveillance, Social Control, and Secret Agencies. This returns us to the topic of this blog post: The Slender Man is a vernacular performance that demonstrates our collective anxieties of a culture that is under the constant gaze of massive and complicated networks of surveillance.

Surveillance

The Slender Man is known to watch its prey. It is rarely confrontational, though it seems to relish making its presence known. One scene that sticks out to me is from the YouTube series Marble Hornets: the main protagonist, after becoming increasingly paranoid about the faceless man in a business suit following him, began to leave his camera running while he slept—only to discover that the Slender Man watches from a crack in his door while he sleeps. The Slender Man watches, seemingly from everywhere—but even when it is seen, the Slender Man has no eyes to watch from. It is as if it sees everything from nowhere. The Slender Man appears and vanishes, seemingly at will, haunting victims with little to no motive. The Slender Man represents the phenomenon of ubiquitous surveillance in the virtual world: a world where anonymity and pseudonymity are quickly disappearing, a world where only the experts understand what to surveil and how to read the data such surveillance produces, and a world haunted by faceless watchers.

Social Control


The Slender Man also represents themes of social control. The most obvious instance of this is the ‘proxies’, otherwise known as the ‘hallowed’. These are people who have been overcome by the Slender Man’s will. In many instances, the telling of the Slender Man legend ends with the main protagonists going mad and disappearing. They are either killed by the Slender Man (or its minions), disappear from time and space and sometimes memory, or are turned into a proxy, meaning they lose their minds and begin to do the bidding of the Slender Man. In the blog ‘Lost Within the Green Sky’, the main protagonist Danny describes it as a form of indoctrination that slowly drains the will from its victims. Even as a proxy, once their usefulness dries up, they are often killed. This theme is not surprising, as it emerges from a cultural context known for its pervasive ability to control through silent software mediators.

Secret Agencies

The Slender Man is also known as The Operator (signified by a circle with an X through it). This name, along with the black suit it wears, makes the Slender Man a clear reference to secret agents: those organizations that haunt the Internet, forcing those who wish to remain anonymous into the depths of TOR browsers and VPN applications. The Slender Man is representative of the NSA, FBI, CIA, CSIS, KGB, and other notorious spy agencies operating with little oversight and behind a secretive veil. They are just as faceless as the Slender Man, and just as cryptic. Few understand the significance of their presence, and those who come under their haunting gaze have quite a lot to fear.

More Research


Folklore is a small branch of the social sciences.  There are few people who work beneath its flag. And fewer of those people study contemporary, digital folklore. However, this does not diminish its importance. Folklore offers us a lens to peer into how everyday people interpret the world through vernacular expression. It is an essential dimension to the surveillance studies canon. An understanding of how people interpret surveillance is essential if we are ever going to take action to educate people about its dangers.

‘Software Mediated Intimacy’ in Tinder: have we ever been in love?

Tinder has become an almost ubiquitous dating app that has attracted quite a lot of controversy even as it sits on most of our mobile phones.

While users swipe left and right searching for people they might be interested in, questions of legitimacy and authenticity in love, intimacy and dating arise. I was motivated to write this blogpost after a debate with a great friend of mine about the authenticity of love in Tinder.

Questions arose for me: If love is a social/cultural construct, is it authentic? What does Tinder do to change how love manifests itself? What has changed that we can’t right away see?

These tensions hang about within an atmosphere of heavy technical mediation that silently organizes, classifies, and structures how users interact with each other.

In an article by Fast Company, Austin Carr had the wonderful opportunity to interview Tinder CEO Sean Rad about the algorithm that measures user desirability and matches users to their equivalents. This is the ‘Elo score’ program.

Tinder users do not have access to their desirability rating; however, this rating does predetermine who gets to see whom. It brings together a large and heterogeneous set of data on users in order to measure their desirability. Users only have access to viewing profiles that share a similar score.
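
Since the ‘Elo score’ itself is a trade secret, here is only a minimal, hypothetical sketch (in Python) of how an Elo-style desirability rating and similar-score matching could work in principle. The constants, function names, and the treatment of a right-swipe as a ‘win’ are my own assumptions for illustration, not Tinder’s actual implementation.

```python
# Illustrative sketch only: an Elo-style "desirability" score with
# similar-score matching. Tinder's real algorithm is a trade secret;
# every name and constant below is an assumption, not their code.

K = 32  # standard Elo sensitivity constant (assumed)

def expected(rating_a: float, rating_b: float) -> float:
    """Probability that A 'wins' (is liked) against B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

def update(rating_a: float, rating_b: float, a_was_liked: bool) -> tuple:
    """Treat a swipe as a contest between two ratings and update both."""
    exp_a = expected(rating_a, rating_b)
    score_a = 1.0 if a_was_liked else 0.0
    new_a = rating_a + K * (score_a - exp_a)
    new_b = rating_b + K * ((1.0 - score_a) - (1.0 - exp_a))
    return new_a, new_b

def candidates(me: float, pool: dict, window: float = 100.0) -> list:
    """Only surface profiles whose rating falls near the user's own."""
    return [name for name, r in pool.items() if abs(r - me) <= window]

# Example: a lower-rated user right-swipes a higher-rated one, so the
# higher-rated profile gains a little and the swiper loses a little.
a, b = update(1500.0, 1400.0, a_was_liked=True)
print(round(a), round(b))              # roughly 1512 and 1388
pool = {"casey": 1510.0, "jordan": 1250.0, "sam": 1460.0}
print(candidates(a, pool))             # ['casey', 'sam']
```

The point of the sketch is the last function: whatever the real scoring looks like, a matching rule of this shape quietly sorts users into bands that mostly only ever see each other.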

Algorithms are active mediators in the construction of networks of intimacy. I would like to call this ‘software mediated intimacy’. Before we explore this concept we need to understand the basics and histories of two concepts: actor-networks and intimacy.

On actors, networks and the silent power of algorithms

Actor-networks are a set of conceptual tools originally devised and made popular by science and technology studies scholars Latour, Callon, and Law.

The most fundamental distinction of this school of thought, sometimes referred to as actor-network theory (ANT), is the principle of generalized symmetry. This concept holds that humans and nonhumans (atoms, systems, computers and codes) are equal in their ability to shape the intentions of actors in networks of associations.

An actor, which can be human and/or nonhuman, is made to act.

This act is always done in relation to other acts.

So actors, together acting, create actor-networks. Humongous, unpredictable webs of associations.

This model of understanding society through complex, overlapping networks of humans and nonhumans is very useful for understanding how actors shape each other over social media platforms.

There are countless nonhumans working in sprawling actor-networks that shape, structure, and mediate how we interact with others online. Most of this remains hidden within our electronic devices. Strings of code and wires affect us in ways that are sometimes more meaningful than our interactions with humans.

Some ANT scholars call this the blackbox—where a vast actor-network becomes so stable it is seen to act together and it becomes a node in other, more expansive actor-networks. Think about how many blackboxes exist in the many components of your mobile devices.

I would go as far as to say that most Tinder users are unaware how much their romances are mediated by algorithms. These algorithms are typically trade secrets and the musings of Science and Technology scholars—not quite specters in the imaginations of users.

On the social construction of love and intimacy

Love is not universal, it is a complex set of changing associations and psycho-socio-technical feelings that are tied to space, time and historical circumstance.

Anthony Giddens, in his book The Transformation of Intimacy, demonstrates the social and cultural influences that actively construct how we understand, perceive, and pursue, love and intimacy.

In the pre-modern past, relationships were forged through arranged marriages that were mainly tied to economic priorities. This is something that Foucault, in his work The History of Sexuality, called the deployment of alliance—in other words, the creation of social bonds and kinship alliances that shaped the distribution of wealth and ensured group and individual survival.

In a complicated move that I am unable to detail here—the deployment of sexuality emerged that changed the very nature of the family. It also led to the love and intimacy that we know today—a social bond that is not necessarily tied to the family unit.

Giddens continues: as Western societies began to experience the characteristics of modernity, intimacy and love began to emerge alongside it. Romance emerged when the family institution of premodernity collapsed.

Why am I talking about this? I bring up the work of Giddens and Foucault in order to illustrate that love and intimacy are constantly shifting social and cultural constructs. Every generation experiences love and intimacy differently.

This is probably the reason why your parents scowl about your one night stands and the fact that you haven’t given them grandchildren yet.

Love and intimacy is going through a massive shift right now.

‘Software mediated intimacy’—algorithmic love

In his essay A Collective of Humans and Nonhumans, Latour speaks about our progression into modernity as a deepening intimacy between humans and nonhumans. An integration between humans and their technology. A folding of all sorts of intentions and agencies.

In this case, as humans become further enveloped in their associations with mobile devices and ubiquitous social media, most of our interactions become mediated through strings of code, complex algorithms, and digital hardware.

In the case of Tinder—the algorithms are mediating and structuring how we meet and communicate with people who may become love interests. Of course, these algorithms don’t exactly control such love. Human actors enter into associations with the Tinder actor-network with all kinds of intentions and expectations.

Some actors want a fling, a one night stand. Others might want a long-term relationship. Others may even just want attention or friendship.

So even though Tinder is sorting who gets to talk to whom—heterogeneous and constantly shifting clusters of intentions are continually shaping how Tinder-based relationships will turn out.

And it’s not perfect. Love and intimacy falter and wilt just as much as they blossom.

But that’s not my point. ‘Software mediated intimacy’, like all of the forms before it, is another form of socially and culturally curated love and intimacy. So it’s not authenticity that is the primary question here. All forms of love emerge from some degree of construction or performance. There is no universal standard.

However, what is different, and what is often not seen, is the degree to which love can be engineered. Computer scientists and engineers, at the behest of large (or small) corporations, embed particular logics and intentions into the algorithms they construct.

As Dodge and Kitchin remind us, in their book Code/Space, this is not a perfect product of social control—but a constant problem to be shaped. Even so, it is disconcerting that so many users are being shaped by silent human and nonhuman mediators.

Tinder’s algorithm and the ‘Elo score’ are a trade secret, not opened up to the public or to academe. So I am left scrambling for an answer that only raises more questions.

‘Software mediated intimacy’ can offer us a new and novel way to construct relationships, whether they are temporary or sustained. It is a form of social interaction that is just as authentic as prior forms of love and intimacy. However, the codes and algorithms and the complexities of computer science and statistics make it overwhelmingly easy for corporations to shape users based on private interests.

We must learn how these processes work and how they might be democratized so that the user may benefit as much as the capitalist who produces and hoards such software. We must embrace ‘software mediated intimacy’ in order to learn about it and master how it works, so we might sidestep the potential and precarious exploitation seeping through our engineered social bonds.

Note: This blog post will be followed by a more in-depth analysis of this understanding of Intimacy. I am seeking to expand this into a larger, theoretical project. Any comments, criticism, or thoughts would be incredibly useful.

Visibility and Exposure at the Margins

I had a recent run-in with the public eye because of an op-ed article I wrote for the Queen’s Journal: a controversial exploration of the limits of free speech concerning the ability of the (in)famous Queen’s Alive (anti-abortion) group to table and canvass for supporters on campus.

This led to a public debate and lots of discussion. It also led to a ton of mudslinging and attempts at public smearing.

I had also experienced this in the past, when another advocate for queer rights and I filed a human rights complaint against a magazine for publishing an unsavory article illustrating a scathing hatred for queer folk (they referred to us as evil and pagan). I was waist deep in understandably complex, multidimensional, and very contested discourse. These discourses led to unpleasant hate messages and full transparency in the provincial (and to some degree, international) media.

That is not the topic of this blog post. What I would like to discuss isn’t the status of free speech or the unpalatable existence of hate speech. Rather I am interested in the intense visibility that activists (and others) are exposed to through unpredictable new media interactions. These interactions are typically escalated and amplified by the Internet. This is a dimension of contemporary surveillance not conventionally covered by many academics. It is the subjects of surveillance that I would like to explore.


Social Media: Moving beyond the Luddite trope

Social media is neither good nor bad, though this doesn’t mean it’s necessarily neutral, as it certainly has the potential to exploit and empower. Nicole Costa’s rendition of her experiences and tribulations with Facebook in her recent article, My online obsessions: How social media can be a harmful form of communication, was incredibly touching. Her refusal and resistance to appearing on and contributing to the Facebook community is empowering. However, I believe it is also misleading. Social media and digital exchange and interaction are here to stay (save for some cataclysmic event that knocks out the electrical infrastructure), and because of this I believe that we need to learn how to engage with them productively and ethically. We need to engage with social media in a way that doesn’t jump straight into a moralizing agenda. By this I mean depicting social media either as the savior of humanity or as a dystopian wasteland where people’s communication collapses into self-absorbed decadence.

How do we maneuver this politically charged, landmine-addled cyberspace? First we need to recognize that a great number (in the billions) of the human race use social media (of all sorts) for many reasons. However, this is far too broad, so let’s focus on Facebook. Facebook is among the most popular social media platforms, with over 1.5 billion users and growing. It is built into the very infrastructure of communication in the Western world. If you have a mobile phone, you very likely have Facebook. You might even use Facebook’s messenger service more than your text messaging. Facebook allows us to share information, build social movements, and rally people together in all sorts of grassroots wonders. As an activist, I’ve used Facebook to run successful campaigns. Why? Everyone uses it, and because of this, it has the power (if used correctly) to amplify your voice. Facebook, and most social media, can be very empowering.

But hold your horses! Facebook is still terrifyingly exploitative. Its access to your personal data and metadata is unprecedented. Furthermore, it actively uses the data that you give it to haul in billions of dollars. Issues of big data and capitalism are finally coming to the forefront of academic and popular discussion, but the nature of such complicated structures is still shrouded in obscurity. The user sees the interface on their computer monitor. But Facebook sees electronic data points that represent every aspect of its users in aggregate. Through elaborate surveillance techniques, these data points are collected, organized, stored, and traded on an opaque big data marketplace. Furthermore, the user is not paid for their (large) contribution to the product being sold. They are exploited for their data and their labour—as everything you do on Facebook is part of the data that is commodified and sold.
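
To make the idea of data points being aggregated into profiles a little more concrete, here is a purely illustrative sketch. The event names, weights, and topics are invented for demonstration; this is not how Facebook’s systems actually work, only the general shape of turning interactions into a sellable interest profile.

```python
# Purely illustrative: aggregating a stream of interaction "data points"
# into an advertising-style interest profile. All names, weights, and
# topics are invented; this is a sketch of the idea, not a real system.
from collections import Counter

events = [
    {"user": "u123", "action": "like",  "topic": "running shoes"},
    {"user": "u123", "action": "share", "topic": "marathon training"},
    {"user": "u123", "action": "click", "topic": "running shoes"},
    {"user": "u123", "action": "like",  "topic": "protest event"},
]

# Hypothetical weights: some actions are treated as stronger signals.
weights = {"like": 1.0, "share": 2.0, "click": 0.5}

def build_profile(events):
    """Sum weighted interactions per topic to form an interest profile."""
    profile = Counter()
    for e in events:
        profile[e["topic"]] += weights.get(e["action"], 0.0)
    return profile

profile = build_profile(events)
print(profile.most_common(2))
# [('marathon training', 2.0), ('running shoes', 1.5)]
```

Even this toy version makes the asymmetry visible: the user experiences ‘liking’ and ‘sharing,’ while the platform accumulates a structured, queryable record of interests that can be sold or matched against advertisers.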

At the same time Facebook (and other prominent social media platforms) allow for an unprecedented freedom and speed of communication. They have been embedded into our everyday ways of socializing with each other. New social media have become an invaluable and ubiquitous social resource that we engage in from the time we wake to the time we sleep. It has been used to organize events, rallies and protests. It is used to keep in touch with distant family and friends.  It is used for romance, hatred, companionship, and debate. Facebook is playful and empowering.

So if you are like me, then you may be absolutely confounded about how to resolve the tensions between Facebook (and other social media) being at the same time exploitative and empowering. We have gone too far down the rabbit hole of social media and digital communication to merely refuse to use it. It is now an intimate part of our social infrastructure. Those who resist through refusal may find themselves at multiple disadvantages in how they engage with the world. My own ethnographic research into why users refused Facebook showed that those who abandoned Facebook may have felt empowered by overcoming the “addiction” of social media; however, they also felt excluded and alone. And it must be noted that almost everyone I talked to who had quit Facebook is now using it again. So clearly, refusal to use these services is not enough to meaningfully challenge the problematics of social media.

The Luddites were historically textile workers who opposed the invasion of machines into their workplace, machines they figured would gouge away at their wages. Today, the term is used for those who refuse to use certain technologies. In the realm of social media, a Luddite resistance has proved incredibly ineffective. It is also important to note that this sort of refusal obscures ways of meaningfully resisting mass surveillance and the exploitation of user data.

I propose the complete opposite: the path of knowledge. We need to learn how to maneuver through social media and the Internet in ways that give us access to anonymity, ways of asserting our right to anonymity. This is critical. We need to mobilize, teach, and learn through workshops. We need to scour the Internet for free resources on the technical workings of social media. We also need to spread awareness of this double-edged nature of social media. It is no use to take a stance of refusal, to ignore the importance of social media, and thus remain ignorant of how it all works. When we do this, we actually empower these large capitalist corporations to exploit us that much more. The less we know about the calculus of social media and how it works at the level of algorithm, code, and protocol, the better the capitalists are at disguising and hiding exploitation.

Science Fiction, Mixed Media, and Surveillance

For those of us who have been reading science fiction for some time now, it becomes clear that SF has a strange propensity for becoming prophetic. Many of the themes in science fiction classics are now used as overarching metaphors in mainstream discussions of surveillance. Most notable among these are Orwell’s Big Brother, Huxley’s Brave New World, and Kafka’s The Trial. Other common touchstones include Minority Report, Ender’s Game, and Gattaca.

Though I am not trying to claim that these classics aren’t good pieces of SF literature, they may not do a superb job of covering the issues implicit in contemporary surveillance. Imagine George Orwell coming to the realization that the Internet is one humongous surveillance machine with the power of mass, dragnet surveillance. Or imagine Huxley’s reaction to the lulling of consumer affect through branding and advertisement. The power of surveillance tools to control and shape large populations has become a prominent and dangerous feature of the 21st century.

As Richard Hoggart says,

“Things can never quite be the same after we have read—really read—a really good book.”

So let’s stop recycling old metaphors (if I read another surveillance book that references Big Brother or the Panopticon I’m going to switch fields). Let’s look at the work of our own generation of writers and storytellers. What I think we might find is a rich stock of knowledge and cultural data that could shed light on our (post)human relationship with advanced technology.

The reason I am using mixed media, as opposed to focusing on a single medium, is that I believe our relationship with media is not limited to one or the other. Novels, movies, video games, graphic novels, and YouTube videos all offer us something in terms of storytelling: part entertainment, part catharsis, premised on and constructed through engagement with the story. Our generation of storytelling has shifted into the realm of mixed media engagement. What follows are some stories that I think are critically important to understanding the human condition in our own generational context.

P.S. They are in no particular order.

Disclaimer: Though I have tried to be careful not to give away any critical plot or character points, watch out for spoilers.

SOMA 


SOMA is a survival horror video game released by Frictional Games, the developers of Amnesia (another terrifying game). It is a 2015 science fiction story that both frightens you and imparts an existential crisis as you struggle to find “human” meaning in the fusion of life and machine. After engaging in a neurological experiment, the main protagonist, Simon Jarrett, wakes up in an abandoned underwater facility called PATHOS-II. Instead of people, Jarrett finds himself trapped in the company of both malicious and benevolent robots, some of which believe they are human. The interesting overlap with surveillance here is the focus on neurological surveillance. Scientists (in and out of the game) transform the biological brain into a series of data points that represent the original, hoping to predict or instill behavior, or, in the case of this game, to transform human into machine by literally uploading the aggregated data points of the brain to a computer. The game instills a constant question: is there any difference between human consciousness and a copy of human consciousness? SOMA is more than just a scary game; it is a philosophical treatise on the post-human told through an interactive story.

Ready Player One


Ready Player One is a novel by Ernest Cline that covers a wide breadth of themes, most notably the uneasy relationship between surveillance and anonymity, visibility and hiding. Cline constructs a world that doesn’t seem very far off from our own: a world where people embrace simulation through virtual reality (VR) as environmental disaster plagues the actual world. People hide in the sublime. The VR game OASIS, a world of many worlds, is home to many clever pop culture references, mostly music, video games, and movies, with an extra emphasis on science fiction. Embedded in this world of worlds are several “Easter eggs” (surprises hidden in video games) that act as a treasure trail to the late OASIS founder’s fortune and ultimate control over the virtual world. Anonymity is the norm of OASIS, a utopian world where the original, democratic ideal of the Internet is realized, a place where anyone can be anybody, without reference to their actual identity. However, this world is jeopardized as the corporation Innovative Online Industries is also searching for the Easter eggs, hoping to take over OASIS and remake it to generate capital. The theme of anonymity versus mass surveillance for profit is arguably a major fuel for global debate, as all “places” of the Internet are surveilled in increasingly invasive ways. Anonymity has almost disappeared from the Internet, replaced with quasi-public profiles (Facebook and Google+) that exist to make billions of dollars off of people’s identities and user-generated content. The original dream of the Internet, sadly, has failed.

Nexus


Nexus is a science fiction novel by Ramez Naam that follows characters engaged with a new type of “nano-drug” which restructures the human brain so that people can connect mind to mind. There are those who support the drug and those who are against it, and this conflict is followed by a flurry of espionage that exposes the characters to incredible dangers. The theme of surveillance in Nexus follows a new fixation on neuroscience: the ability to surveil the essential, bio-chemical features of the human mind, as well as the exposure of mind and memory to others participating in this new psychedelic (psychosocial) drug. This is a level of exposure that far exceeds our experience with the Internet and social media; imagine being hardwired into a computer network. The book also follows traditional surveillance themes as the main character, Kaden Lane, becomes entangled in a conflict between private corporations and state governments.

The Circle

Social media in the 21st century has positioned Western society within a context of visibility and exposure. Most people are simultaneously engaged in self-exposure and participatory surveillance, as we post content about our own lives and browse and read content about the lives of our friends and family. The Circle by Dave Eggers works this theme through a character named Mae Holland, who has just been hired by the world’s largest IT company, located on a campus called the Circle. The Circle is a place, much like a university campus, with literally everything on it. It borders on utopia, a place where work and play blend. However, following the mantra “All that happens must be known,” social media penetrates the lives of those who exist in the Circle in pervasive and exposing ways. Very quickly, the utopian illusion slips away into dystopia.

Slenderman


Slenderman was, in its bare-skeleton form, introduced to the Internet by Eric Knudsen on the (in)famous Something Awful forum board for a paranormal photo-editing contest. Within a year, however, Slenderman was sucked into a collective narrative construction across all media platforms. People blogged about it, tweeted about it, YouTubed about it. A massive, ever-changing (and unstable) urban legend (or fakelore) was constructed in the chaos of cyberspace. Slenderman, the paranormal creature, can be described as a tall man with unnaturally long arms and legs (and sometimes tentacles), wearing a black suit, with no face. It is usually depicted as a creature who watches, in other words, surveils. It watches from obscure places, slowly driving its victim to paranoia and insanity. Then the victim disappears without a trace. Slenderman is the contemporary boogieman. But it also shares a narrative with dangerous, obscure, and mysterious secret police and intelligence agencies. As Snowden revealed to the public, governments, through mass surveillance techniques, watch everyone and everything. Could the Slenderman narrative be telling of a deep-seated cultural fear of government surveillance in the 21st century? There are many ways to tap into this story: blogs, Tumblr accounts, and Twitter accounts, but also YouTube series like Marble Hornets, EverymanHYBRID, and Tribe Twelve. Also check out the genre called creepypasta for an extra home-brewed thrill.