Category Archives: Internet Culture

Musings of an (a)social collective: Anonymity and Community


Anonymous communities can easily be mistaken for a thick mess of senseless social interactions. At least, that is how I saw this world when I first decided to study anonymous communities for my Master’s thesis. I thought I would study how surveillance operates in anonymous social media applications—specifically, a very popular (at the time) application called Yik Yak.

Just a side note: Yik Yak had gone into a sudden bout of madness and removed the ability to be anonymous from its application. After a near-complete revolt of its user base (just about everyone left), it switched back. But the feed is still a smouldering ruin of regret and nostalgia. To simplify this argument for the sake of a blog post, let’s pretend that the application did none of this. Let’s work with an ideal form: an anonymous community.


When I first downloaded the app, a month before I decided to dedicate two years to it, my roommate had convinced me to check it out. A seemingly infinite central feed of anonymous comments, sorted by a slurry of up-votes and down-votes. The Yak feed is tied to a geolocational system that connects the app to particular locations. My Yak was the Queen’s University Yak. It was a busy feed. And it was constantly changing. To me, it seemed a chaotic and nebulous tangle of associations. A fun challenge for a scholar following an Actor-network-inspired philosophy.

The popular posts stood out from the unpopular posts through the upvote/downvote feature. It was kind of like a mash-up of Twitter and Reddit with a touch of anonymity.

After a stint of digital ethnographic work and a ton of interviews with enthusiastic and committed users, I began to see something else. Something that, as an outsider, was invisible to me at first: an elaborately balanced Yik Yak community. As Gary T. Marx asserts, anonymity is entirely a social process. The only way for anonymity to occur is through a faceless interaction with another faceless person. This includes social regulations, exploitations, and oppressions. But also playfulness and a culture of care.

I would like to play with a concept I’m thinking of called (a)social. The ‘a’ can be used as a negation. The ‘a’ can also represent anonymity. But mostly, the ‘a’ will be used to approach a society which remains almost entirely faceless: a community of people interacting around nothing more than posts from others who occupy a similar space and similar cultural values.

Though I have major problems with the corporate side of Yik Yak, with its capitalist motives and try-hard branding schemes, the application has facilitated the construction of an elaborate community. It has created an (a)social experiment: a community that contains both a culture of trolling and a culture of care.

All things are a collective endeavor. The (a)social communities are no exception. In her most recent philosophical publication, Staying with the Trouble, Donna Haraway discusses her concept of sympoiesis—a collective unfolding of reality. This collective includes everything: all the human, inhuman, and nonhuman components threaded into the collective mess.

When we load up Yik Yak on our mobile phones and post snippets of thought to the main feed (or engage in grueling arguments over all controversies in the comments)—we work with silicon, wires, code, telecommunication companies, algorithms, molecules, humans, bots, and entire scaffoldings of bureaucracies, legal frameworks, and governments. Interacting with the Yak spans the world over.

Furthermore, the Yak platform allows particular functions and blocks others, shaping its users to interact in particular ways. The company imposes standards through its Code of Conduct, which it enforces with algorithms that look for offensive keywords. And it sometimes changes everything in an update (as when it removed its main feature, anonymity). These are the institutional forces that shape and provide stability to the community.

However, I have noticed that there is something more powerful at work in maintaining the community. The mess of interactions from users balances out particular norms and ways of acting. This is done through both the comments section and the up-vote/down-vote feature. These are the vernacular forces that generate norms and cultures. Certain topics, perhaps offensive topics, are down-voted (a score of -5 from votes deletes the comment from the feed). This vernacular power, though institutionally enabled, allows for a regulation of trolls and bullies without Yak’s employees ever having to get involved.
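The mechanics of that vernacular moderation can be sketched in a few lines of code. This is a hypothetical model, not Yik Yak’s actual implementation: the -5 threshold comes from the observation above, but the class names and voting interface are my own invention.

```python
# A toy model of vernacular moderation: every vote adjusts a post's
# score, and a post whose score drops to -5 is removed from the feed
# without any employee ever getting involved.

DELETE_THRESHOLD = -5

class Post:
    def __init__(self, text):
        self.text = text
        self.score = 0
        self.visible = True

    def vote(self, delta):
        """Apply an up-vote (+1) or down-vote (-1) from a community member."""
        if not self.visible:
            return
        self.score += delta
        if self.score <= DELETE_THRESHOLD:
            # The community, not a moderator, deletes the post.
            self.visible = False

def feed(posts):
    """Return the visible posts, most popular first, as the main feed does."""
    return sorted((p for p in posts if p.visible),
                  key=lambda p: p.score, reverse=True)
```

The point of the sketch is that the rule is institutionally enabled (the platform hard-codes the threshold) but vernacularly exercised: which posts cross the threshold is decided entirely by the crowd.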

(a)social sympoiesis initially looks like a senseless and dense knot of relations. It’s noisy and confusing. But once, as an ethnographer, you begin the arduous work of untangling these associations, it begins to look like every other community. Despite all of the contradictions, despite the arguments, the controversies, and the confusing faceless interactions—the Yak community is able to balance out, stabilize, and “hang together” as a coherent whole.

Such an (a)social collective is not shielded from the larger world, though. When, for whatever reasons or motivations, Yik Yak decided that its users didn’t want to be anonymous and forced every user to adopt a handle (suggesting they link their Facebook page), the entire community collapsed. All that is left are groups of Yak “refugees” with nowhere to go but to be visible to the world.

Xinjiang: Internet Censorship Laboratory (Part One)

I recently completed eighteen months of living in China’s far-western province of Xinjiang. As part of the coming-home process I contacted Kyle and offered to write a brief account of my experience in the ‘internet censorship laboratory of the world.’ What follows is a whirlwind of thoughts, opinions, and personal anecdotes that I will be the first to admit require much fact-checking and cross-referencing. Please consider them pages torn from my personal journal and shared with readers of Socionocular for their curiosity value.

One random day in mid-2014 three of my soon-to-be coworkers received text messages from the propaganda bureau of Ürümqi, the capital of Xinjiang Uyghur Autonomous Region in far western China. The messages reminded them that foreigners weren’t to be trusted, and that they must not share secrets with outsiders.

Which foreigners were these good Chinese citizens supposed to be wary of? And what secrets did three English teachers possess that could possibly compromise the safety of the nation? When I later asked these questions, I betrayed my newcomer status. I would eventually conclude that all foreigners are suspect, especially in Xinjiang, and that the point is not so much to safeguard secrets as to maintain an atmosphere of low-grade xenophobia.

The question that possessed my local friends was more pointed: why them? Broadcast text messages signed by the propaganda bureau weren’t uncommon, but this message was specific in its content and its recipients. For one, even though there were numerous foreigners working out of that office, only three of the more than two dozen Chinese staff got this particular message. As they chatted about it over lunch, they tried to work it out. One girl was dating a foreigner; the other was sleeping with one; the third was very close to a foreigner in a chaste, conservative Christian un-relationship that everyone could see through. But other staff had been just as close with foreigners before. Besides, who would have been interested but inconspicuous enough to report these various liaisons to the propaganda bureau? And why would they bother?

The conclusion they eventually arrived at was that all three had used their ID cards to buy a SIM card for ‘their’ foreigner. That was the link.

And the phone company and Propaganda Bureau were evidently watching closely enough to notice.

To sign up for social media in China, most popular services require authentication using a mobile phone number. In order to get a mobile number, one must register a government ID with the phone company before being given an activated SIM card. If the pieces fit together correctly, anonymity is impossible on the Chinese internet. While I have friends who assure me that one can sever a link in this chain elsewhere in China, it is much more difficult in Xinjiang.

The reason, I suspect, is that the stakes are higher in Xinjiang for the government, and so the fist is a little tighter. Like Tibet, Xinjiang is an autonomous region principally populated by China’s minorities, not the majority-everywhere-else Han. The Uyghurs who lend their name to the Xinjiang Uyghur Autonomous Region are a majority-Muslim Turkic ethnic group who share neither language nor culture with Beijing. The history of the region is complex and contested, and supporting the wrong narrative or questioning the ‘right’ one is considered subversion.

Twentieth-century Xinjiang was marked by episodes of pan-Turkic and separatist thought. Two abortive independent states, both called East Turkestan, were declared in the past century. Both collapsed quickly. In the 21st century, Beijing has bundled separatism with extremism and terrorism, labeling them ‘The Three Evils’ which must be opposed at every level of society. The official line, packaged with China’s notorious control over the mainstream media, has had the result of conflating the three evils with one another.

The result of the party’s stranglehold on most of the news media in China (if you’re curious, read The Party Line by Doug Young) is that the really interesting stuff is happening online. In China, the internet and social media have become something of a haven for off-message thinking, mostly in the form of jokes. As mentioned, true anonymity is difficult on Chinese social media, but the Chinese language’s rich capacity for puns has been used as a tool to avoid automated censorship and make subtle jabs at those in power.

But the government has some surprisingly grassroots-seeming tactics of its own, such as its ability to rouse patriots to comment on the internet in support of the party (mostly in Chinese, but also in other languages). The use of paid government commenters is also an open secret. These paid internet posters are derogatorily called 五毛 (‘wǔmáo’), meaning ‘five mao’ (a unit of currency), because that is supposedly the going rate for one internet post (¥0.5 is about US$0.07).

Ultimately, though, China is also willing to throw the switch completely. Much as Egypt did in the wake of protests in early 2011, China took all of Xinjiang offline for ten months in the wake of the 2009 Ürümqi riots.

I’m sure you can imagine that in this atmosphere it’s impossible to take others at face value unless you are very close with them. Very often people will claim apathy or ignorance when asked uncomfortable questions, or echo the official line even if they roll their eyes while doing so. Contrary opinions are not shared easily, and paranoia is pervasive.

There is much I haven’t even touched on, some of which has been discussed at length by others (such as the Great Firewall blocking Chinese citizens’ access to many foreign websites). Instead of repeating myself (or others), I’ll conclude this introduction to the situation here. Shortly I will follow up with another post containing a series of anecdotes which touch on this self-censorship and paranoia.

 

The Slender Man, Legends and Cultural Anxieties

Most leading scholars who study the social, political, and cultural ramifications of surveillance technology now call surveillance ubiquitous. My own focus has been on how surveillance is understood by everyday people living everyday lives.

I do this through the lens of Folklore, the study of everyday life, or the study of the Folk (the lay-person). This is obviously problematic, as such a term equates everyday life with peasantry. So for the remainder of this post I will use the term vernacular performance (i.e., everyday performance).

I’ve written about this work in the past. One of the ways that we demonstrate our cultural anxieties and fears is through the collective performance of legend cycles. In this case, I am speaking about the boogieman of the Internet—the Slender Man.


What is a legend?

Legends are repetitive and variant: people tell them over and over again, and as a legend is told and spread it changes form while keeping a central theme. Legends are a performance between storyteller and audience: a teller typically recounts a story to a listener or audience, and this includes digital legends. Finally, legends are constructed not by the teller but by the community. The interaction between the storyteller and the audience constructs the story and allows it to spread. It is a collective process.

The Slender Man is a creature born from the performative interactions of a group of users on the forum Something Awful. The Slender Man is a tall, monstrous figure, one that resembles a tall man in a black suit. He has no face and extraordinarily long arms. He is sometimes depicted with many moving tentacles. All of this, and his many disproportions, give him a Lovecraftian appearance: an eldritch monstrosity.

Cultural Monsters

As Tina Marie Boyer (2013) asserts in terms of the Slender Man, “a monster is a cultural construct” (246). As such, understanding the ‘anatomy’ of a monster sheds light on the problems people face in their day-to-day existence.


What is the anatomy of the Slender Man? I decided to do some ‘fieldwork’—exploring many of the blogs/vlogs that contributed to its legendary constitution. I found three major themes: Surveillance, Social Control, and Secret Agencies. This returns us to the topic of this blog post: The Slender Man is a vernacular performance that demonstrates our collective anxieties of a culture that is under the constant gaze of massive and complicated networks of surveillance.

Surveillance

The Slender Man is known to watch its prey. It is rarely confrontational, though it seems to relish making its presence known. One scene that sticks out to me is from the YouTube series Marble Hornets: the main protagonist, after becoming increasingly paranoid about the faceless man in a business suit following him, began to leave his camera running while he slept—only to discover that the Slender Man watches from a crack in his door while he sleeps. The Slender Man watches, seemingly from everywhere—but even when it is seen, the Slender Man has no eyes to watch from. It is as if it sees everything from nowhere. The Slender Man appears and vanishes, seemingly at will, haunting victims with little to no motive. The Slender Man represents the phenomenon of ubiquitous surveillance in the virtual world. A world where anonymity and pseudonymity are quickly disappearing. A world where only the experts understand what to surveil and how to read the data such surveillance produces. And a world haunted by faceless watchers.

Social Control


The Slender Man also represents themes of social control. The most obvious instance of this is the ‘proxies’, otherwise known as the ‘hallowed’: people who have been overcome by the Slender Man’s will. In many instances, the Slender Man legend ends with the main protagonists going mad and disappearing. They are either killed by the Slender Man (or its minions), disappear from time and space and sometimes memory, or are turned into proxies, meaning they lose their minds and begin to do the bidding of the Slender Man. In the blog ‘Lost Within the Green Sky’, the main protagonist Danny describes it as a form of indoctrination that slowly drains the will from its victims. Even a proxy, once its usefulness dries up, is often killed. This theme is not surprising, as it emerges from a cultural context known for its pervasive ability to control through silent software mediators.

Secret Agencies

The Slender Man is also known as The Operator (signified by a circle with an X through it). This name, along with the black suit it wears, makes the Slender Man a clear reference to secret agents: the organizations that haunt the Internet, forcing those who wish to remain anonymous into the depths of Tor browsers and VPN applications. The Slender Man is representative of the NSA, FBI, CIA, CSIS, KGB, and other notorious spy agencies operating with little oversight and behind a secretive veil. They are just as faceless as the Slender Man. And just as cryptic. Few understand the significance of their presence. And those who come under their haunting gaze have quite a lot to fear.

More Research


Folklore is a small branch of the social sciences. There are few people who work beneath its flag, and fewer still who study contemporary, digital folklore. However, this does not diminish its importance. Folklore offers us a lens to peer into how everyday people interpret the world through vernacular expression. It is an essential dimension of the surveillance studies canon. An understanding of how people interpret surveillance is essential if we are ever going to take action to educate people about its dangers.

‘Software Mediated Intimacy’ in Tinder: have we ever been in love?

Tinder has become an almost ubiquitous dating app, one that has stirred up quite a lot of controversy even as it sits on most of our mobile phones.

While users swipe left and right searching for people they might be interested in, questions of legitimacy and authenticity in love, intimacy and dating arise. I was motivated to write this blogpost after a debate with a great friend of mine about the authenticity of love in Tinder.

Questions arose for me: If love is a social/cultural construct, is it authentic? What does Tinder do to change how love manifests itself? What has changed that we can’t see right away?

These tensions hang about within an atmosphere of heavy technical mediation that silently organizes, classifies, and structures how users interact with each other.

In an article for Fast Company, Austin Carr had the wonderful opportunity to interview Tinder CEO Sean Rad about the algorithm that measures user desirability and matches users to their equivalents: the ‘Elo score’ program.

Tinder users do not have access to their desirability rating; however, this rating does predetermine who gets to see whom. It brings together a large and heterogeneous set of data on users in order to measure their desirability. Users only have access to viewing profiles that share a similar score.
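Because the real algorithm is a trade secret, all we can do is sketch how an Elo-style desirability score could work in principle. The sketch below uses the standard chess Elo formula, treating a right-swipe on a profile as a “win” for that profile against the swiper’s rating; the constants, the “similar score” visibility window, and every function name here are my assumptions, not Tinder’s actual code.

```python
# Speculative Elo-style desirability model (standard chess constants,
# not Tinder's): a right-swipe counts as a win for the swiped profile.

K = 32  # sensitivity of each rating update

def expected(rating_a, rating_b):
    """Probability that A 'wins' (is swiped right on) against B under Elo."""
    return 1 / (1 + 10 ** ((rating_b - rating_a) / 400))

def update(swiped, swiper, right_swipe):
    """Return the swiped profile's new rating after one swipe."""
    outcome = 1.0 if right_swipe else 0.0
    return swiped + K * (outcome - expected(swiped, swiper))

def can_see(rating_a, rating_b, window=200):
    """Users only see profiles with a similar score (window is assumed)."""
    return abs(rating_a - rating_b) <= window
```

Even this toy version shows the sociological point: a right-swipe from a highly rated user moves your score more than one from a low-rated user, and the visibility window quietly sorts who is ever shown to whom.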

Algorithms are active mediators in the construction of networks of intimacy. I would like to call this ‘software mediated intimacy’. Before we explore this concept we need to understand the basics and histories of two concepts: actor-networks and intimacy.

On actors, networks and the silent power of algorithms

Actor-networks are a set of conceptual tools originally devised and made popular by science and technology studies scholars Latour, Callon, and Law.

The most fundamental distinction of this school of thought, sometimes referred to as actor-network theory (ANT), is the principle of generalized symmetry. This concept holds that humans and nonhumans (atoms, systems, computers, and codes) are equal in their ability to shape the intentions of actors in networks of associations.

An actor, which can be human and/or nonhuman, is made to act.

This act is always done in relation to other acts.

So actors, together acting, create actor-networks. Humongous, unpredictable webs of associations.

This model of understanding society—through complex, overlapping networks of humans and nonhumans—is very useful for understanding how actors shape each other over social media platforms.

There are countless nonhumans working in sprawling actor-networks that shape, structure, and mediate how we interact with others online. Most of this remains hidden within our electronic devices. Strings of code and wires affect us in ways that are sometimes more meaningful than our interactions with humans.

Some ANT scholars call this the black box: a vast actor-network becomes so stable that it is seen to act as one, and it becomes a node in other, more expansive actor-networks. Think about how many black boxes exist in the many components of your mobile devices.

I would go as far as to say that most Tinder users are unaware of how much their romances are mediated by algorithms. These algorithms are typically trade secrets and the musings of science and technology scholars—not quite specters in the imaginations of users.

On the social construction of love and intimacy

Love is not universal; it is a complex set of changing associations and psycho-socio-technical feelings tied to space, time, and historical circumstance.

Anthony Giddens, in his book The Transformation of Intimacy, demonstrates the social and cultural influences that actively construct how we understand, perceive, and pursue love and intimacy.

In the pre-modern past, relationships were forged through arranged marriages that were mainly tied to economic priorities. This is something that Foucault, in his work The History of Sexuality, called the deployment of alliance—in other words, the creation of social bonds and kinship alliances that shaped the distribution of wealth and ensured group and individual survival.

In a complicated move that I am unable to detail here, the deployment of sexuality emerged and changed the very nature of the family. It also led to the love and intimacy that we know today—a social bond that is not necessarily tied to the family unit.

Giddens follows: as Western societies began to experience the characteristics of modernity, intimacy and love began to change alongside them. Romance emerged when the family institution of premodernity collapsed.

Why am I talking about this? I discuss the work of Giddens and Foucault in order to illustrate that love and intimacy are constantly shifting social and cultural constructs. Every generation experiences love and intimacy differently.

This is probably the reason why your parents scowl about your one night stands and the fact that you haven’t given them grandchildren yet.

Love and intimacy is going through a massive shift right now.

‘Software mediated intimacy’—algorithmic love

In his essay A Collective of Humans and Nonhumans, Latour speaks about our progression into modernity as a deepening intimacy between humans and nonhumans: an integration between humans and their technology, a folding of all sorts of intentions and agencies.

In this case, as humans become further enveloped in their associations with mobile devices and ubiquitous social media, most of our interactions become mediated through strings of code, complex algorithms, and digital hardware.

In the case of Tinder—the algorithms are mediating and structuring how we meet and communicate with people who may become love interests. Of course, these algorithms don’t exactly control such love. Human actors enter into associations with the Tinder actor-network with all kinds of intentions and expectations.

Some actors want a fling, a one night stand. Others might want a long-term relationship. Others may even just want attention or friendship.

So even though Tinder is sorting who gets to talk to whom, heterogeneous and constantly shifting clusters of intentions are continually shaping how Tinder-based relationships will turn out.

And it’s not perfect. Love and intimacy falter and wilt just as much as they blossom.

But that’s not my point. ‘Software mediated intimacy’, like all the forms before it, is another form of socially and culturally curated love and intimacy. So it’s not authenticity that is the primary question here. All forms of love emerge from some degree of construction or performance. There is no universal standard.

However, what is different and what is often not seen is the degree to which love can be engineered. Computer scientists and engineers, at the beckoning of large (or small) corporations, embed particular logics and intentions into the algorithms they construct.

As Dodge and Kitchin remind us, in their book Code/Space, this is not a perfect product of social control—but a constant problem to be shaped. Even so, it is disconcerting that so many users are being shaped by silent human and nonhuman mediators.

Tinder’s algorithm and the ‘Elo score’ are a trade secret. Not opened up to the public. Or the academy. So I am left scrambling for an answer that only raises more questions.

‘Software mediated intimacy’ can offer us a new and novel way to construct relationships, whether they are temporary or sustained. It is a form of social interaction that is just as authentic as prior forms of love and intimacy. However, the codes and algorithms and the complexities of computer science and statistics make it overwhelmingly easy for corporations to shape users based on private interests.

We must learn how these processes work and how they might be democratized so that the user may benefit as much as the capitalist who produces and hoards such software. We must embrace ‘software mediated intimacy’, in order to learn about it and master how it works, so that we might sidestep the potential and precarious exploitation seeping through our engineered social bonds.

Note: This blog post will be followed by a more in-depth analysis of this understanding of Intimacy. I am seeking to expand this into a larger, theoretical project. Any comments, criticism, or thoughts would be incredibly useful.

Visibility and Exposure at the Margins

I had a recent run-in with the public eye because of an op-ed article I wrote for the Queen’s Journal, a controversial exploration of the limits of free speech in relation to the ability of the (in)famous Queen’s Alive (anti-abortion) group to table and canvass for supporters on campus.

This led to a public debate and lots of discussion. It also led to a ton of mudslinging and attempts at public smearing.

I had also experienced this in the past, when another advocate for queer rights and I filed a human rights complaint against a magazine for publishing an unsavory article illustrating a scathing hatred of queer folk (they referred to us as evil and pagan). I was waist-deep in understandably complex, multidimensional, and very contested discourse. These discourses led to unpleasant hate messages and full exposure in the provincial (and to some degree, international) media.

That is not the topic of this blog post. What I would like to discuss isn’t the status of free speech or the unpalatable existence of hate speech. Rather I am interested in the intense visibility that activists (and others) are exposed to through unpredictable new media interactions. These interactions are typically escalated and amplified by the Internet. This is a dimension of contemporary surveillance not conventionally covered by many academics. It is the subjects of surveillance that I would like to explore.


Digital Folklore: A mess of mass culture or valuable cultural artifacts?


I find myself constantly confronted by the mentality that artifacts of popular culture are inferior to the strong pillars of Western intellectual culture (Shakespeare and Gatsby). I’ve encountered this from peers and mentors, in classwork and coffee shop discussions. In this blog post, I’m going to challenge the common assumption that positions popular culture as something less than intellectually stimulating—or worse yet, mere entertainment. I am not trying to say that other forms of art and creation (“high culture”) are bad; I quite enjoy Shakespeare and Gatsby. But I also love Star Wars, Rick and Morty, and The Hunger Games. The Internet, and other developments in digital technology, have allowed for the proliferation of popular culture. The Internet and computer software have provided affordable mediums and methods for all kinds of people to create “things”. All kinds of “things”! Memes, amateur YouTube videos, blogs, creepypasta (amateur scary stories), and enormous catalogues of emotional responses in the form of animated GIFs.


This is folklore: the study of the culture of the everyday life of everyday people. Lynne McNeill, a folklorist, recently gave a TEDx talk on digital folklore and new media. She heralded the Internet as the perfect archive of everyday life.

McNeill observes,
“Folklorists, unlike literature scholars, or art historians, or music scholars, we don’t look to the productions of the rare geniuses of humankind as the only cultural products worth paying attention to. We look to other kinds of cultural productions, productions that I think make the state of our digital lives seem a little less dire… The problem is with the assumption that the collective works of Shakespeare is the only valid cultural output…”

Through studying, interpreting and understanding folklore, or the stuff and knowledge of everyday life, we get a pretty good illustration of how people interpret and understand the world around them. This is important for all kinds of reasons.

Brenda Brasher (1996), in her work titled ‘Thoughts on the Status of the Cyborg: On Technological Socialization and its link to the religious function of popular culture’, observed that people are shifting from using religion to generate an understanding of ethics in everyday life to using popular culture. In this sense, more and more people interpret ethics through the Jedi philosophy of the Force rather than through the Bible. People construct complicated pastiches (or collages) of raw pop-cultural data to build their belief systems: snippets of ethics and norms taken from Hollywood blockbusters, 4chan, YouTube series, and an ungodly number of video games.

To ignore popular culture is to ignore this massive shift in how people understand the world around them. A great example of the power of vernacular popular culture and folklore is video games. To be frank, though there are amazing and powerful pillars of literature, I find myself struck with an overwhelming sense of catharsis when I play through a well-constructed video game. I’ve had oodles of discussions with friends who are willing to bracket off video games as an intellectual waste of time. However, such cultural artifacts are important to the aspiring digital folklorist precisely because so many people play them. Furthermore, so many people code them as well: gamer culture spans a myriad of professional and independent games.

Just to demonstrate this, here are some statistics presented by the Entertainment Software Association at E3 (a big gaming conference) in 2015: 42% of Americans play video games regularly (at least three hours a week); the average gamer is over 35 (so video games cross generations); gamers consume more games than they do TV and movies; and American consumers spent a grand total of 22.41 billion dollars on games in 2015. Video games are big! Lots of people use them, identify with them, and generate cultural groups around them. This is an eye-opener for a folklorist; it should certainly be an eye-opener for other social scientists.

Trevor Blank, a digital folklorist, observed in his introduction to digital folklore that “It bears noting that the fear of cultural displacement via mass culture is nothing new” (3). He demonstrates that following the innovation of each new form of media, cultural pundits criticized the emerging technology as destroying traditions and communication. They accused technological innovations of destroying the folk. However, another way of framing the changes brought about by these new forms of media is that they entailed new forms of folk. A problem with framing the media in overtly dystopic ways is that you create a technological determinism that takes agency (choice) from those who participate in everyday life. These critics actually ignore the “folk” (and their practices) in their criticism. The vernacular has not disappeared into the heterogeneous mess of “mass” culture—it has changed form.

Blank explains, “New media technology has become so ubiquitous and integrated into users’ communication practices that it is now a viable instrument and conduit of folkloric transmission…” (4).

Folklore is the study of everyday life. The digital has become a realm of everyday life. Cyberspace is just as important as actual space to the emerging generations of humans in consumer societies. Rich and poor, men and women (and trans* and queer folks), people of all races and ethnicities use the Internet in their everyday lives for all kinds of reasons. Popular culture in this context provides us with valuable new social contexts to study—new gateways into understanding human culture, society, and communication.



REFERENCES:

Blank, Trevor. 2012. “Introduction.” Pp. 1-24 in Folk Culture in the Digital Age: The Emergent Dynamics of Human Interaction, edited by Trevor Blank. Logan: Utah State University Press.

Brasher, Brenda. 1996. “Thoughts on the Status of the Cyborg: On Technological Socialization and Its Link to the Religious Function of Popular Culture.” Journal of the American Academy of Religion 64(4). Retrieved December 28, 2015 (http://www.jstor.org/stable/1465623?seq=1#page_scan_tab_contents).

McNeill, Lynne. 2015. “Folklore doesn’t meme what you think it memes.” YouTube Website. Retrieved December 28, 2015 (https://www.youtube.com/watch?v=PBDJ2UJpKt4&feature=youtu.be).

Social Media: Moving beyond the Luddite trope

Social media is neither good nor bad, though this doesn’t mean it’s necessarily neutral: it certainly has the potential both to exploit and to empower. Nicole Costa’s rendition of her experiences and tribulations with Facebook in her recent article My online obsessions: How social media can be a harmful form of communication was incredibly touching. Her refusal and resistance to appearing on and contributing to the Facebook community is empowering. However, I believe it is also misleading. Social media and digital exchange and interaction are here to stay (save for some cataclysmic event that knocks out the electrical infrastructure), and because of this I believe that we need to learn how to engage with them productively and ethically. We need to engage with social media in a way that doesn’t jump straight into a moralizing agenda—by which I mean painting social media either as the savior of humanity or as a dystopian wasteland where communication collapses into self-absorbed decadence.

How do we maneuver this politically charged, landmine-addled cyberspace? First we need to recognize that billions of people use social media (of all sorts) for many reasons. However, this is far too broad; let’s focus on Facebook. Facebook is among the most popular of social media, with over 1.5 billion users and growing. It is built into the very infrastructure of communication in the Western world. If you have a mobile phone, you very likely have Facebook. You might even use Facebook’s messenger service more than your text messaging. Facebook allows us to share information, build social movements, and rally people together in all sorts of grassroots wonders. As an activist, I’ve used Facebook to run successful campaigns. Why? Everyone uses it, and because of this, it has the power (if used correctly) to amplify your voice. Facebook, and most social media, can be very empowering.

But hold your horses! Facebook is still terrifyingly exploitative. Their access to your personal and meta data is unprecedented. Furthermore, they actively use the data that you give them to haul in billions of dollars. Issues of big data and capitalism are finally coming to the forefront of academic and popular discussion, but the nature of such complicated structures is still shrouded in obscurity. The user sees the interface on their computer monitor. But Facebook sees electronic data points that represent every aspect of its users in aggregate. Through elaborate surveillance techniques, these data points are collected, organized, stored, and traded on an opaque big data marketplace. Furthermore, the user is not paid for their (large) contribution to the product being sold. They are exploited for their data and their labour—as everything you do on Facebook is part of the data that is commodified and sold.

At the same time Facebook (and other prominent social media platforms) allow for an unprecedented freedom and speed of communication. They have been embedded into our everyday ways of socializing with each other. New social media have become an invaluable and ubiquitous social resource that we engage in from the time we wake to the time we sleep. It has been used to organize events, rallies and protests. It is used to keep in touch with distant family and friends.  It is used for romance, hatred, companionship, and debate. Facebook is playful and empowering.

So if you are like me, then you may be absolutely confounded about how to resolve the tension between Facebook (and other social media) being at once exploitative and empowering. We have gone too far down the rabbit hole of social media and digital communication to merely refuse to use it. It is now an intimate part of our social infrastructure. Those who resist through refusal may find themselves at multiple disadvantages in how they engage with the world. My own ethnographic research into why users refused Facebook showed that those who abandoned it may have felt empowered by overcoming the “addiction” of social media; however, they also felt excluded and alone. And it must be noted that almost everyone I talked to who had quit Facebook is now using it again. So clearly, refusal to use these services is not enough to meaningfully challenge the problems of social media.

The Luddites were historically textile workers who opposed the invasion of machines into their workplace—machines that they figured would gouge away at their wages. Today, the term is used for those who refuse to use certain technologies. In the realm of social media, a Luddite resistance has proved to be incredibly ineffective. It is also important to note that this sort of refusal obscures ways of meaningfully resisting mass surveillance and the exploitation of user data.

I propose the complete opposite. I propose the path of knowledge. We need to learn how to maneuver through social media and the Internet in ways that allow us access to anonymity—ways of asserting our right to anonymity. This is critical. We need to mobilize, teach, and learn through workshops. We need to scour the Internet for free resources on the technical workings of social media. We need to spread awareness of the double-edged nature of social media. It is no use to take a stance of refusal, to ignore the importance of social media, and thus remain ignorant of how it all works. When we do this, we actually empower these large capitalist corporations to exploit us that much more. The less we know about the calculus of social media and how it works at the level of algorithm, code, and protocol, the better able the capitalists are to disguise and hide exploitation.

Science Fiction, Mixed Media, and Surveillance

For those of us who have been reading science fiction for some time now—it becomes clear that SF has a strange propensity for becoming prophetic. Many of the themes in science fiction classics are now used as overarching metaphors in mainstream surveillance discourse. Most notable among these are Orwell’s Big Brother, Huxley’s Brave New World, and Kafka’s The Trial. Other common touchstones are Minority Report, Ender’s Game, and Gattaca.

Though I am not trying to claim that these classics aren’t good pieces of SF literature, they may not do a superb job of covering the issues implicit in contemporary surveillance. Imagine George Orwell coming to the realization that the Internet is one humongous surveillance machine with the power of mass, dragnet surveillance. Or imagine Huxley’s reaction to the lulling of consumer affect through branding and advertisement. The power of surveillance tools to control and shape large populations has become a prominent and dangerous feature of the 21st century.

As Richard Hoggart says,

“Things can never quite be the same after we have read—really read—a really good book.”

So let’s stop recycling old metaphors (if I read another surveillance book that references Big Brother or the Panopticon I’m going to switch fields). Let’s look at the work of our own generation of writers and storytellers. What I think we might find is a rich stock of knowledge and cultural data that could shed light on our (post)human relationship with advanced technology.

The reason why I am using mixed media, as opposed to focusing on a singular medium, is that I believe our relationship with media is not limited to one or the other. Novels, movies, video games, graphic novels, and YouTube videos all offer us something in terms of storytelling: part entertainment, part catharsis, premised on and constructed through engagement with the story. Our generation of storytelling has shifted into the realm of mixed media engagement. What follows are some stories that I think are critically important to understanding the human condition in our own generational context.

P.S. They are in no particular order.

Disclaimer: Though I tried to be careful not to give away any critical plot or character points, watch out for spoilers:

SOMA 


SOMA is a survival horror video game released by Frictional Games, the developers of Amnesia (another terrifying game). It is a 2015 science fiction story that both frightens you and imparts an existential crisis as you struggle to find “human” meaning in the fusion of life and machine. After participating in a neurological experiment, the main protagonist, Simon Jarrett, wakes up in an abandoned underwater facility called PATHOS-II. Instead of people, Jarrett finds himself trapped with the company of both malicious and benevolent robots—some of whom believe they are human. The interesting overlap with surveillance here is the focus on neurological surveillance. Scientists (in and out of the game) transform the biological brain into a series of data points that represent the original. From this, scientists hope to predict or instill behavior—or, in the case of this game, transform human into machine. This is done by literally uploading the data points of the brain in aggregate to a computer. The game instills a constant question: is there any difference between human consciousness and a copy of human consciousness? SOMA is more than just a scary game—it is a philosophical treatise on the post-human illustrated through an interactive story.

Ready Player One


Ready Player One is a novel written by Ernest Cline which covers a wide breadth of themes, notably the uneasy relationship between surveillance and anonymity, visibility and hiding. Cline constructs a world that doesn’t seem very far off from our own—a world where people begin to embrace simulation through virtual reality (VR) as environmental disaster plagues the actual world. People hide in the sublime. The VR game OASIS, a world of many worlds, is the home of many clever pop culture references—mostly music, video games, and movies, with an extra emphasis on science fiction. Embedded in this world of worlds are several “Easter Eggs” (surprises hidden in videogames) that act as a treasure trail to the late OASIS founder’s fortune and ultimate control over the virtual world. Anonymity is the norm of OASIS—a utopian world where the original, democratic ideal of the Internet is realized. A place where anyone can be anybody—without reference to their actual identity. However, this world is jeopardized as the corporation Innovative Online Industries is also searching for the Easter Eggs, hoping to take over OASIS and remake it to generate capital. The theme of anonymity vs. mass surveillance for profit is arguably a major fuel for global debate, as all “places” of the Internet are surveilled in increasingly invasive ways. Anonymity has almost disappeared from the Internet, replaced with quasi-public profiles (Facebook and Google+) that exist to make billions of dollars off of people’s identities and user-generated content. The original dream of the Internet, sadly, has failed.

Nexus


Nexus is a science fiction novel written by Ramez Naam, following characters entangled with a new type of “nano-drug” that restructures the human brain so that people can connect mind to mind. There are those who support the drug and those who are against it, and this conflict is followed by a slurry of espionage that exposes the characters to incredible dangers. The theme of surveillance in Nexus follows a new fixation on neuroscience: the ability to surveil the very essential, bio-chemical features of the human mind, as well as exposing mind and memory to others participating in this new psychedelic (psychosocial) drug. This is a level of exposure that far supersedes our experiences with the Internet and social media. Imagine being hardwired into a computer network. The book also follows traditional surveillance themes as the main character, Kaden Lane, becomes entangled in the conflict between private corporations and state government.

The Circle

Social media in the 21st century has positioned Western society within the context of visibility and exposure. Most people are simultaneously engaged in self-exposure and participatory surveillance—as we post content about our lives and browse and read content about the lives of our friends and family. The Circle by Dave Eggers works this theme through a character named Mae Holland, who has just been hired by the world’s largest IT company, located in a place called the Circle. The Circle is a place, much like a university campus, with literally everything on it. This place borders on utopia—a place where work and play blend. However, following the mantra “All that happens must be known,” social media penetrates the lives of those who exist in the Circle in pervasive and exposing ways. Very quickly, the utopic illusion slips away into dystopia.

Slenderman


Slenderman was, in its bare skeleton form, introduced to the Internet by Eric Knudsen on the (in)famous Something Awful forum board for a paranormal photo editing contest. However, within a year, Slenderman was sucked into a collective narrative construction across all media platforms. People blogged about it, tweeted about it, YouTubed about it. A massive and ever-changing (and unstable) urban legend (or fakelore) was constructed in the chaos of cyberspace. Slenderman, the paranormal creature, can be described as a tall man with unnaturally long arms and legs (and sometimes tentacles), wearing a black suit, with no face. It is usually depicted as a creature who watches—in other words, surveils. It watches from obscure areas, slowly driving its victim to paranoia and insanity. Then the victim disappears, without a trace. Slenderman is the contemporary boogeyman. But it also shares a narrative with dangerous, obscure, and mysterious secret police and intelligence agencies. As Snowden revealed to the public, governments, through mass surveillance techniques, watch everyone and everything. Could the Slenderman narrative be telling of a deep-seated cultural fear of government surveillance in the 21st century? There are many ways to tap into this story—blogs, Tumblr accounts, and Twitter accounts, but also YouTube series like Marble Hornets, EverymanHYBRID, and Tribe Twelve. Also check out the genre called creepypasta for an extra home-brewed thrill.

 

YouTube Red: Google and the double exploitation


It was recently announced that YouTube, owned and operated by Google, is planning on releasing a paid subscription service. This would entail prioritizing services for those who are able to afford them and creating exclusive content for those who are willing to pay. This is all kinds of messed up—but the most nefarious aspect is that they are already making money off of you. Google uses you much like an employee (though unpaid). All of the content you generate, use, or provide “free” to Google, they organize and trade through complicated surveillance systems to swing a profit off of surplus value. This is why services like Facebook, Twitter, and YouTube are free. They are funded by (and make ludicrous profits off of) your personal information.

Reading about YouTube Red prompted me to explore some of Google’s Privacy Policy to understand how Google uses our information to generate a profit. I’d like to note that Google owns a whole lot of the Internet applications we tend to use in our everyday lives; YouTube is only one of these applications, though a really important one. These policies are attached to a good many things we do on the Internet. Though the policies are presented in a way that paints Google as a benevolent partner in your access to good services and relevant advertisements, the truth is that the company profits greatly off the information you provide it. This may seem very obvious—but I think we need to recognize that this definitely changes the face of Google’s intentions. They effectively disguise any exploitative functions of their information use through flowery language. An illustrative example of this is how they cleverly change ‘trading’ information to ‘sharing’ information. The use of the word ‘sharing’ implies that information is given as a ‘gift’, but it also evokes good feels about Google’s intentions.

An interesting power we grant Google through the Terms of Use is that they have agency over the use of the content we upload, despite saying that we retain ownership of such content. The policy reads:

“When you upload, submit, store, send or receive content to or through our Services, you give Google (and those we work with) a worldwide license to use, host, store, reproduce, modify, create derivative works (such as those resulting from translations, adaptations or other changes we make so that your content works better with our Services), communicate, publish, publicly perform, publicly display and distribute such content.”

They use complicated and automated means of surveillance to collect, organize, and monetize your data. They are also free to make use of your user-generated content—things you created with your time and effort—though you are not paid for this. Regardless of how you understand your relationship with Google, you should understand that the relationship is framed within a capitalist system. You are a Google piggy bank.

The concept of the cyber prosumer is discussed by many political economists and surveillance theorists. Cohen (2008) introduces the concept of prosumer into her work on surveillance and Facebook. This concept can be used for any Web 2.0 social media application (Facebook, Twitter, Tumblr, etc.). It is most certainly a part of Google’s political economic structure. Cohen observes, “Business models based on a notion of the consumer as producer have allowed Web 2.0 applications to capitalize on the time spent participating in communicative activity and information sharing” (7). To call a social media user a prosumer is to say that they both produce and consume simultaneously while using Google services. They produce the user-generated content that is then sold to advertisers and used to target advertisements back at the prosumer.

In the process of Google capitalizing off this user-generated content, the prosumer is involved in ‘immaterial labour’. This is a concept devised by Lazzarato (1996) to talk about the informational and cultural aspects of labour exploitation. Though the Internet looked far different in the 90s, his concept has become even more valuable with the advent of social media. Lazzarato (1996) elaborates that immaterial labour is “the labour that produces the informational and cultural content of the commodity” (1). He breaks this concept down into two components: informational content and cultural content (ibid 1). Informational content refers to the shift from physical labour to labour organized by computer and digital technology (ibid 1). Cultural content refers to the production of creative and artistic artifacts that were never (and still aren’t) considered in the realm of labour (ibid 1).

This concept is incredibly useful for understanding the role of social media in capitalism—as immaterial labour, often experienced as the realm of fun and the social, becomes the unrecognized exploitation of users as corporations utilize their creative potential for capital gain. Bauman and Lyon (2013) express, “The arousing of desires—is thereby written out of the marketing budget and transferred on to the shoulders of prospective consumers” (125). It should be noted, though, that some consider this immaterial labour a fair trade-off for free use of Google’s services.

The troublesome part of all of this is that if Google begins to charge subscription fees for better (preferred) services, it will take on a doubling effect of exploitation. First, the prosumer engages in immaterial labour through the creation of user-generated content, which Google consolidates to produce surplus value, thus generating profit. And then, the prosumer is charged a subscription fee for use. In terms of labour, you will essentially have to pay to provide Google with the fruits of your labour.

What may be even more troubling is that if Google is allowed to succeed with the implementation of YouTube Red, then it will likely provide incentive for other social media sites, such as Facebook, to do similar things. This is a conversation we should not take lightly. Surveillance might have its benefits to society, but when used by social media sites within the capitalist framework, two issues come to mind: exploitation and control. We need to take a critical stance on this or we might slip down the slippery slope of subscription social media.


REFERENCES:

Bauman, Zygmunt and David Lyon. 2013. Liquid Surveillance. Cambridge: Polity.

Cohen, Nicole S. 2008. “The Valorization of Surveillance: Towards a Political Economy of Facebook.” Democratic Communique 22(1):5-22.

Lazzarato, M. 1996.  ‘Immaterial Labour.’ Generation Online. Retrieved November 5, 2015 (http://www.generation-online.org/c/fcimmateriallabour3.htm).

The #poliecon of Social Media and Surveillance: They are watching you watch others.

 

Layered surveillance through mass media. GIF created by Kotutohum. Find their tumblr blog here: http://kotutohum.com/

I suppose I should begin with a (very) brief introduction to the study of political economy (from the novice perspective) and then draw out its many connections to how we exchange and produce (big)data through our use of social media (Facebook, Instagram, Twitter, Tumblr, etc.). As far as the development of poliecon in the social sciences is concerned—we begin with Hegel and Marx/Engels. So prepare your head for a quick review of the history of humanity. Ready? Go!

Hegel developed the philosophical concept of dialectics in history. He idealized history as the production of knowledge (thesis) that was then challenged by another form of knowledge (antithesis) and, through conflict and debate, formed a new body of knowledge (synthesis). Dialectics would continue to cycle like this in a back and forth tension between bodies of knowledge until we reached the pinnacle of all knowledge—the perfect society. The notion of a “perfect society” is very much challenged in our era of academic thought. However, this inspired Karl Marx to (dialectically) approach the development of historical materialist methodology, which featured dialectic thought in a more empirical fashion (the development of these thoughts led to a fissure in academic thought between the idealists (Hegelians) and the materialists (Marxists)).

Karl Marx grounded his research in the development and history of capital (and capitalism). Through his empirical studies he theorized that the mode of production was the foundation of (just about) everything in society. This was the material base from which the superstructure arises. The superstructure is the heterogeneous mass of ideological thought (politics, law, social relations, etc.). It is from the superstructure, which is coordinated by the mode of production (and, some argue, the mode of exchange), that we get our (unstable and constantly changing) understanding of value. Furthermore, if the mode of production were to change (as it certainly has in this case), the superstructure would change, along with the meaning of social relations and formations. It is from this conception of value, as understood by political economy, that I want to spring in order to understand how we exchange (big)data through the use of social media. I will use Facebook as the overarching example because, at this point, we all have an intimate knowledge of Facebook. Also, Facebook owns other social media platforms (such as Instagram). It is certainly the current largest social network site.

In order for the entire architecture (both physically and digitally) of Facebook (and other forms of social media) to exist there needs to be value generated for information (big data). Facebook is a capitalistic enterprise that seeks to generate profit from such information. Because of this, Facebook works to proliferate and expand its user base.  The more Facebook’s user base proliferates, the more data they have to draw from.  I am going to highlight that Facebook achieves all of this through two fundamental forms of surveillance: participatory surveillance and capital surveillance.

First, value must be generated. Value is generated for big data through its production and consumption. Before we can understand how value is created we need to talk about the prosumer. In the context of Facebook, the user produces and consumes the user-generated content and metadata that is then used as big data in aggregate. So essentially, producer and consumer are collapsed into the user prosumer (Cohen 2008:7). Value is generated because the fruits of the prosumer—data through biography, interaction, and Internet usage—are sold to advertisers, who then feed it back into the system as targeted advertisements. According to Fuchs (2012), the prosumer is packaged, commodified, and sold (146).
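To make this aggregation concrete, here is a minimal sketch in Python of how prosumer activity might be consolidated into advertiser-facing interest profiles. The function and data are entirely hypothetical; Facebook's actual systems are proprietary and vastly more complex, but the shape of the exchange is the same: the user supplies the events, the platform keeps the aggregate.

```python
from collections import Counter

# Hypothetical stream of prosumer activity: each event is content the
# user produced (a post, a like, a page view) plus its metadata.
events = [
    {"user": "alice", "action": "like", "topic": "running"},
    {"user": "alice", "action": "post", "topic": "running"},
    {"user": "alice", "action": "view", "topic": "cameras"},
    {"user": "bob",   "action": "like", "topic": "cameras"},
]

def build_ad_profiles(events):
    """Aggregate raw activity into per-user interest profiles --
    the 'data points in aggregate' that get sold to advertisers."""
    profiles = {}
    for e in events:
        counts = profiles.setdefault(e["user"], Counter())
        counts[e["topic"]] += 1
    # Rank topics so advertisers can target the strongest interests first.
    return {user: [t for t, _ in c.most_common()] for user, c in profiles.items()}

print(build_ad_profiles(events))
# -> {'alice': ['running', 'cameras'], 'bob': ['cameras']}
```

Note that nothing in this loop pays the user: the profiles are built entirely from their unpaid activity, which is the point the political economists above are making.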

Fuchs observes,

“Facebook prosumers are double objects of commodification. They are first commodified by corporate platform operators, who sell them to advertising clients, and this results, second, in an intensified exposure to commodity logic. They are permanently exposed to commodity propaganda presented by advertisements while they are online. Most online time is advertisement time” (146).

This is obviously problematic. I think it is also pretty important that we acknowledge that the role of prosumer positions the Facebook user as a free labour commodity. Cohen (2008) asserts, “Web 2.0 models depend on the audience producing the content, thus requiring an approach that can account for the labour involved in the production of 2.0 content, which can be understood as information, social networks, relationships, and affect” (8). In this process of production, Facebook repackages user-generated content and sells the data to generate intense profits (in the billions range). The user prosumer remains unpaid in this exchange. Interestingly enough, in my own qualitative research, those who participated believed that use of Facebook’s services qualified as a fair exchange for their data. An apt thread of thinking that could resolve these problems comes from van Dijck (2012), who observes, “Connectivity is premised on a double logic of empowerment and exploitation” (144). With this noted, I would like to focus on the production, consumption, and monetization of user-generated content.

The content produced and consumed by the user prosumer is organized through two layers of surveillance. The first layer of surveillance is participatory surveillance. Albrechtslund (2008), trying to address the overwhelmingly dystopic metaphors implicit in the discourse and study of surveillance, explains that the use of hierarchical models of surveillance (like Big Brother and the Panopticon) obscures important sociological processes that occur through the mediation of social media (8). Furthermore, it treats users as passive agents, unable to resist the oppressive and repressive forces of Big Brother. He attempts to frame surveillance as a mutual, horizontal process that empowers users through the sharing of information and the creation of elaborate autobiographies. Albrechtslund elaborates that social media offer “new ways of constructing identity, meeting friends and colleagues, as well as socializing with strangers” (8). In this understanding of social media, the subject is not a passive agent under the oppressive gaze of Big Brother but an active subject pursuing empowerment. Furthermore, Albrechtslund frames user-generated content specifically as sharing, not trading. However, in doing this, he ignores that these social media platforms are constructed, shaped, and owned by capitalist corporations seeking profit. This is where the second layer of surveillance becomes important—capital surveillance.

While the user prosumer engages in participatory surveillance—producing and consuming user-generated content that they share with others—the capitalist captures that data and repackages it to be sold to advertisers. They do this through complicated algorithmic software, which then stores the data in a large architecture of computer hardware, optic wires, and servers. The fruits of participatory surveillance are commodified (along with the prosumers themselves) and then traded to produce capital. This layer, the hierarchical and oppressive model of surveillance, organizes and shapes how user prosumers generate content. Thus van Dijck’s concept of the double logic of connectivity is realized. What is problematic here is that much of capital surveillance is rendered opaque or invisible to the user—who only sees the participatory aspects and the advertisements (repackaged user-generated content). Also problematic is that this entire process is automated, though that point will not be taken up in this article.

It is important to note that participatory surveillance is not typically a capitalist endeavour. Cohen writes, “The labour performed on sites like Facebook is not produced by capitalism in any direct, cause and effect fashion… (it is) simply an answer to the economic needs of capital” (17). So where the user prosumer “shares” their production of user-generated content, the capitalist “trades” it. These are two interconnected, though fundamentally different, processes. We, the user prosumers, don’t often recognize the capital forms of surveillance occurring, because we are so intimately involved in the participatory forms of surveillance. This, I believe, is the root of our apathy about the surveillance issues surrounding social media like Facebook. What needs to be devised next is how we can package these theories in a popular form and export them to those who are shaped by these forms of exploitative commodification. It is the work of social scientists to understand, and then to shape, the world around them.

Big brother is watching you watch others. GIF created by Kotutohum. Find their tumblr blog here: http://kotutohum.com/

Post-script:

Another lesson we should take from this is that not all surveillance is evil.  We do not live in an inescapable dystopian society.  To say this, we obscure a lot of actual practices of surveillance that are beneficial.  We also render the notion of resistance as a practice in futility.  Surveillance is a neutral phenomenon that is used for better or worse by a plethora of different corporations, governments, non-governmental organizations, activists, and regular everyday people.  But in saying this, we can’t ignore the potential abuse and exploitation that may come from the use of surveillance practices to increase the flow of Capital.


REFERENCES:

Albrechtslund, Anders. 2008. “Online Social Networking as Participatory Surveillance.” First Monday 13(3). Retrieved Oct 9, 2015 (http://journals.uic.edu/ojs/index.php/fm/article/view/2142/1949).

Cohen, Nicole S. 2008. “The Valorization of Surveillance: Towards a Political Economy of Facebook.” Democratic Communique 22(1):5-22.

van Dijck, José. 2012. “Facebook and the engineering of connectivity: A multi-layered approach to social media platforms.” Convergence: The International Journal of Research into New Media Technologies 19(2):141-155.

Fuchs, Christian. 2012. “The Political Economy of Privacy on Facebook”. Television & New Media 13(2):139-159.