Tag Archives: Capitalism

‘Software Mediated Intimacy’ in Tinder: have we ever been in love?

Tinder has become an almost ubiquitous dating app, one that has attracted quite a lot of controversy even as it sits on most of our mobile phones.

While users swipe left and right searching for people they might be interested in, questions of legitimacy and authenticity in love, intimacy and dating arise. I was motivated to write this blog post after a debate with a great friend of mine about the authenticity of love on Tinder.

Questions arose for me: If love is a social/cultural construct, is it authentic? What does Tinder do to change how love manifests itself? What has changed that we can’t right away see?

These tensions hang about within an atmosphere of heavy technical mediation that silently organizes, classifies, and structures how users interact with each other.

In an article for Fast Company, Austin Carr had the wonderful opportunity to interview Tinder CEO Sean Rad about the algorithm that measures user desirability and matches users to their equivalents: the ‘Elo score’ program.

Tinder users do not have access to their desirability rating; however, this rating does predetermine who gets to see whom. It brings together a large and heterogeneous set of data on users in order to measure their desirability. Users only get to view profiles that share a similar score.
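Tinder’s actual algorithm is a trade secret, but Carr’s description suggests something like a chess-style Elo rating, where a right-swipe from a highly rated user raises your score more than one from a low-rated user. The sketch below is a minimal illustration under that assumption; the function names, the K-factor, and the visibility band are all hypothetical, not Tinder’s.

```python
def expected_score(rating_a, rating_b):
    """Standard Elo expectation: probability that A 'wins'
    (here: gets swiped right on) against B."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

def record_swipe(swiper, swiped, liked, k=32.0):
    """Treat each swipe as a chess-style game. A right-swipe from a
    highly rated swiper raises the swiped user's rating more than one
    from a low-rated swiper. Returns the swiped user's new rating."""
    outcome = 1.0 if liked else 0.0
    return swiped + k * (outcome - expected_score(swiped, swiper))

def visible_to(rating, candidate_ratings, band=100.0):
    """Users only see profiles whose scores fall within a similar band."""
    return [c for c in candidate_ratings if abs(c - rating) <= band]
```

Note how the sorting happens silently: a few hundred swipes are enough to stratify the user base into bands that never see one another, which is the “predetermining” the interview describes.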

Algorithms are active mediators in the construction of networks of intimacy. I would like to call this ‘software mediated intimacy’. Before we explore this concept we need to understand the basics and histories of two concepts: actor-networks and intimacy.

On actors, networks and the silent power of algorithms

Actor-networks are a set of conceptual tools originally devised and made popular by science and technology studies scholars Latour, Callon, and Law.

The most fundamental distinction of this school of thought, sometimes referred to as actor-network theory (ANT), is the principle of generalized symmetry. This concept holds that humans and nonhumans (atoms, systems, computers and code) are equal in their ability to shape the intentions of actors in networks of associations.

An actor, which can be human and/or nonhuman, is made to act.

This act is always done in relation to other acts.

So actors, together acting, create actor-networks. Humongous, unpredictable webs of associations.

This model of understanding society, through complex, overlapping networks of humans and nonhumans, is very useful for understanding how actors shape each other over social media platforms.

There are countless nonhumans working in sprawling actor-networks that shape, structure, and mediate how we interact with others online. Most of this remains hidden within our electronic devices. Strings of code and wires affect us in ways that are sometimes more meaningful than our interactions with humans.

Some ANT scholars call this the black box: a vast actor-network becomes so stable that it is seen to act as one, and it becomes a single node in other, more expansive actor-networks. Think about how many black boxes exist in the many components of your mobile devices.

I would go as far as to say that most Tinder users are unaware of how much their romances are mediated by algorithms. These algorithms are typically trade secrets and the musings of science and technology studies scholars—not quite specters in the imaginations of users.

On the social construction of love and intimacy

Love is not universal; it is a complex set of changing associations and psycho-socio-technical feelings that are tied to space, time and historical circumstance.

Anthony Giddens, in his book The Transformation of Intimacy, demonstrates the social and cultural influences that actively construct how we understand, perceive, and pursue love and intimacy.

In the premodern past, relationships were forged through arranged marriages that were mainly tied to economic priorities. This is something that Foucault, in The History of Sexuality, called the deployment of alliance: the creation of social bonds and kinship alliances that shaped the distribution of wealth and ensured group and individual survival.

In a complicated move that I am unable to detail here, the deployment of sexuality emerged and changed the very nature of the family. It also led to the love and intimacy that we know today—a social bond that is not necessarily tied to the family unit.

Giddens follows: as Western societies began to experience the characteristics of modernity, intimacy and love began to emerge alongside them. Romance emerged as the family institution of premodernity collapsed.

Why am I talking about this? I bring up the work of Giddens and Foucault in order to illustrate that love and intimacy are constantly shifting social and cultural constructs. Every generation experiences love and intimacy differently.

This is probably the reason why your parents scowl about your one night stands and the fact that you haven’t given them grandchildren yet.

Love and intimacy are going through a massive shift right now.

‘Software mediated intimacy’—algorithmic love

In his essay A Collective of Humans and Nonhumans, Latour speaks about our progression into modernity as a deepening intimacy between humans and nonhumans. An integration between humans and their technology. A folding of all sorts of intentions and agencies.

As humans become further enveloped in their associations with mobile devices and ubiquitous social media, most of our interactions become mediated through strings of code, complex algorithms and digital hardware.

In the case of Tinder—the algorithms are mediating and structuring how we meet and communicate with people who may become love interests. Of course, these algorithms don’t exactly control such love. Human actors enter into associations with the Tinder actor-network with all kinds of intentions and expectations.

Some actors want a fling, a one night stand. Others might want a long-term relationship. Others may even just want attention or friendship.

So even though Tinder is sorting who gets to talk to whom, heterogeneous and constantly shifting clusters of intentions are continually shaping how Tinder-based relationships will turn out.

And it’s not perfect. Love and intimacy falter and wilt just as much as they blossom.

But that’s not my point. ‘Software mediated intimacy’, like all of the forms before it, is another socially and culturally curated form of love and intimacy. So authenticity is not the primary question here. All forms of love emerge from some degree of construction or performance. There is no universal standard.

However, what is different, and what is often not seen, is the degree to which love can be engineered. Computer scientists and engineers, at the behest of large (or small) corporations, embed particular logics and intentions into the algorithms they construct.

As Dodge and Kitchin remind us, in their book Code/Space, this is not a perfect product of social control—but a constant problem to be shaped. Even so, it is disconcerting that so many users are being shaped by silent human and nonhuman mediators.

Tinder’s algorithm and the ‘Elo Score’ is a trade secret. Not opened up to the public. Or the academe. So I am left scrambling for an answer that only raises more questions.

‘Software mediated intimacy’ can offer us a new and novel way to construct relationships, whether they are temporary or sustained. It is a form of social interaction that is just as authentic as prior forms of love and intimacy. However, the codes and algorithms and the complexities of computer science and statistics make it overwhelmingly easy for corporations to shape users based on private interests.

We must learn how these processes work and how they might be democratized, so that the user may benefit as much as the capitalist who produces and hoards such software. We must embrace ‘software mediated intimacy’ in order to learn about it and master how it works, so we might sidestep the potential and precarious exploitation seeping through our engineered social bonds.

Note: This blog post will be followed by a more in-depth analysis of this understanding of Intimacy. I am seeking to expand this into a larger, theoretical project. Any comments, criticism, or thoughts would be incredibly useful.

Social Media: Moving beyond the Luddite trope

Social media is neither good nor bad, though this doesn’t mean it’s neutral: it certainly has the potential both to exploit and to empower. Nicole Costa’s rendition of her experiences and tribulations with Facebook in her recent article My online obsessions: How social media can be a harmful form of communication was incredibly touching. Her refusal and resistance to appearing on and contributing to the Facebook community is empowering. However, I believe it is also misleading. Social media and digital exchange and interaction are here to stay (save for some cataclysmic event that knocks out the electrical infrastructure), and because of this I believe we need to learn how to engage with them productively and ethically. We need to engage with social media in a way that doesn’t jump straight into a moralizing agenda. By this I mean framing social media as either the savior of humanity or a dystopian wasteland where people’s communication collapses into self-absorbed decadence.

How do we maneuver through this politically charged, land-mine-addled cyberspace? First we need to recognize that a great number of people (in the billions) use social media of all sorts, for many reasons. But that is far too broad, so let’s focus on Facebook. Facebook is among the most popular social media platforms, with over 1.5 billion users and growing. It is built into the very infrastructure of communication in the Western world. If you have a mobile phone, you very likely have Facebook. You might even use Facebook’s messenger service more than your text messaging. Facebook allows us to share information, build social movements, and rally people together in all sorts of grassroots wonders. As an activist, I’ve used Facebook to run successful campaigns. Why? Everyone uses it, and because of this, it has the power (if used correctly) to amplify your voice. Facebook, and most social media, can be very empowering.

But hold your horses! Facebook is still terrifyingly exploitative. Its access to your personal data and metadata is unprecedented. Furthermore, it actively uses the data you give it to haul in billions of dollars. Issues of big data and capitalism are finally coming to the forefront of academic and popular discussion, but the nature of such complicated structures is still shrouded in obscurity. The user sees the interface on their computer monitor, but Facebook sees electronic data points that represent every aspect of its users in aggregate. Through elaborate surveillance techniques, these data points are collected, organized, stored, and traded on an opaque big data marketplace. Furthermore, the user is not paid for their (large) contribution to the product being sold. They are exploited for their data and their labour, as everything you do on Facebook becomes part of the data that is commodified and sold.

At the same time Facebook (and other prominent social media platforms) allow for an unprecedented freedom and speed of communication. They have been embedded into our everyday ways of socializing with each other. New social media have become an invaluable and ubiquitous social resource that we engage in from the time we wake to the time we sleep. It has been used to organize events, rallies and protests. It is used to keep in touch with distant family and friends.  It is used for romance, hatred, companionship, and debate. Facebook is playful and empowering.

So if you are like me, then you may be absolutely confounded about how to resolve the tension between Facebook (and other social media) being at once exploitative and empowering. We have gone too far down the rabbit hole of social media and digital communication to merely refuse to use it. It is now an intimate part of our social infrastructure. Those who resist through refusal may find themselves at multiple disadvantages in how they engage with the world. My own ethnographic research into why users refused Facebook illustrated that those who abandoned Facebook may have felt empowered by overcoming the “addiction” of social media; however, they also felt excluded and alone. And it must be noted that almost everyone I talked to who had quit Facebook is now using it again. So clearly, refusal to use these services is not enough to meaningfully challenge the problems of social media.

The Luddites were historically textile workers opposed to the invasion of machines into their workplace, machines they figured would gouge away at their wages. Today, the term is used for those who refuse to use certain technologies. In the realm of social media, Luddite resistance has proved incredibly ineffective. It is also important to note that this sort of refusal obscures ways of meaningfully resisting mass surveillance and the exploitation of user data.

I propose the complete opposite: the path of knowledge. We need to learn how to maneuver through social media and the Internet in ways that allow us access to anonymity, ways of asserting our right to anonymity. This is critical. We need to mobilize, teach, and learn through workshops. We need to scour the Internet for free resources on the technical workings of social media. We also need to spread awareness of the double-edged nature of social media. It is no use taking a stance of refusal, ignoring the importance of social media, and thus remaining ignorant of how it all works. When we do this, we actually empower these large capitalist corporations to exploit us that much more. The less we know about the calculus of social media and how it works at the level of algorithm, code and protocol, the easier it is for capitalists to disguise and hide exploitation.

Snowden visits campus via live feed: NSA whistleblower addresses a packed Grant Hall

 

Queen’s International Affairs Association (QIAA) hosted a video conference with Edward Snowden on Thursday in Grant Hall. Photo: Arwin Chan

Originally appeared in the Queen’s Journal on November 13th, 2015.

“I am just a citizen.  I was the mechanism of disclosure. It’s not up to me to say what the future should be — it’s up to you,” NSA whistleblower Edward Snowden told a packed house in Grant Hall.

Snowden — a polarizing figure globally — was invited as the keynote speaker for the Queen’s Model United Nations Invitational (QMUNi), hosted by the Queen’s International Affairs Association (QIAA).

As the talk commenced at 6:30 p.m., Snowden was met with applause.

The buzz surrounding Snowden’s Google Hangout talk on Thursday at Grant Hall started early, as crowds started lining up to enter Grant Hall. The building quickly hit capacity.

Snowden began with a discussion of his motivations for disclosing countless confidential NSA documents. He told the audience that he once believed wholeheartedly that mass surveillance was for the public good.

He came from a “federal family”, he said, with ties to both politics and the military. He said that once he reached the peak of his career in government intelligence — when he received the highest security clearance — he saw the depth of the problem.

After that realization came the release of classified documents to journalists in 2013, his defection from the NSA and his indefinite stay in Russia.

“Progress often begins as an outright challenge to the law. Progress in many cases is illegal,” he said.

However, he has made himself into more than just a whistleblower. Snowden has continued to push for and encourage discussion about mass surveillance.

“Justice has to be seen to be done,” he said.

“I don’t live in Russia, I live on the Internet,” he said at another point during the talk.

When asked about Bill C-51 — the controversial terror bill in Canada — Snowden said “terrorism is often the public justification, but it’s not the actual motivation” for the bill.

He continued to say that if you strip the bill of the word “terrorism”, you can see the extent to which the bill makes fundamental changes that affect civil rights.

Snowden’s talk was intended to encourage discussion about mass surveillance. QIAA had initially contacted Snowden’s lawyer and publishers, who handle his public affairs, and after a long process of back-and-forth negotiations they secured him as a keynote speaker.

Dr. David Lyon, director of the Surveillance Studies Center and author of the recent publication Surveillance After Snowden, acted as the moderator for the talk.

David Lyon, right, mediated Thursday night’s question and answer period with Edward Snowden. (Photo by Arwin Chan)

There were mixed opinions among audience members about Edward Snowden and his mass disclosures of National Security Agency (NSA) intelligence documents to journalists in 2013.

Some students, like Mackenzie Schroeder, Nurs ’17, said Snowden’s actions were gutsy but well-intentioned.

Another guest, Akif Hasni, a PhD student in political studies, said he thought Snowden’s actions were important, despite the problems associated with publishing that information.

Other guests at the event didn’t completely agree with Snowden’s whistleblowing.

“It’s a dangerous thing to tell newspapers about. The thing about guys like Edward Snowden is that no one is going to know if what he did was good, while the action itself may be,” Sam Kary, ArtSci ’15, said.

Kary referred to John Oliver’s Snowden interview, where Oliver highlighted damages to national security caused by careless redacting of leaked documents by The New York Times.

The failure to properly redact leaked documents revealed the name of an NSA agent along with information on how the US government was targeting al-Qaeda operatives in Mosul in 2010.

— With files from Kate Meagher 

YouTube Red: Google and the double exploitation


It was recently announced that YouTube, owned and operated by Google, is planning to release a paid subscription service. This would entail prioritizing services for those who are able to afford them and creating exclusive content for those who are willing to pay. This is all kinds of messed up—but the most nefarious aspect is that they are already making money off of you. Google uses you much like an employee (though an unpaid one). All of the content you generate, use, or provide “free” to Google, they organize and trade through complicated surveillance systems to swing a profit off of surplus value. This is why services like Facebook, Twitter and YouTube are free: they are funded by (and make ludicrous profits off of) your personal information.

Reading about YouTube Red prompted me to explore some of Google’s Privacy Policy to understand how Google uses our information to generate a profit. I’d like to note that Google owns a whole lot of the Internet applications we use in our everyday lives; YouTube is only one of these, though a really important one. These policies are attached to a good many things we do on the Internet. Though the policies are presented to the user in a way that paints Google as a benevolent partner in your access to good services and relevant advertisements, the truth is that the company profits greatly off the information you provide it. This may seem very obvious—but I think we need to recognize that it changes the face of Google’s intentions. They effectively disguise any exploitative uses of information through flowery language. An illustrative example is how they cleverly change ‘trading’ information to ‘sharing’ information. The use of the word ‘sharing’ implies that information is given as a ‘gift’, but it also evokes good feels about Google’s intentions.

An interesting power we grant Google through the Terms of Use is agency over the use of the content we upload, despite the fact that we retain ownership of such content. The policy reads:

“When you upload, submit, store, send or receive content to or through our Services, you give Google (and those we work with) a worldwide license to use, host, store, reproduce, modify, create derivative works (such as those resulting from translations, adaptations or other changes we make so that your content works better with our Services), communicate, publish, publicly perform, publicly display and distribute such content.”

They use complicated and automated means of surveillance to collect, organize, and monetize your data. They are also free to make use of your user-generated content—things you created with your time and effort—though you are not paid for this. Regardless of how you understand your relationship with Google, you should understand that the relationship is framed within a capitalist system. You are a Google piggy bank.

The concept of the cyber prosumer is discussed by many political economists and surveillance theorists. Cohen (2008) introduces the concept of the prosumer in her work on surveillance and Facebook. The concept applies to any Web 2.0 social media application (Facebook, Twitter, Tumblr, etc.), and it is most certainly a part of Google’s political economic structure. Cohen observes, “Business models based on a notion of the consumer as producer have allowed Web 2.0 applications to capitalize on the time spent participating in communicative activity and information sharing” (7). To call a social media user a prosumer is to say that they produce and consume simultaneously while using Google services. They produce the user-generated content that is then sold to advertisers and used to target advertisements back at them.

In the process of Google capitalizing off this user-generated content, the prosumer is involved in ‘immaterial labour’. This is a concept devised by Lazzarato (1996) to talk about the informational and cultural aspects of labour exploitation. Though the Internet looked far different in the 90s, his concept has become even more valuable with the advent of social media. Lazzarato (1996) elaborates that immaterial labour is “the labour that produces the informational and cultural content of the commodity” (1). He breaks this concept down into two components: informational content and cultural content (ibid 1). Informational content refers to the shift from physical labour to labour organized by computer and digital technology (ibid 1). Cultural content refers to the production of creative and artistic artifacts that were never (and still aren’t) considered part of the realm of labour (ibid 1).

This concept is incredibly useful for understanding the role of social media in capitalism, as immaterial labour, often experienced as the realm of fun and sociality, becomes the unrecognized exploitation of users as corporations utilize their creative potential for capital gain. Bauman and Lyon (2013) express, “The arousing of desires—is thereby written out of the marketing budget and transferred on to the shoulders of prospective consumers” (125). Though it is to be noted that this use of immaterial labour can be said to be a fair trade-off for free use of Google’s services.

The troublesome part of all of this is that if they begin to charge subscription fees for better (preferred) services, it will take on a doubling effect of exploitation. First, the prosumer engages in immaterial labour through the creation of user-generated content, which Google consolidates to produce surplus value and thus generate profit. Then, the prosumer is charged a subscription fee for use. In terms of labour, you will essentially have to pay to provide Google with the fruits of your labour.

What may be even more troubling is that if Google is allowed to succeed with the implementation of YouTube Red, it will likely provide incentive for other social media sites, such as Facebook, to do similar things. This is a conversation we should not take lightly. Surveillance might have its benefits to society, but when used by social media sites within a capitalist framework, two issues come to mind: exploitation and control. We need to take a critical stance on this, or we might slip down the slippery slope of subscription social media.


Bauman, Zygmunt and David Lyon. 2013. Liquid Surveillance. Cambridge: Polity.

Cohen, Nicole S. 2008. “The Valorization of Surveillance: Towards a Political Economy of Facebook.” Democratic Communique 22(1):5-22.

Lazzarato, M. 1996.  ‘Immaterial Labour.’ Generation Online. Retrieved November 5, 2015 (http://www.generation-online.org/c/fcimmateriallabour3.htm).

The #poliecon of Social Media and Surveillance: They are watching you watch others.

 

Layered surveillance through mass media. GIF created by Kotutohum. Find their tumblr blog here: http://kotutohum.com/

I suppose I should begin with a (very) brief introduction to the study of political economy (from the novice perspective) and then draw out its many connections to how we exchange and produce (big) data through our use of social media (Facebook, Instagram, Twitter, Tumblr, etc.). As far as the development of poliecon in the social sciences is concerned, we begin with Hegel and Marx/Engels. So prepare your head for a quick review of the history of humanity. Ready? Go!

Hegel developed the philosophical concept of dialectics in history. He idealized history as the production of knowledge (thesis) that was then challenged by another form of knowledge (antithesis); through conflict and debate, a new body of knowledge (synthesis) was formed. Dialectics would continue to cycle like this, in a back and forth tension between bodies of knowledge, until we reached the pinnacle of all knowledge: the perfect society. The notion of a “perfect society” is very well challenged in our era of academic thought. However, this inspired Karl Marx to (dialectically) develop his historical materialist methodology, which featured dialectical thought in a more empirical fashion (the development of these ideas led to a fissure in academic thought between the idealists (Hegelians) and the materialists (Marxists)).

Karl Marx grounded his research in the development and history of capital (and capitalism). Through his empirical studies he theorized that the mode of production was the foundation of (just about) everything in society. This is the material base from which the superstructure arises. The superstructure is the heterogeneous mass of ideological thought (politics, law, social relations, etc.). It is from the superstructure, which is coordinated by the mode of production (and, some argue, the mode of exchange), that we get our (unstable and constantly changing) understanding of value. Furthermore, if the mode of production were to change (as it certainly has in this case), the superstructure would change, along with the meaning of social relations and formations. It is from this conception of value, as understood by political economy, that I want to spring in order to understand how we exchange (big) data through the use of social media. I will use Facebook as the overarching example because, at this point, we all have an intimate knowledge of Facebook. Also, Facebook owns other social media platforms (such as Instagram), and it is certainly the current largest social network site.

In order for the entire architecture (both physical and digital) of Facebook (and other forms of social media) to exist, value needs to be generated from information (big data). Facebook is a capitalist enterprise that seeks to generate profit from such information. Because of this, Facebook works to proliferate and expand its user base. The more the user base proliferates, the more data Facebook has to draw from. I am going to highlight how Facebook achieves all of this through two fundamental forms of surveillance: participatory surveillance and capital surveillance.

First, value must be generated, and value is generated for big data through its production and consumption. Before we can understand how value is created, we need to talk about the prosumer. In the context of Facebook, the user both produces and consumes the user-generated content and metadata that is then used as big data in aggregate. So essentially, producer and consumer are collapsed into the user prosumer (Cohen 2008:7). Value is generated because the fruits of the prosumer’s activity—data drawn from biography, interaction, and Internet usage—are sold to advertisers, who then feed them back into the system as targeted advertisements. According to Fuchs (2012), the prosumer is packaged, commodified and sold (146).

Fuchs observes,

“Facebook prosumers are double objects of commodification. They are first commodified by corporate platform operators, who sell them to advertising clients, and this results, second, in an intensified exposure to commodity logic. They are permanently exposed to commodity propaganda presented by advertisements while they are online. Most online time is advertisement time” (146).

This is obviously problematic. I think it is also pretty important that we acknowledge that the role of prosumer positions the Facebook user as a free labour commodity. Cohen (2008) asserts, “Web 2.0 models depend on the audience producing the content, thus requiring an approach that can account for the labour involved in the production of 2.0 content, which can be understood as information, social networks, relationships, and affect” (8). In this process of production, Facebook repackages user-generated content and sells the data to generate intense profits (in the billions range). The user prosumer remains unpaid in this exchange. Interestingly enough, in my own qualitative research, those who participated believed that use of Facebook’s services qualified as a fair exchange for their data. An apt thread of thinking that could resolve these problems comes from van Dijck (2012), who observes, “Connectivity is premised on a double logic of empowerment and exploitation” (144). With this noted, I would like to focus on the production, consumption and monetization of user-generated content.

The content produced and consumed by the user prosumer is organized through two layers of surveillance. The first layer is participatory surveillance. Albrechtslund (2008), trying to address the overwhelmingly dystopic metaphors implicit in the discourse and study of surveillance, explains that the use of hierarchical models of surveillance (like Big Brother and the panopticon) obscures important sociological processes that occur through the mediation of social media (8). Furthermore, it treats users as passive agents, unable to resist the oppressive and repressive forces of Big Brother. He attempts to frame surveillance as a mutual, horizontal process that empowers users through the sharing of information and the creation of elaborate autobiographies. Social media, Albrechtslund elaborates, offer “new ways of constructing identity, meeting friends and colleagues, as well as socializing with strangers” (8). In this understanding of social media, the subject is not a passive agent under the oppressive gaze of Big Brother, but an active subject pursuing empowerment. Furthermore, Albrechtslund frames user-generated content specifically as sharing, not trading. In doing this, however, he ignores that these social media platforms are constructed, shaped, and owned by capitalist corporations seeking profit. This is where the second layer of surveillance becomes important: capital surveillance.

While the user prosumer engages in participatory surveillance (producing and consuming the user-generated content they share with others), the capitalist captures that data and repackages it to be sold to advertisers. This is done through complicated algorithmic software, which then stores the data in a large architecture of computer hardware, optic cables, and servers. The fruits of participatory surveillance are commodified (along with the prosumers themselves) and traded to produce capital. This second layer, the hierarchical and oppressive model of surveillance, organizes and shapes how user prosumers generate content. Thus van Dijck’s double logic of connectivity is realized. What is problematic here is that much of capital surveillance is rendered opaque or invisible to the user, who sees only the participatory aspects and the advertisements (repackaged user-generated content). Also problematic is that this entire process is automated, though that point will not be taken up in this article.
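The double logic described above can be sketched as a toy model: one act of “sharing” feeds two data stores, one visible to the user (the participatory layer) and one opaque to them (the capital layer). All names and structures here are hypothetical illustrations, not any platform’s actual code.

```python
# Toy model of the "double logic of connectivity": the same act of sharing
# feeds two layers of surveillance at once.
from collections import defaultdict

shared_posts = []                         # layer 1: participatory (visible to users)
advertiser_profiles = defaultdict(list)   # layer 2: capital (opaque to users)

def share(user, content, tags):
    """A user 'shares' content with friends: the participatory layer."""
    post = {"user": user, "content": content, "tags": tags}
    shared_posts.append(post)
    _capture(post)   # capture happens silently, as a side effect of sharing
    return post

def _capture(post):
    """The platform repackages the same data for advertisers: the capital layer."""
    for tag in post["tags"]:
        advertiser_profiles[tag].append(post["user"])

share("alice", "Back from the festival!", ["music", "travel"])
share("bob", "New running shoes", ["fitness"])

# The user only ever sees shared_posts; advertiser_profiles is invisible to them.
print(advertiser_profiles["music"])   # ['alice']
```

The point of the sketch is that the second layer is not a separate activity the user opts into; it is structurally bundled with the act of sharing itself.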

It is important to note that participatory surveillance is not inherently a capitalist endeavour. Cohen writes, “The labour performed on sites like Facebook is not produced by capitalism in any direct, cause and effect fashion… (it is) simply an answer to the economic needs of capital” (17). So where the user prosumer “shares” their user-generated content, the capitalist “trades” it. These are two interconnected, though fundamentally different, processes. We, the user prosumers, often don’t recognize the capital forms of surveillance at work because we are so intimately involved in the participatory forms. This, I believe, is the root of our apathy about the surveillance issues surrounding social media like Facebook. What needs to be devised next is a way to package these theories in a popular form and export them to those who are shaped by these exploitative forms of commodification. It is the work of social scientists to understand, and then to shape, the world around them.

Big brother is watching you watch others. GIF created by Kotutohum. Find their tumblr blog here: http://kotutohum.com/

Post-script:

Another lesson we should take from this is that not all surveillance is evil. We do not live in an inescapable dystopian society. To claim that we do obscures many actual practices of surveillance that are beneficial, and renders resistance an exercise in futility. Surveillance is a neutral phenomenon used, for better or worse, by a plethora of corporations, governments, non-governmental organizations, activists, and regular everyday people. That said, we cannot ignore the potential abuse and exploitation that may come from using surveillance practices to increase the flow of capital.


REFERENCES:

Albrechtslund, Anders. 2008. “Online Social Networking as Participatory Surveillance.” First Monday 13(3). Retrieved Oct 9, 2015 (http://journals.uic.edu/ojs/index.php/fm/article/view/2142/1949).

Cohen, Nicole S. 2008. “The Valorization of Surveillance: Towards a Political Economy of Facebook.” Democratic Communique 22(1):5-22.

van Dijck, José. 2012. “Facebook and the engineering of connectivity: A multi-layered approach to social media platforms.” Convergence: The International Journal of Research into New Media Technologies 19(2):141-155.

Fuchs, Christian. 2012. “The Political Economy of Privacy on Facebook”. Television & New Media 13(2):139-159.

 

Surveillance @ Wayhome Music and Arts Festival: social sorting, capitalism and everyday life

Festival goers sorted by their bracelets into General Admission and V.I.P

After being apart for much of the spring and summer, my friend Rachelle and I met up in Southern Ontario on a mission to check out Wayhome Music and Arts Festival in Oro-Medonte. If you have never heard of Wayhome (or similar festivals: Osheaga, Shambala, Bass Coast, etc.), it is a large three-day music festival on a strip of farmland just outside of Barrie, Ontario. For some this means a weekend snorting crystals, guzzling beer and dropping M. For others, an ecstatic rhythmic dance experience with thousands of sweaty, scantily clad bodies. For the locals, Wayhome was a “misuse of agricultural land and a disturbance”. For us, it was a reunion and a bunch of musical fun. Having outgrown the parts of life where dropping copious quantities of drugs was fun and cool, and no longer being prone to getting blackout drunk, we had a brilliant opportunity to observe what we had expected to be a colourful, hippy-dippy experience. What we experienced, however, was a far cry from those expectations. It was nothing like the life-changing, spiritually ecstatic festival culture we had read about in magazines or seen in documentaries.

Though it was phenomenal to move our bodies to live performances by Alt-J and Modest Mouse, we fell prey to an overt money-making, capitalist fiasco. Everything was heavily clad in sponsorship and advertisement logos. Many of the attractions were just public relations campaigns designed to hijack festival goers’ social media in order to expand corporate advertising reach. A slurry of beer companies, water companies, phone companies and fast food chains had set up booths amid the five main stages. Everything was expensive, especially anything that fell under the category of ‘need’. Food damn near set us broke, and god forbid you buy a drink from the bar.

V.I.P Bracelets allowed for access into restricted areas.

Capitalist exploits aside, what caught my interest most was the festival’s surveillance infrastructure: RFID bracelets, security checkpoints, cameras (everywhere), and even drones filming the dance pits from above. I need to note here that I am not trying to paint an illustration of dark, mysterious festival conspiracy theories, nor am I talking about Big Brother. Rather, I would like to demonstrate how surveillance is used at Wayhome to socially sort festival goers into different socio-economic classes. Through this sorting, Wayhome opens and closes doors of opportunity and shapes the very experience of those who are attending and spending large sums of cash to be there. Let’s expose the mundane surveilling structures that comprise the everyday life of festival goers.

According to “The New Transparency“, an interdisciplinary team studying surveillance issues in Canada, we live in a culture that has normalized surveillance: we track, record and analyze just about any data we can mine or scrape from people’s actions, online identities, and opinions. For better or worse, we exist in a time and place that has come to rely on large amounts of personal and interpersonal data. This sort of surveillance has many faces, from bloated intelligence agencies (the NSA) and whistleblowers (Snowden) to street cameras and Facebook. These technologies and strategies of surveillance are so embedded in our everyday life that we take them for granted. They sit in the realm of common sense, and when something falls into the realm of common sense we are less likely to notice it, let alone look at it with a critical lens.

Using smartphones to snap images and share them on social media such as Instagram or Facebook (with a sweet filter, of course) is an example of what sociologists call “participatory surveillance“. This sort of surveillance, which may have a whole plethora of social benefits, is something we conduct together, physically and digitally. Another form of surveillance, the form most relevant to Wayhome, is the grouping and sorting of people through some form of technological mediation. The technology in this case is the RFID bracelet that everyone at the festival must wear.

These bracelets were little strips of synthetic cloth with a small RFID chip inside and a locking mechanism so that you can’t take them off your wrist. According to Dr. David Lyon (2007), “These devices (RFID) rely on small tags that may be read wirelessly from a tiny antenna as the tag passes near the sensor” (113). He further elaborates that they perform categorization based on geo-locational data (ibid 113). The bracelets came in many different colours, each representing a social position at the festival. Yellow bracelets were for general admission: the lowest rung of the social ladder, the proletariat of Wayhome. Red bracelets were for V.I.P., which just about cost you your left kidney and your child’s university savings. This was the bourgeoisie. There was also a diversity of bracelets for staff, artists, stage crews, media and volunteers; the whole rainbow was covered. Because the bracelets lock when they are put on, they freeze any chance of social mobility, in other words, movement between different classes of people. It is also important to note that all festival goers were asked to preregister their RFID bracelets with their Facebook, Instagram, and Twitter accounts for social media and security purposes, linking the physical bracelet to individual, digital information about its wearer.
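The social sorting described above can be made concrete with a minimal sketch of how such a bracelet system might work. The colours, zones, and field names below are illustrative assumptions on my part, not Wayhome’s actual system.

```python
# Hypothetical sketch of RFID-based social sorting at a festival.
# Each bracelet colour maps to a fixed set of zones its wearer may enter.
ACCESS_BY_COLOUR = {
    "yellow": {"general_grounds"},                             # general admission
    "red":    {"general_grounds", "vip_lounge"},               # V.I.P.
    "green":  {"general_grounds", "vip_lounge", "backstage"},  # staff/artists
}

class Bracelet:
    def __init__(self, chip_id, colour, social_accounts=None):
        self.chip_id = chip_id
        self.colour = colour                          # fixed at registration: no mobility
        self.social_accounts = social_accounts or {}  # linked Facebook/Twitter handles
        self.scan_log = []                            # geo-locational trail, one entry per sensor read

    def scan(self, zone):
        """A sensor reads the chip; the attempt is logged whether or not the door opens."""
        allowed = zone in ACCESS_BY_COLOUR[self.colour]
        self.scan_log.append((zone, allowed))
        return allowed

wristband = Bracelet("A1B2", "yellow", {"facebook": "jane.doe"})
wristband.scan("general_grounds")   # True: the door opens
wristband.scan("vip_lounge")        # False: the door stays closed, but the attempt is still logged
```

Two features of the sketch carry the sociological point: the colour (and so the class position) is immutable once the bracelet locks, and every scan, successful or not, accumulates into a locational profile tied to the wearer’s social media identity.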

Alt-J light show from the perspective of general admission.

According to Lyon (2007), “playspaces”, or places of leisure such as shopping malls or music festivals, are subject to some of the most intensive surveillance (108). Much of this surveillance categorizes and sorts those who are welcome (those with bracelets) from those who are not (those who sneak in). There is an assemblage of surveillance technologies that are not quite connected, but that can be drawn together in various forms to create profiles of individuals and gaze over populations in aggregate. I will write on the assemblage in a future post, but for now you may want to read The Surveillant Assemblage by Kevin Haggerty and Richard Ericson. On another note, it would be interesting to know how much data Wayhome mines from the RFID bracelets and their Facebook, Twitter, and Instagram connections. Likely, it is very profitable for them.

Surveillance is everywhere. Many sociologists are heralding a surveillance society. It is certainly about time we bring this often hidden aspect of our lives under some critical, public scrutiny. Many of these technologies are still cryptic and mysterious in their ways of watching, categorizing, and sorting people, but their power to mediate our life choices is vast. From music festivals to social media to surfing the web and walking the streets, we are always watched and watching.

Kyle Curlew (@curlewKJ)


 

Related Topics:

The New Transparency – Interdisciplinary report on surveillance issues and trends in Canada – http://www.sscqueens.org/projects/the-new-transparency/about

The Surveillance Studies Center – Interdisciplinary center for studies of surveillance at Queen’s University – http://www.sscqueens.org/

The Varsity – Festival report card: WayHome – critique of Wayhome written by a critic at The Varsity newspaper – http://thevarsity.ca/2015/08/06/festival-report-card-wayhome/


Sources:

Bennett, Colin J., Kevin D. Haggerty, David Lyon, and Valerie Steeves, eds. Transparent Lives: Surveillance in Canada. AU Press: Athabasca University, 2014. Web. 24 Aug. 2015.

Haggerty, Kevin D. and Richard V. Ericson. “The Surveillant Assemblage.” British Journal of Sociology 51.4 (2000): 605-622. Web. 2 Oct. 2015.

Lyon, David. Surveillance Studies: An Overview. Polity Press: Cambridge, 2007. Print.

 

Colombian “Shadow State”: The blending of public and private sectors in mass surveillance

Adobe stock image

The deployment of surveillance in the 21st-century digital (shit-show of a) society we live in carries some especially decentralized features. We can no longer look at the state as a central apparatus from which surveillance emerges and is conducted, nor can we assume that surveillance has shifted to a new centre in private industry. This “blurring of sectors” is one of the main trends in Canadian (and, certainly, global) surveillance reported by Transparent Lives: Surveillance in Canada; you can read the relevant chapter of the report for free here. What follows is a meditation on these important, and really complex, issues in light of the development of a “total” surveillance program in Colombia.

According to a report released by Privacy International, the Colombian state has, over the past few decades, constructed a vast surveillance net that borders on total surveillance: an apparatus that has, in fact, been used on political opponents, leftist guerrillas, and activists. This is what Vice News refers to as the “Shadow State”, a story that is shaping up to look like some sort of dystopian sci-fi. It could also serve as a case study in the dangers of unimpeded surveillance for state or private interests.

Let’s look at a quick recap of its development (according to Vice News)! In the 1990s, the Colombian state invested in a surveillance system called “Esperanza”. In sociology there is a concept called surveillance creep, which essentially means that once a surveillance system is set up, it continues to grow, eventually taking on tasks that were never part of its initial purpose (Lyon 52). Esperanza was expanded over the next few decades until a new program, PUMA, was developed in 2007; through surveillance creep, PUMA was later upgraded to super-PUMA through a multimillion-dollar investment. These systems now have the capacity to track and log phone calls and conversations to government servers and to create profiles on all citizens. Much of this work is done without warrant or heed to the established laws governing intelligence agencies and state surveillance. For a much more detailed description of the story, visit the VICE article.

What I found interesting was the article’s focus on the centrality of the state government in the construction and implementation of this “shadow state”. The article does discuss private industry; what it does not mention is that there is likely only a thin veil of separation between the involved capitalists and the state.

   “Surveillance is big money,” explained Rice. “If you sell people guns, they may come back for more guns someday, but if you sell surveillance, you immediately start providing customer support, IT services, and upgrades.”

VICE News.

To focus only on the use of surveillance to reproduce and safeguard state power is to ignore many of the other contributing factors and risky slopes at play. One is that a collection of private corporations is cashing in big time on the suffering and repression of an entire nation. Of course, this cash grab is obscured and made opaque by discourses of terrorism and crime. Not only is the state becoming more powerful through draconian, cloak-and-dagger strategies; a slurry of private corporations is also filling its coffers.

According to the VICE news article:

“The dozens of documents reviewed by Privacy International show that the Israeli companies Verint Systems and NICE Systems have been especially crucial in building Colombia’s electronic spying capabilities. Both have helped steadily expand the country’s “network” surveillance system, which uses a series of probes to latch on to Internet servers and collect data from 3G phone networks.”

These private corporations, I would speculate, also have access to the collected data of an entire country’s population in aggregate. The Transparent Lives report writes,

“The blurring between these agencies may be illustrated in many ways, but the effect of driving more surveillance is common to each case. Public and private bodies have different mandates and different modes of accountability, and personal data become vulnerable to misuse and abuse as the data streams flow in new directions.”

So even though abuse by the Colombian state is genuinely terrifying, there is at least a set of governing laws, even if those laws are not always followed. The same cannot always be said of a private industry mandated to swing large profits. The likely reality, though, is that there is considerable overlap between these surveillance corporations and state interests.

It is increasingly important to see surveillance as a process that transcends traditional boundaries between public and private sectors, as these sectors, in an age of global capitalism, are beginning to merge in many complicated ways.