Category Archives: Actor Network Theory

Musings of an (a)social collective: Anonymity and Community


Anonymous communities can easily be mistaken for a thick mess of senseless social interactions. At least, that is how I saw this world when I first decided to study anonymous communities for my Master's thesis. I thought I would study how surveillance operates in anonymous social media applications—specifically, a very popular (at the time) application called Yik Yak.

Just a side note: Yik Yak had gone into a sudden bout of madness and removed the ability to be anonymous from their application. After a complete revolt of their user base (they just about all left), they switched back. But the feed is still a smouldering ruin of regret and nostalgia. To simplify this argument for the sake of a blog post—let’s pretend that the application did none of this. Let’s make an ideal form: an anonymous community.


When I first downloaded the app, a month before I decided to dedicate two years to it, it was because my roommate had convinced me to check it out. What I found was a seemingly infinite central feed of anonymous comments, sorted by a slurry of up-votes and down-votes. The Yak feed is tied to a geolocational system that connects the app to particular locations. My Yak was the Queen's University Yak. It was a busy feed. And it was constantly changing. To me, it seemed to be a chaotic and nebulous tangle of associations. A fun challenge for a scholar following an Actor-network inspired philosophy.

The popular posts stood out from the unpopular posts through an upvote/downvote feature. It was kind of like a mash-up of Twitter and Reddit with a touch of anonymity.

After a stint of digital ethnographic work and a ton of interviews with enthusiastic and committed users, I began to see something else. Something that, as an outsider, was invisible to me at first. There was an elaborately balanced Yik Yak community. As Gary T. Marx asserts, anonymity is entirely a social process. The only way for anonymity to occur is through a faceless interaction with another faceless person. This includes social regulations, exploitations, and oppressions. But also playfulness and a culture of care.

I would like to play with a concept I'm thinking of called (a)social. 'a' can be used as a negation. 'a' can also be used to represent anonymity. But mostly, 'a' will be used to approach a society which remains almost entirely faceless. A community of people interacting around nothing more than posts from others who occupy a similar space and share similar cultural values.

Though I have major problems with the corporate side of Yik Yak, with its capitalist motives and try-hard branding schemes, the application has facilitated the construction of an elaborate community. It's created an (a)social experiment: a community that contains both a culture of trolling and a culture of care.

All things are a collective endeavor. The (a)social communities are also a collective endeavor. In Donna Haraway’s most recent philosophical publication, Staying with the Trouble, she discusses her concept of sympoiesis—a collective unfolding of reality. This collective includes everything. All human, inhuman, and nonhuman components that are threaded into the collective mess.

When we load up Yik Yak on our mobile phones and post snippets of thought to the main feed (or engage in grueling arguments over all manner of controversies in the comments), we work with silicon, wires, code, telecommunication companies, algorithms, molecules, humans, bots, and entire scaffoldings of bureaucracies, legal frameworks, and governments. Interacting with the Yak spans the world over.

Furthermore, the Yak's platform allows particular functions and blocks others, shaping its users to interact in particular ways. The company imposes standards through its Code of Conduct, which it enforces with algorithms that scan for offensive keywords. And it sometimes changes everything in an update (like removing its main feature, anonymity). These are the institutional forces that shape and provide stability to the community.
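
To make that institutional enforcement concrete, here is a minimal sketch in Python of what keyword-based filtering could look like. It is purely illustrative: the word list, function names, and feed structure are my own assumptions, since Yik Yak's actual moderation code was never public.

    # Hypothetical sketch of keyword-based Code of Conduct enforcement.
    # Yik Yak's real system was proprietary; this only illustrates the idea.
    BLOCKED_TERMS = {"offensive_word", "targeted_slur"}  # placeholder terms

    def violates_code_of_conduct(text):
        """Return True if the post contains any blocked keyword."""
        words = set(text.lower().split())
        return bool(words & BLOCKED_TERMS)

    def submit_post(text, feed):
        """Add a post to the feed only if it passes the keyword filter."""
        if violates_code_of_conduct(text):
            return None  # blocked by the institutional layer
        post = {"text": text, "score": 0}
        feed.append(post)
        return post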

However, I have noticed that something more powerful is at work in maintaining the community. The mess of interactions from users balances out particular norms and ways of acting. This is done through both the comments section and the up-vote/down-vote feature. These are the vernacular forces that generate norms and cultures. Certain topics (offensive ones, say) are down-voted, and a score of -5 deletes the comment from the feed. This vernacular power, though institutionally enabled, allows for a regulation of trolls and bullies without Yak's employees ever having to get involved.
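
The vernacular side can be sketched just as simply. In the hypothetical Python below, the only detail taken from the paragraph above is the -5 deletion threshold; the feed structure and function names are again my own illustrative guesses at how this kind of crowd moderation could be wired up.

    # Hypothetical sketch of vote-driven, "vernacular" moderation.
    # The -5 threshold comes from the post above; everything else is assumed.
    DELETION_THRESHOLD = -5

    def vote(post, delta, feed):
        """Apply an up-vote (+1) or down-vote (-1) and prune if needed."""
        post["score"] += delta
        if post["score"] <= DELETION_THRESHOLD and post in feed:
            feed.remove(post)  # the crowd, not an employee, removes it

    # Usage: five down-votes from peers silently delete an offensive yak.
    feed = [{"text": "an offensive yak", "score": 0}]
    post = feed[0]
    for _ in range(5):
        vote(post, -1, feed)
    print(feed)  # []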

(a)social sympoiesis initially looks like a senseless and dense knot of relations. It's noisy and confusing. But once you, as an ethnographer, begin the arduous work of untangling these associations, it begins to look like every other community. Despite all of the contradictions, despite the arguments, the controversies, and the confusing faceless interactions, the Yak community is able to balance out, stabilize, and "hang together" as a coherent whole.

Such an (a)social collective is not shielded from the larger world, though. Once Yik Yak decided, for whatever reasons or motivations, that their users didn't want to be anonymous and forced every user to adopt a handle (and suggested they link up their Facebook page), the entire community collapsed. All that is left are groups of Yak "refugees" with nowhere to go but to be visible to the world.

Oligoptica: Why Surveillance Isn't Perfect

We have all likely heard of the panopticon: an architectural design for a prison, thought up by Jeremy Bentham, that was supposed to maximize surveillance capacities so that prisoners always felt as if they were being watched, even when they weren't. It consisted of cells arranged around a central guard tower that could watch every move of every prisoner, all the time. However, the guard tower is made to be opaque, so the prisoners can't watch the guards.


In 1975, Foucault borrowed this idea to illustrate his concept of disciplinary power in one of his most famous books—Discipline and Punish. The basic idea behind Foucault's use of the panopticon is that when people feel as if they are constantly being watched, they begin to self-discipline. The panopticon can refer to a prison. But it is meant to refer to society in general, or to many of the institutions in a society. The more people feel that they are being watched, the better they act. This watching could be done by authorities, or even by your neighbors.

Though Foucault's concept of disciplinary power is super important to many who study sociological theory, his example of the panopticon is overused and often misleading. It does not accurately represent the nature of surveillance in contemporary society.

The idea of the panopticon better characterizes a society of "total surveillance": a complete, balls-to-the-wall, 1984, Big Brother-type (dys)utopia. Thankfully, there is currently no technology on earth that allows for total surveillance. We may be a society of ubiquitous surveillance, but not a society of total surveillance.

So how do we "move beyond the panopticon", as so many social and cultural theorists have been calling for? One useful theoretical framework builds on Foucault's work: the concept of the oligopticon, proposed by Bruno Latour amid the incredibly critical arguments of Reassembling the Social.

Latour criticizes Foucault for drawing up a total surveillance “utopia” that is made of “total paranoia and total megalomania”.


He writes,

“We, however, are not looking for utopia but for places on earth that are fully assignable. Oligoptica are just those sites since they do exactly the opposite of panoptica: they see much too little to feed the megalomania of the inspector or the paranoia of the inspected, but what they see, they see well…”

 

Latour is staunchly reminding us that something that is everything is nothing at all. The panopticon is made to be too perfect. It is made to see all. It's something that, as academics, we can't possibly empirically record or understand. The oligopticon, by contrast, names the existence of countless scopes meant for watching. Countless surveillance devices. Only together do they see everything, and because they rarely communicate with one another, this could hardly be called "total surveillance".

Latour continues,

“From oligoptica, sturdy but extremely narrow views of the (connected) whole are made possible—as long as connections hold. Nothing it seems can threaten the absolutist gaze of the panoptica, and this is why they are loved so much by the sociologist who dream to occupy the center of Bentham’s prison: the tiniest bug can blind oligoptica”.

However, this does not entirely rule out the panopticon. As Kitchin and Dodge assert in their book Code/Space, the power of codes and algorithms may someday be able to unite many of the streams of the oligoptica into a menacing panoptic machine. However, due to the unstable nature of scripting code, running code, and working hardware, such a machine would be liable to bugs, errors, and absolute mutiny. So don't hold your breath.

The panopticon, for now, has its place—but it’s a more appropriate theme for a science fiction novel than a good work of social science or philosophy. It serves as a powerful reminder of where a ubiquitous surveillance society could lead us, but not as a very good characterization of surveillance today.


‘Software Mediated Intimacy’ in Tinder: have we ever been in love?

Tinder has become an almost ubiquitous dating app, one that has generated quite a lot of controversy even as it sits in most of our mobile phones.

While users swipe left and right searching for people they might be interested in, questions of legitimacy and authenticity in love, intimacy and dating arise. I was motivated to write this blogpost after a debate with a great friend of mine about the authenticity of love in Tinder.

Questions arose for me: If love is a social/cultural construct, is it authentic? What does Tinder do to change how love manifests itself? What has changed that we can't see right away?

These tensions hang about within an atmosphere of heavy technical mediation that silently organizes, classifies, and structures how users interact with each other.

In an article for Fast Company, Austin Carr had the wonderful opportunity to interview Tinder CEO Sean Rad about the algorithm that measures user desirability and matches users to their equivalents: the 'Elo score' program.

Tinder users do not have access to their desirability rating; however, this rating does predetermine who gets to see whom. It brings together a large and heterogeneous set of data on users in order to measure their desirability. Users only have access to viewing profiles that share a similar score.
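
Since the 'Elo score' is a trade secret, the best I can do is a hypothetical Python sketch of the kind of logic the Fast Company interview gestures at: a hidden, chess-style Elo rating that is nudged every time someone swipes on you, plus a matching step that only surfaces profiles whose hidden scores sit near your own. The constants, function names, and similarity window below are all my assumptions, not Tinder's.

    # Hypothetical sketch only: Tinder's actual desirability algorithm is secret.
    K = 32  # standard Elo sensitivity constant, assumed here

    def expected(score_a, score_b):
        """Elo-style probability that A 'wins' (gets liked) against B."""
        return 1 / (1 + 10 ** ((score_b - score_a) / 400))

    def update_on_swipe(swiper_score, swipee_score, liked):
        """Return the swipee's new hidden score after one swipe.
        A like from a highly rated swiper raises the score the most."""
        outcome = 1.0 if liked else 0.0
        return swipee_score + K * (outcome - expected(swipee_score, swiper_score))

    def visible_profiles(my_score, others, window=100):
        """Only show profiles whose hidden scores are close to our own."""
        return [name for name, score in others.items()
                if abs(score - my_score) <= window]

    # Usage: a user rated 1200 never sees the profile rated 1500.
    print(visible_profiles(1200, {"sam": 1180, "alex": 1500, "jo": 1275}))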

Algorithms are active mediators in the construction of networks of intimacy. I would like to call this ‘software mediated intimacy’. Before we explore this concept we need to understand the basics and histories of two concepts: actor-networks and intimacy.

On actors, networks and the silent power of algorithms

Actor-networks are a set of conceptual tools originally devised and made popular by science and technology studies scholars Latour, Callon, and Law.

The most fundamental tenet of this school of thought, sometimes referred to as actor-network theory (ANT), is the principle of generalized symmetry. This concept holds that humans and nonhumans (atoms, systems, computers, and codes) are equal in their ability to shape the intentions of actors in networks of associations.

An actor, which can be human and/or nonhuman, is made to act.

This act is always done in relation to other acts.

So actors, together acting, create actor-networks. Humongous, unpredictable webs of associations.

This model of understanding society, through complex, overlapping networks of humans and nonhumans, is very useful for understanding how actors shape each other over social media platforms.

There are countless nonhumans working in sprawling actor-networks that shape, structure, and mediate how we interact with others online. Most of this remains hidden within our electronic devices. Strings of code and wires affect us in ways that are sometimes more meaningful than our interactions with humans.

Some ANT scholars call this the blackbox—where a vast actor-network becomes so stable it is seen to act together and it becomes a node in other, more expansive actor-networks. Think about how many blackboxes exist in the many components of your mobile devices.

I would go so far as to say that most Tinder users are unaware of how much their romances are mediated by algorithms. These algorithms are typically trade secrets and the musings of science and technology studies scholars—not quite specters in the imaginations of users.

On the social construction of love and intimacy

Love is not universal, it is a complex set of changing associations and psycho-socio-technical feelings that are tied to space, time and historical circumstance.

Anthony Giddens, in his book The Transformation of Intimacy, demonstrates the social and cultural influences that actively construct how we understand, perceive, and pursue love and intimacy.

In the pre-modern past, relationships were forged through arranged marriages that were mainly tied to economic priorities. This is something that Foucault, in his work The History of Sexuality, called the deployment of alliance: the creation of social bonds and kinship alliances that shaped the distribution of wealth and ensured group and individual survival.

In a complicated move that I am unable to detail here, the deployment of sexuality emerged and changed the very nature of the family. It also led to the love and intimacy that we know today: a social bond that is not necessarily tied to the family unit.

Giddens continues: as Western societies began to experience the characteristics of modernity, intimacy and love began to emerge alongside them. Romance emerged when the family institution of premodernity collapsed.

Why am I talking about this? I am discussing the work of Giddens and Foucault in order to illustrate that love and intimacy are constantly shifting social and cultural constructs. Every generation experiences love and intimacy differently.

This is probably the reason why your parents scowl about your one night stands and the fact that you haven’t given them grandchildren yet.

Love and intimacy is going through a massive shift right now.

‘Software mediated intimacy’—algorithmic love

In his essay A Collective of Humans and Nonhumans, Latour speaks about our progression into modernity as a deepening intimacy between humans and nonhumans. An integration between humans and their technology. A folding of all sorts of intentions and agencies.

In this case, as humans become further enveloped in their associations with mobile devices and ubiquitous social media, most of our interactions become mediated through strings of code, complex algorithms, and digital hardware.

In the case of Tinder, the algorithms are mediating and structuring how we meet and communicate with people who may become love interests. Of course, these algorithms don't exactly control such love. Human actors enter into associations with the Tinder actor-network with all kinds of intentions and expectations.

Some actors want a fling, a one night stand. Others might want a long-term relationship. Others may even just want attention or friendship.

So even though Tinder is sorting who gets to talk to whom, heterogeneous and constantly shifting clusters of intentions shape how Tinder-based relationships will turn out.

And it's not perfect. Love and intimacy falter and wilt just as much as they blossom.

But that's not my point. 'Software mediated intimacy', like all of the forms before it, is another form of socially and culturally curated love and intimacy. So it's not authenticity that is the primary question here. All forms of love emerge from some degree of construction or performance. There is no universal standard.

However, what is different and what is often not seen is the degree to which love can be engineered. Computer scientists and engineers, at the beckoning of large (or small) corporations, embed particular logics and intentions into the algorithms they construct.

As Dodge and Kitchin remind us, in their book Code/Space, this is not a perfect product of social control—but a constant problem to be shaped. Even so, it is disconcerting that so many users are being shaped by silent human and nonhuman mediators.

Tinder's algorithm and its 'Elo score' are trade secrets, not opened up to the public or to the academy. So I am left scrambling for an answer that only raises more questions.

‘Software mediated intimacy’ can offer us a new and novel way to construct relationships, whether they are temporary or sustained. It is a form of social interaction that is just as authentic as prior forms of love and intimacy. However, the codes and algorithms and the complexities of computer science and statistics make it overwhelmingly easy for corporations to shape users based on private interests.

We must learn how these processes work and how they might be democratized, so that the user may benefit as much as the capitalist who produces and hoards such software. We must embrace 'software mediated intimacy' in order to learn about it and master how it works, so that we might sidestep the potential and precarious exploitation seeping through our engineered social bonds.

Note: This blog post will be followed by a more in-depth analysis of this understanding of Intimacy. I am seeking to expand this into a larger, theoretical project. Any comments, criticism, or thoughts would be incredibly useful.