
Stumbling into trans dykehood: the making of a queer love story

CW: Gender Dysphoria, Cheesy Story

I met my life partner at the Reelout Queer Film & Video Festival in Kingston, Ontario. It was this event that would foreshadow our future together as a queer lesbian couple. At the time, I was still identifying as a cisgender man and had hidden my gender identity under a thick layer of masculinity, muscles, and ginger red beard. Nonetheless, I felt queer in my heart and decided to go on a friend date to see some rad films. We watched a steamy flick about two gay men in Colombia, a barber and a soldier, who shared an overnight love-fest in the barbershop. And we also watched a very upsetting story about a trans sex worker who was nabbed and murdered by a transphobic asshole.

I was in the closet as a trans woman, but out as a bisexual man. And my partner had not yet reflected on her queerness and had never been confronted with the opportunity to explore it. When we were walking home, snowflakes floating down from the night sky, I asked her if she would go on a date with me. She hadn’t been in the dating scene for some time and was caught by surprise. She paused to think and mustered a yes. I walked home grinning.

Our first date was at her house. It was a crowded house with stinky carpet and many housemates (all lovely folks, of course). We had a homemade sushi night and stayed in with some wine. I had worked as a server at a sushi place back in Newfoundland and was able to cobble together some rough-looking rolls. As it turned out, we both loved food, and we bonded hard over that.

It was a while before we started going steady. I was immensely shy, and she was uninterested in committing to a label. This was a wonderful way for us to progress through the various stages of love. It allowed us to nurture a non-possessive and not-so-jealous attitude with each other. We could sleep in the same bed with friends and cuddle with loved ones and be happy for each other’s various life intimacies.

The more time we spent with each other, the more we realized that we had some rad synergy. I told her, months later, on a trip to Montréal for a conference, that I loved her. She agreed, and from then on, we were going steady.

I had problems with my sense of embodiment, and that left me with countless insecurities. I had decided several years prior that I could never be a woman, and I was terrified of the backlash from my family, who were invariably anti-queer. I took on a hushed-up label of genderqueer, all the while moving into lifting culture at the gym. I gained a substantial amount of muscle, and for a while my body felt good being distracted by the constant strain of regimented exercise. I had mentioned in passing to my partner that I was genderqueer. But I tucked away my issues with gender into the deeper recesses of my mind and forcefully forgot about them.

Almost two years after Reelout, I moved to Ottawa to start a PhD program. It was a tough move, but we had decided that we could make it work. It was difficult at first, but it worked out. We would Skype often and send each other love letters. I tried to get her to join a Minecraft server with me, but she wouldn’t have any of that. There were many hurdles, but it was worth the work we put into it.

Two months into my move, I was sitting in a lecture for a class I worked in as a teaching assistant, and the professor was instructing a sea of 400 undergraduates about the complexities of gender and sexuality. To illustrate the textbook’s somewhat dull explanation of (trans)gender realities, she put on a short documentary about a trans woman coming out of the closet, and the struggles she encountered with her family and her partner.

I had a sudden ball of pressure in my chest, and I almost started crying. It was that moment, as I was about to turn 29 years old, that I realized I was a trans woman. I bumbled through my tutorial lesson and managed to keep my calm disposition, but the seed was planted, and my mind was making connections between the discomfort I held with my body and the potential undercurrent of gender dysphoria. I called my partner when I got home to inform her that she was indeed dating a woman. And to my surprise, she did not panic or freak out. In fact, she was very supportive. Yes, I have a rad girlfriend!

I cried myself to sleep because I had no idea what to do. The next day I watched a bunch of YouTube videos, learned about the transitioning process, and began to make connections between my life experiences and my womanliness. That evening, I emailed my dad a panicked message to tell him the truth. That was a struggle that I will never forget. We don’t talk anymore.

She took the bus to Ottawa as soon as she was able, and though she didn’t tell me until later, she did a ton of research herself. While I was shaving my beard and learning about make-up, she was consulting our queer and trans friends so that she would know how to approach this without bombarding me with questions and anxieties.

When she arrived, we sat on the bed in silence and I eventually mustered enough courage to tell her a super difficult truth. I said, “I don’t want to hold you hostage. If you need to leave me because this is too much, I would totally be okay with that”. I was afraid that she would force herself to stay, even if she didn’t find me attractive. My whole life, I was fed narratives of the repulsiveness of being trans. I was saturated in internalized shame and I believed that no one could possibly love me.

This was unintentionally upsetting for her. She was aware that I didn’t have a conventional gender orientation and she saw through my masculine ruse from the beginning. In fact, she was already embracing her new lesbian identity and had already come out to her family, who accepted both of us and all our queerness. Even while I was struggling with the idea that I was a woman, she had already accepted it wholeheartedly.

We kissed, and she would later reflect on how soft my lips were without the thick tendrils of my ginger beard.

The next morning, we listened to cutesy queer music and she walked me through the clothing and make-up she and some queer friends had gathered during a collective closet raid. We went to the mall together to buy some womanly things and a ton of cheap make-up. I was terrified. I felt naked walking through the mall without my outer layer of masculinity, muscles, and beard. I felt so exposed to a hostile world, but she squeezed my hand and led me around from store to store. It would be a very long time before I could go to a women’s store alone. That night she waxed my body and dealt with all my pain. We drank wine and talked about how we met, that night at Reelout.

 

Happy Pride everyone <3

The DIY Gender Police: doxxing through visibility and ubiquitous presence

This is the second post of a small series on DIY gender police, or anti-trans activists who take it upon themselves to police and harass trans writers, advocates, and scholars in order to roll back our access to human rights, public space, pride, and dignity.

Read part one here: The DIY Gender Police: the surveillance of trans folks by anti-trans activists.


CW: transmisogyny, harassment, suicide

After coming out of the closet as a trans woman, my ability to engage in public discussions as a writer radically shifted. My new identity substantially intensified the stakes of publishing critical ideas as I was forced to come up against anti-trans hate groups on the left and the right.

I mustered up the courage to transition a few weeks before the Lindsay Shepherd controversy at Wilfrid Laurier University, which would rapidly become a rallying cry for the far-right, who manipulated arguments in support of free speech to dog-whistle white supremacy and (trans)misogyny across the Canadian mediascape. I wrote an article for Vice Canada called For Trans Folks, Free Speech Can Be Silencing to address how open debates about trans and non-binary pronouns often dehumanize and silence trans students in undergraduate classes. I mean, imagine being made to debate your own existence in a classroom setting!

This was the first time I had an encounter with the trollish hate group known as Kiwi Farms. I remember getting a Google notification not long after I published my Vice article informing me that my name had been mentioned on the Internet. I was blissfully unaware of doxxing groups before checking my Gmail account that day, and I was appalled by their cruelty. I had been doxxed, and I felt violated and vulnerable in the visibility and exposure afforded to me by the Internet.

Along with bemoaning that social justice warriors (SJWs) who wanted nuanced discussions about free speech were somehow ushering in a dark era of Orwellian or Huxleyan totalitarianism, Kiwi Farms trolls also attacked me based on my appearance and my gender.

One post read, “If accommodating the 0.1% or so of people who are trannies involves destroying free speech for everyone else, fuck trannies”.

Another followed up, “I thought he just kind of an ugly girl, not a troon”.

This was my first time getting doxxed. As I mentioned in the previous article in this series, doxxing involves active lateral surveillance and intelligence techniques used by a person or group to scour the Internet for any publicly available information, which is collected into rough dossiers and posted to cyberspace as part of punitive “name and shame” tactics. Doxxing is the primary strategy in the DIY policing toolkit, and it’s widely used within the Kiwi Farms community.

In fact, I will likely get doxxed again for mentioning my experiences with Kiwi Farms as they thrive on negative press. It took me a while to decide whether or not I should tell this story as it will give these trolls more ammunition to shoot back at me. But these assholes need to be challenged, and silence, I feel, is no longer an option.

Another user wrote, “They do it to escape their insecurity or their mistakes from their male self. Unfortunately, the Internet never forgets, nor does their body, which is male”.

They’re right, the Internet never forgets. Trolls and bigots are able to exploit the visibility and ubiquitous presence created by our reliance on social media platforms and near-constant connection to the Internet. Kiwi Farms is a prime example of DIY policing in that it has allowed home-brewed vigilantes to play both spy and police officer by weaponizing our visibility to threaten us into silence. It’s also worth noting that they take joy in attacking people with disabilities and plus-size women.

Visibility and ubiquitous presence

Though folks engage in social media to varying degrees, it is safe to say that most of us spend a great deal of time producing and consuming user-generated content. Many of us use social media like Facebook and Twitter to build online social identities and we curate those accounts to give off impressions of who we are. Social media platforms have become synonymous with communication in the contemporary Western world, and this has massive consequences.

Sociologist danah boyd offers us a useful concept to think about our engagement with social media platforms. She draws attention to how social media become “mediated publics” where folks communicate through technologies that shape (or mediate) our interconnections with each other. In line with physical public spaces, mediated publics allow for people to interact with each other, but these interactions are augmented by features unique to cyberspace.

Mediated publics are characterized by persistence, searchability, replicability, and invisible audiences. In other words, interactions in mediated publics endure through time, are easily searchable, can be copied outside of their original context, and are seen by an unknowable number of strangers.

For instance, Kiwi Farms homed in on embarrassing thoughts that I posted to Reddit during a time when I was confused and questioning my gender. Though I won’t go into the nitty-gritty details, I posted these thoughts several years ago without foresight that they would be found and used to embarrass me years later. The details that I posted on a trans subreddit were eventually archived, copied by trolls, removed from their original context, and used in a doxx meant to embarrass me in a public full of hostile strangers.

Because social media platforms have become a constant staple of how we communicate, our presence in mediated publics becomes ubiquitous. We are exposed to publics that might seem harmless, but that can quickly dissolve into a cacophony of vitriolic bullshit.

As we navigate mediated publics, over time we produce substantial social exhaust. This is a form of seemingly innocent, enduring data that can be brought together in countless ways and to varying effects. Surveillance scholar Daniel Trottier notes, “No single act seems risky or malicious, but when taken together overtime, maintaining an online presence can have damaging consequences”. It is this social exhaust, the fragments of a person’s digital identity, that becomes the weaponry of DIY gender police.

Doxxing as political violence

As I mentioned in the first article in this series, activists, scholars, and journalists often focus on the dangers of state-level hierarchical surveillance while neglecting the impact of lateral surveillance practices used in everyday life by everyday people. This framing obscures the violence involved in lateral surveillance practices. For a lot of folks, the damaging impacts of DIY gender policing are opaque, and thus rarely discussed outside of the marginalized groups who bear the brunt of such tactics.

Earlier this summer a trans game developer named Chloe Sagal ended her life after substantial harassment from trolls and doxxers on Kiwi Farms. Though Sagal’s tormentors weren’t the sole cause of her death by suicide, they played a terrible role, and this marks one of the more extreme consequences of doxxing in the trans community.

Gay Star News reported, “Kiwi Farms linked to her death. On the thread there was no regret, only misgendering and mocking”.

Doxxing is inherently violent in that it violates the assumed privacy of a person by collecting the disparate forms of social exhaust given off over a lifetime of using social media in order to cause that person harm.

While speaking about surveillance, Fuchs and Trottier observe, “Surveillance gathers data about humans in order to exert actual or potential direct, structural, or cultural violence against individuals or groups. The violence involved in surveillance either operates as actual violence or as the threat of violence in order to discipline human behavior”. Doxxing isn’t a mundane or inconsequential act; it is an intentional act of violence that is meant to do harm to people.

The communities that engage in DIY policing are accountable to no one but themselves, which sets them apart from state agencies who are at least marginally tied to a legal system. There are few ways that a person can seek justice after being victimized by anonymous and pseudonymous vigilantes who enact extreme forms of discriminatory violence.

It is important that we begin to address these issues in ways that will provide us with tools and strategies to resist DIY gender policing, ubiquitous presence, and (trans)misogynistic violence. Furthermore, we need to strategize ways of building tighter communities of support over cyberspace, as well as queer, feminist security practices that we can use to protect ourselves from forms of weaponized visibility. In the next instalment of this series, I will explore how far-right groups use media manipulation and forms of digilantism to actively work towards the marginalization of people of color, LGBTQ folks, and women.


In the coming weeks, I will be exploring some key concepts and ideas around how trolling, doxxing, e-bile, and vigilantism over digital platforms have been seriously impacting trans communities in extraordinarily violent ways. DIY policing, and its vast arsenal of techniques, seems largely opaque in cishet (cisgender, heterosexual) society, and because of this, is mostly ignored as a form of active discrimination. We need to make this form of political mobilization visible and start having a serious conversation on how we might collectively address it.

References

boyd, d. (2007). Social Network Sites: Public, Private, or What? Knowledge Tree 13. https://www.danah.org/papers/KnowledgeTree.pdf.

Fuchs, C., and D. Trottier. (2015). Towards a theoretical model of social media surveillance in contemporary society. Communications 40(1): 113-135.

Trottier, D. (2017). Digital Vigilantism as Weaponisation of Visibility. Philosophy and Technology 30, 55-72. https://doi-org.proxy.library.carleton.ca/10.1007/s13347-016-0216-4.

Free speech, messy epistemologies, and the reframing of the WLU controversy

A trimmed down, edited version of this article was published in The Conversation.

Free Speech rally at Wilfrid Laurier University

The Lindsay Shepherd controversy has once again opened Pandora’s box on the notorious, vitriol-ridden “free speech” debate across Canada. It has largely consisted of tired arguments penned in op-eds claiming that the university has become home to left-wing authoritarians who muzzle the speech of those with whom they disagree. Such debates have become so politically noxious that Andrew Scheer, leader of the Conservative Party of Canada, has jumped on board—calling for a political response to the free speech crisis in universities. Furthermore, Lindsay Shepherd has become an alt-right darling in the amplified calls for free speech on campus; she now has roughly nineteen thousand followers on Twitter and is consistently feeding the fire with toxic tweets, one of them reading, “Confirmed: WLU is a mental institution”.

Debates about free speech have a tendency to become unnuanced and flat, as they typically amount to blanket statements that call for the unbridled and unrestricted ability to say anything. As I explored in an article for Vice, such an understanding of free speech is complicated sociologically when superimposed on a society already stratified along vectors of identity. Free speech becomes even more nebulous when we read the subtext of free speech advocacy, which often cozies up to white supremacy, transphobia, and sexism.

I want to shift the discussion about free speech. Instead of focusing on why the academy needs free speech, I would like to ask how free speech is reasonably deployed in the scholarly pursuit of knowledge production. This analytical shift will allow us to move beyond romanticised notions of free speech and academic freedom and consider the various ways in which knowledge emerges and becomes entangled in institutional practices and professional obligations.

This debate has by-and-large ignored the ways in which knowledge is produced and shaped within the academy. I would like to suggest that the epistemological insights of science, technology, and society (STS) can provide a scaffolding to understand the complexities of free speech in practice, as opposed to free speech in theory. Epistemology is the study of knowledge production. Despite the centrality of knowledge in all our social encounters, epistemological issues are often undervalued. Donna Haraway illustrates the importance of the structures of knowledge in this beautiful quote, “It matters what matters we use to think other matters with; it matters what stories we tell to tell other stories with, it matters what knots knot knots, what thoughts think thoughts, what descriptions describe descriptions, what ties tie ties.”  The various shapes of the in-between matter that inform how we understand the world have consequences for how we frame free speech.

The production of knowledge in the academy is laden with formal and informal processes that shape how knowledge is produced, debated, disseminated, and taught. The sociologist John Law provides a useful framework for approaching knowledge production in the academy. He draws our attention to the messiness of the world in practice; how human emotions, scientific methods, institutional priorities, research ethics boards, peer review committees, professional reputations, class syllabi, employment contracts, graduate student committees, and codes of conduct become entangled when we go about the business of saying or writing something. When we talk about free speech, these constraints are made opaque despite their centrality in shaping how we talk, write, and debate.

A graduate student, depending on their discipline and department, will typically take graduate courses, be employed as a teaching assistant (TA), and research their independent thesis work. In order to guide a budding scholar through the complexities of academic research and politics, a grad student works under a committee. Such committees are made up of professors who have been awarded PhDs and bring familiarity and experience as academics. One of these committee members is the grad student’s supervisor, who very closely guides that student’s academic work. All research produced for the student’s thesis must be rigorously checked by their supervisor and committee. This leads to a painful process of sending in drafts and receiving back red marks. Such a process shapes what knowledge is reliable, rigorous, and fair, and what knowledge is inappropriate, poorly thought out, and not defendable. If a student ignores their committee, they will likely fail their thesis defense and not receive a degree.

If a grad student were to write their own independent research, they would need to submit it to an academic journal to give it credibility. All reputable journals use a peer-review process in which a committee of scholars assesses the quality, reliability, and credibility of academic work and rejects work that doesn’t meet academic standards. Poor research is sent back to the grad student to be revised or sent elsewhere. And some work is rejected outright for not meeting the criteria of the peer reviewers. Grad students need to have thick skin, as we will get torn to pieces several times a year.

There are other ways in which academic knowledge is reasonably shaped. Research on human subjects is tightly controlled by General Research Ethics Boards (GREB) that are informed by federal policy and legislation. If a grad student ignores GREB, they could be expelled and have their credentials revoked. Academic conduct is held to a Code of Conduct and other university policies that shape how scholars can interact with each other. And ultimately, the university must abide by the Criminal Code and the Charter of Rights and Freedoms that protect people in the scholarly community against hate crimes and discrimination.

Finally, a grad student typically becomes a TA to help fund their studies. This is an admittedly precarious job that usually has students working overtime with no extra pay. The TA signs an employment contract, works under the professor who teaches the course, and does not have any authority to teach their own content. TAs do not have the same academic freedom as professors. Aadita Chaudhury, a PhD student at York University, penned an article that appropriately delves into how Shepherd failed in her obligations as a TA. Grievances with professors are often mediated through a public service union that advocates on behalf of the grad student.

These formal restrictions on how knowledge is produced are complemented by informal occupational norms that are enforced by students, faculty, and administration. This is the everydayness of the academy. A grad student can’t just write anything. Everything a grad student does in public (including their publications) is informally assessed by colleagues and professors. Miscalculations or poorly thought out work can negatively impact the future of a grad student.

To engage in proper research in the academy is to maneuver through the tangled red tape of policies, expectations, institutions, regulations, and professional obligations. This has a grad student dancing and staggering back and forth through research and teaching and negotiating and compromising on the substance of their scholarly practices. This is the messiness of epistemology in practice. These processes are all swept away in popular debates around free speech in the academy. Such arguments are far too easy because they ignore how the academy functions as a complex institution and community.

And don’t get me wrong. None of these processes or practices are immune from criticism. But that is an entirely different discussion than the one being advocated by Lindsay Shepherd. Academic freedom is certainly important, but so are the ways in which it can and cannot be practiced. University administration, faculty associations, and student and labor unions are constantly in friction over how these limits should take shape. These are discussions that are always already happening and do not get near the press attention that Shepherd’s employment bungle has attracted. If the academy is in crisis, its critics are focusing on the wrong issues.

Musings of an (a)social collective: Anonymity and Community


Anonymous communities can easily be mistaken for a thick mess of senseless social interactions. At least, that is how I saw this world when I first decided to study anonymous communities for my Master’s thesis. I thought I would study how surveillance operates in anonymous social media applications—specifically, a very popular (at the time) application called Yik Yak.

Just a side note: Yik Yak had gone into a sudden bout of madness and removed the ability to be anonymous from their application. After a complete revolt of their user base (they just about all left), they switched back. But the feed is still a smouldering ruin of regret and nostalgia. To simplify this argument for the sake of a blog post—let’s pretend that the application did none of this. Let’s make an ideal form: an anonymous community.


When I first downloaded the app, a month before I decided to dedicate two years to it, my roommate had convinced me to check it out. A seemingly infinite central feed of anonymous comments, sorted by a slurry of up-votes and down-votes. The Yak feed is tied to a geolocational system that connects the app to particular locations. My Yak was the Queen’s University Yak. It was a busy feed. And it was constantly changing. To me, it seemed to be a chaotic and nebulous thick tangle of associations. A fun challenge for a scholar following an actor-network-inspired philosophy.

The popular posts stood out from the unpopular posts through the upvote/downvote feature. It was kind of like a mash-up of Twitter and Reddit with a touch of anonymity.

After a stint of digital ethnographic work and a ton of interviews with enthusiastic and committed users, I began to see something else. Something that, as an outsider, was invisible to me at first. There was an elaborately balanced Yik Yak community. As Gary T. Marx asserts, anonymity is entirely a social process. The only way for anonymity to occur is through a faceless interaction with another faceless person. This includes social regulations, exploitations, and oppressions. But also playfulness and a culture of care.

I would like to play with a concept I’m thinking of called (a)social. ‘a’ can be used as a negation. ‘a’ can also be used to represent anonymity. But mostly, ‘a’ will be used to approach a society which remains almost entirely faceless. A community of people interacting around nothing more than posts from people who occupy similar space. Similar cultural values.

Though I have major problems with the corporate side of Yik Yak with their capitalist motives and try-hard branding schemes, their application has facilitated the construction of an elaborate community. It’s created an (a)social experiment. It is a community that both contains a culture of trolling and a culture of care.

All things are a collective endeavor. The (a)social communities are also a collective endeavor. In Donna Haraway’s most recent philosophical publication, Staying with the Trouble, she discusses her concept of sympoiesis—a collective unfolding of reality. This collective includes everything. All human, inhuman, and nonhuman components that are threaded into the collective mess.

When we load up Yik Yak on our mobile phones and post snippets of thought to the main feed (or engage in grueling arguments over every controversy in the comments)—we work with silicon, wires, code, telecommunication companies, algorithms, molecules, humans, bots, and entire scaffoldings of bureaucracies, legal frameworks, and governments. Interacting with the Yak spans the world over.

Furthermore, the Yak’s platform allows particular functions and blocks others—shaping its users to interact in particular ways. The company imposes standards, through its Code of Conduct, which it enforces through algorithms looking for offensive keywords. And it sometimes changes up everything in an update (like removing the main feature, anonymity). These are the institutional forces that shape and provide stability to the community.

However, I have noticed that there is something more powerful at work in maintaining the community. It seems that the mess of interactions from users balances out particular norms and ways of acting. This is done through both the comments section and the up-vote/down-vote feature. These are the vernacular forces that generate norms and cultures. Certain topics, maybe offensive topics, are down-voted (a score of -5 from votes deletes the comment from the feed). This vernacular power, though institutionally enabled, allows for a regulation of trolls and bullies without Yak’s employees ever having to get involved.
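To make that mechanism concrete, here is a minimal sketch of how a vote-threshold rule like the one described above could work. The -5 cutoff comes from the post itself; the names and structure are hypothetical and are not Yik Yak’s actual code.

```python
# Minimal sketch of vernacular moderation via vote thresholds.
# The -5 cutoff is taken from the description above; the names and
# structure are hypothetical, not Yik Yak's real implementation.

DELETE_THRESHOLD = -5

class Yak:
    def __init__(self, text):
        self.text = text
        self.score = 0
        self.visible = True

    def vote(self, delta):
        """Apply an up-vote (+1) or down-vote (-1) from a community member."""
        self.score += delta
        # The community, not an employee, removes the post once the
        # aggregate score falls to the threshold.
        if self.score <= DELETE_THRESHOLD:
            self.visible = False

feed = [Yak("offensive troll post"), Yak("wholesome campus post")]
for _ in range(5):
    feed[0].vote(-1)                          # five down-votes from different users
feed = [yak for yak in feed if yak.visible]   # the troll post drops out of the feed
```

The point of the sketch is simply that the deletion decision lives in the aggregate of user votes rather than in any moderator’s hands, which is what makes the power vernacular rather than institutional.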

(a)social sympoiesis initially looks like a senseless and dense knot of relations. It’s noisy and confusing. But once you, as an ethnographer, begin the arduous work of untangling these associations, it begins to look like every other community. Despite all of the contradictions, despite the arguments, the controversies, and the confusing faceless interactions—the Yak community is able to balance out, stabilize, and “hang together” as a coherent whole.

Such an (a)social collective is not shielded from the larger world, though. Once Yik Yak decided, for whatever reasons or motivations, that their users didn’t want to be anonymous and forced every user to get a user handle (and suggested they link up their Facebook page)—the entire community collapsed. All that is left are groups of Yak “refugees” with nowhere to go but to be visible to the world.

The Mythology of Pokémon Go: Surveillance, Big Data, and a Pretty Sweet Game

Pokémon Go is lulling the world into a humongous augmented distraction. A distraction that is covering up some pretty intense politics. It is almost as if we stepped into Ernest Cline’s Ready Player One—where distraction through virtual reality meets the war between anonymity and surveillance.

Artist: Dani Diez. You can find more of Dani’s work at www.instagram.com/mrdanidiez/

It has been well publicized that this new app, which is fueling a Pokémania (a nostalgic resurgence of interest in Pokémon every time a new rendition of the game is released), has some rather arbitrary and invasive access to your mobile phone’s data—particularly, unhinged access to your Google account and other features of your mobile device.

What is Pokémon Go? The question almost seems pointless now, given the popularity of the game—but here it is for those of you who have not tuned in to the Pokémania. Pokémon was a TV show released in the late 90s, which became dream fuel for a generation of children and young adults. It featured a young boy, Ash Ketchum, who embarked on a journey to capture Pokémon in a technology known as the “pokeball” under the direction of the Professor (a man who studies Pokémon). After a Pokémon is caught, the young boy (and the thousands of other Pokémon trainers) would aspire to train it to battle other Pokémon.

Shortly after the show caught on, Nintendo released Pokémon Red and Blue for the Game Boy. These games became an absolute hit. I remember walking to school with my eyes glued to my little pixelated screen—traversing roads and dodging cars while battling Pokémon and trading them with my schoolyard peers. The game’s slogan repeated through my cranium: “Gotta catch ’em all”.

Nintendo has continued to release Pokémon games for its various platforms up to the present. Each successive release brought an obsessive and nostalgic excitement that took over the gaming community—or at least anyone who had grown up playing Pokémon Red and Blue and collecting the Pokémon cards.

Pokémon Go is a game that can be played on a mobile smart phone that uses geolocational data and mapping technologies that turn the phone into a lens peering into the Pokémon world.  Through the interface of your mobile device, you can catch Pokémon wandering the “real” world, battle through gyms, and find items that will aid your journey. It augments the world around the user so that everything and everywhere becomes a part of the game.

Just like its predecessor, a game known as Ingress, many of the geo features in the game were set up around important places: art exhibits, cultural or historical sites, and parks. Following the maps would lead you through a productive tour of a city’s geographical culture.
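To give a sense of the kind of geofencing such a game depends on, here is a minimal sketch that compares a phone’s reported GPS position against fixed points of interest. The 40-metre interaction radius, the coordinates, and all of the names are assumptions for illustration; they are not Niantic’s actual values or code.

```python
# Hypothetical sketch of geolocational gameplay: the phone's GPS position
# is checked against fixed points of interest. Radius, names, and
# coordinates are invented for illustration, not Niantic's actual values.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000
INTERACTION_RADIUS_M = 40  # assumed "close enough to interact" distance

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def reachable_stops(player_pos, stops):
    """Return the points of interest within interaction range of the player."""
    return [name for name, (lat, lon) in stops.items()
            if haversine_m(*player_pos, lat, lon) <= INTERACTION_RADIUS_M]

# Hypothetical points of interest seeded around a campus, Ingress-style.
stops = {"Art mural": (44.2253, -76.4951), "Memorial fountain": (44.2260, -76.4970)}
print(reachable_stops((44.2254, -76.4953), stops))  # -> ['Art mural']
```

Note that every check like this requires the player’s location to leave the phone, which is exactly the data trail the rest of this post is concerned with.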

I want to explore the obsessive and nostalgic excitement through a techno-socio-cultural lens. I will unpack this critique into three parts: (1) the sociology of privacy, (2) Big data and algorithmic surveillance, and (3) the culture of nostalgia and the digital sublime.

Before I continue with this post—I want to assert that it is not an altogether terrible, megalomaniacal, Big Brother-type game. Pokémon Go is enabling new ways for people to engage in the social world. Check out this sociological blog post exploring just that. However, it would be silly not to apply a critical perspective to it.

Taken from Facebook page: https://www.facebook.com/wokemon/?fref=ts

There are some restrictions I’d like to apply to my analysis: (1) Pokémon Go is not an immature or irrelevant activity; millions of people of all ages and cultural backgrounds are playing it—meaning it has a ton of significance. And (2) the people playing Pokémon Go are not zombies or passive consumers; they are very intentional and unpredictable social actors who have the ability to understand their situation.

Sociology of Privacy

One thing that boggles the minds of surveillance studies scholars is how the vast population of people using social media and mobile applications does not care about the invasive surveillance embedded in everything they use.

In my own interviews with Facebook users in 2014, many of my participants claimed, “I have nothing to hide”. This is a pervasive mentality that enables big corporate and governmental entities to gain access to, and control over, large swaths of data. This nonchalant attitude towards surveillance cedes massive ground in the dismantling of our right to privacy. Though such an attitude is not surprising, as the entire ecosystem of social media is set up to surveil.

David Lyon, in his book Surveillance After Snowden, asserts that privacy is generally seen as a natural and democratic right that should be afforded to all citizens—but admits that a problem lies in the fact that informational privacy is not valued as highly as bodily or territorial privacy. This is so even though information, data, and metadata can be much more revealing than bodily or territorial surveillance.

Lyon notes three important points about privacy that are all very relevant to the current epidemic of Pokémania: 1) the collection of information has now been directly connected to risk mitigation and national security, implying that we are not safe unless we are surveilled; 2) everyone is now a target of mass surveillance, not just the criminal; and 3) data collected through mass surveillance is used to create profiles of people—these may be completely inaccurate depending on the data collected, but you will never know the difference.

I would like to add a fourth: how can the data be used to swing massive profits? The corporation Niantic, creator of Ingress and Pokémon Go, uses its privacy policies to legitimate “sharing” (read: selling) data with governments and third-party groups. Government surveillance is often the focus of criticism; however, capitalist corporations are not often held accountable for ethical practices. Who is selling this data? Who is buying this data? And what is this monetized data being used for?

As Lyon asserts, privacy is not only about individual concerns—it is important socially and politically for a well-balanced democracy. Edward Snowden has been known to say, “It’s not really about surveillance, it’s about democracy”. While we continue to allow powerful groups to chip away at our privacy for entertainment, we literally give up our ability to criticize and challenge injustice.

Snowden reminds us that when we give up our democracy to the control room—there is zero accountability, zero transparency, and decisions are made without any democratic process.

So while we are distracted trying to catch a Snorlax at the park, we are giving away more and more of our lives to mysterious and complicated groups that want nothing but large profits and control. For a much more scathing review of this, see this blog post on surveillance and Pokémon.

Big Data and Algorithms

So what about the data? What is big data? First off, it’s all the rage right now, as data scientists, social scientists, policy makers, and business gurus scramble to understand how to use, abuse, and criticise such a thing. Big data sits at the intersection of two large disciplines—statistics and computer science. It is the collection and analysis of unthinkably large amounts of aggregated data, carried out largely by computer software and algorithms.

boyd and Crawford (2012) offer a much more precise definition. They assert that Big Data is a “cultural, technological, and scholarly phenomenon” that can be broken into three interconnected features:

  1. Technology – Computer science, large servers, and complicated algorithms.
  2. Analysis – Using large data-sets compiled from technological techniques to create social, political, cultural and legal claims.
  3. Mythology – Widespread belief of the power of Big Data to offer a superior knowledge that carries immense predictive value.

The big problem that remains is how to find, generate, and collect all of this data. In terms of social media and video games, much of this has to do with offering a “free” service to consumers, who take on the role of the “prosumer”. The prosumer is a social actor who both produces and consumes the commodity they are “paying” for.

In terms of social media (like Facebook), while users interact with each other, they produce affective or emotional data—through liking things, sharing things, and discussing things—that is then collected by algorithms and fed back into the system through targeted advertisements. The user is implicated in both the production and consumption of that data.
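To make that feedback loop concrete, here is a toy sketch of how affective signals might be tallied and fed back as ad targeting. The categories, weights, and function names are all assumptions for illustration, not any platform’s actual pipeline.

```python
# Toy sketch of the prosumer feedback loop: interactions become data,
# data becomes targeting. Weights, categories, and names are invented
# for illustration; this is not any platform's actual pipeline.
from collections import Counter

ENGAGEMENT_WEIGHTS = {"like": 1, "comment": 2, "share": 3}  # hypothetical

def profile_interests(interactions):
    """Aggregate a user's affective signals into weighted topic scores."""
    scores = Counter()
    for action, topic in interactions:
        scores[topic] += ENGAGEMENT_WEIGHTS.get(action, 0)
    return scores

def pick_ad(scores, ad_inventory):
    """Feed the profile back to the user as the highest-scoring ad topic."""
    topic, _ = scores.most_common(1)[0]
    return ad_inventory.get(topic, "generic ad")

interactions = [("like", "hiking"), ("share", "hiking"), ("comment", "cooking")]
ads = {"hiking": "trail gear ad", "cooking": "meal kit ad"}
print(pick_ad(profile_interests(interactions), ads))  # -> trail gear ad
```

The same clicks that constitute the user’s social life become the raw material for the advertisement that comes back, which is what makes the prosumer both producer and consumer of the data.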

The user is given free access to the social media platform; however, they pay for it by giving the platform a transparent window into their lives that is then monetized and sold for large profits. People’s reactions to this form of surveillance vary: some people offer scathing criticisms, others don’t give two shits, and some act a little more cautiously.

Why is this important for Pokémon Go? Because you trade your data and privacy for access to what Pokémon Go has to offer. It is incredibly clever of the think tanks at Niantic—using nostalgic Pokémania to usher users into consenting to ridiculous surveillance techniques.

It gets worse. As Ashley Feinberg from Gawker identified, the people responsible for Niantic have some shady connections to the international intelligence community, causing some in the surveillance studies field to fear that Pokémon might just be an international intelligence conspiracy (it sounds crazy—but it makes complete sense).

David Murakami Wood coined the concept of “vanishing surveillance”. This is a phenomenon, intentional and unintentional, in which the surveillance capacities of devices fade into the background, resulting in users not being aware, or at least not completely aware, that they are being watched. Pokémon Go, an innocent video game that is enabling new ways of being social in public, becomes an invisible surveillance device that may have international and interpersonal consequences. And it is the Pokémon themselves that allow the surveillance to vanish from sight and mind.

A Culture of Nostalgia


So what drives people to consent to all of this? What kinds of cultural patterns allow and shape us to an almost fanatical state when a Pokémon game is released?

The first factor within the culture of Pokémon is its appeal to nostalgia. Jared Miracle, in a blog post on The Geek Anthropologist, talks about the power of nostalgia. It taps into the childhoods of an entire generation—it even moves outside the obscure boundaries of gamer culture into the larger pop cultural context. It wasn’t only geeks that played Pokémon. It was just about everyone. This might provide an explanation for why so many people are wandering around with their cell phones in front of them (I saw them wandering around Queen’s campus today, while I was also wandering around).

However, it is not all about nostalgia. I believe that the nostalgia plays a role in a bigger process of the digital sublime and the mythologizing of the power of media.

What is a mythology? Vincent Mosco, in his book The Digital Sublime, defines myths as “stories that animate individuals and societies by providing paths to transcendence that lift people out of the banality of everyday life”. This is a form of reality that represents how people see the world from the perspective of everyday life.

Myths are also implicit in power. “’Myth’ is not merely an anthropological term that one might equate with human values. It is also a political term that inflects human values with ideology… Myths sustain themselves when they are embraced by power, as when legitimate figures… tell them and, in doing so, keep them alive”.

These myths, along with nostalgia for Pokémon paraphernalia, generate the digital sublime—a phenomenon that has us go head over heels for new technology. The mythologies that support it can be positive or negative.

Positive mythologies might sound a little like this: “Pokémon Go is allowing us to leave our homes and experience the world! We meet new people and we are empowered by new ways of interacting with each other. Hurrah!”.

Negative Mythologies are also important: “Pokémon Go is creating a generation of zombies. People are wasting their time catching those stupid Pokémon. They are blindly and dangerously wandering around, falling off cliffs, and invading private property. Damn those immature assholes”.

Both of these mythologies cross over each other to colour the experiences of those who play and those who watch.

We need to be careful of generating mythologies about the capacity of games to facilitate freedom, creativity, and sociality. We also need to be careful not to apply too much criticism. Such mythologies not only create a basic, overly simplistic way of understanding gaming, surveillance, and human culture; they also blind us to nuance and detail that may be important for a broad understanding.

So while some people dangerously block a highway to catch rare Pokémon, walk off cliffs because they aren’t paying attention, or disrespectfully attempt to catch Pokémon at Auschwitz, there are also people who are leaving their houses to engage with the world, using Pokémon to fight depression and other mental illnesses, and creating super cool maps of rare Pokémon spots.

Drawing things together—A Political Economy of Pokémon

Don’t be so paranoid.

Taking a techno-socio-cultural perspective allows us to engage with Pokémon Go with a nuanced understanding of its positive and negative characteristics. It is possible to look at how this media creates a complex ecosystem of social concerns, political controversies, and cultural engagements with nostalgia, mythologizing, and capitalist enterprise.

Pokémon Go is indeed enabling a ton of new ways of interacting and helping people with mental illness get out of their homes to experience the world—however, we can’t forget that it is also an advanced technology developed by those who have an interest in money and power.

Regardless of the benefits that are emerging from use of this application, there are still important questions about privacy and the collection and use of Big Data.

So Pokémon Go isn’t just enabling new ways of being social with the larger world. It is enabling new ways of engaging with issues of surveillance, neo-liberal capitalism, and social control through the least expected avenues.

After all of these problematics become more and more public—will we still trade off our freedom for entertainment?

Oligoptica: Why Surveillance Isn’t Perfect

We have all likely heard of the panopticon: an architectural design for a prison, thought up by Jeremy Bentham, that was supposed to maximize surveillance capacities so that prisoners always felt as if they were being watched, even when they weren’t. It consisted of prison cells arranged around a central guard tower that could watch every move of every prisoner, all the time. However, the guard tower is made to be opaque—so the prisoners can’t watch the guards.


In 1975, Foucault borrowed this idea to illustrate his concept of disciplinary power in one of his most famous books—Discipline and Punish. The basic idea behind Foucault’s use of the panopticon is that when people feel as if they are constantly being watched, they begin to self-discipline. The panopticon can refer to a prison, but it is meant to refer to society in general, or many of the institutions in a society. The more people feel that they are being watched, the better they act. This watching could be done by authorities, or even by your neighbors.

Though Foucault’s concept of disciplinary power is super important to many who study sociological theory—his example of the panopticon is overused and often misleading. It does not accurately represent the nature of surveillance in contemporary society.

The idea of the panopticon better characterizes a society of “total surveillance”. A completely, balls-to-the-walls, 1984, Big Brother-type (dys)utopia. Thankfully, there is currently no technology on earth that can allow for total surveillance.  We may be a society of ubiquitous surveillance, but not a society of total surveillance.

So how do we “move beyond the panopticon”, as so many social and cultural theorists have been calling for? There is one useful theoretical framework that builds on Foucault’s work: the concept of the oligopticon, proposed by Bruno Latour amid the incredibly critical arguments of Reassembling the Social.

Latour criticizes Foucault for drawing up a total surveillance “utopia” that is made of “total paranoia and total megalomania”.


He writes,

“We, however, are not looking for utopia but for places on earth that are fully assignable. Oligoptica are just those sites since they do exactly the opposite of panoptica: they see much too little to feed the megalomania of the inspector or the paranoia of the inspected, but what they see, they see well…”

 

Latour is staunchly reminding us that something that is everything is nothing at all. The panopticon is made to be too perfect. It is made to see all. It’s something that, as academics, we can’t possibly empirically record or understand. The oligopticon, by contrast, is the existence of countless scopes meant for watching. Countless surveillance devices. Only together do they see everything, and because they rarely communicate, it could hardly be called “total surveillance”.

Latour continues,

“From oligoptica, sturdy but extremely narrow views of the (connected) whole are made possible—as long as connections hold. Nothing it seems can threaten the absolutist gaze of the panoptica, and this is why they are loved so much by the sociologist who dream to occupy the center of Bentham’s prison: the tiniest bug can blind oligoptica”.

However, this does not entirely rule out the panopticon. As Kitchin and Dodge in their book Code/Space assert, the power of codes and algorithms may some day be able to unite many of the streams of the oligoptica to create a menacing panoptic machine. However, due to the unstable nature of the practice of scripting code, running code, and working hardware—it is liable to bugs, errors, and absolute mutiny. So don’t hold your breath.

The panopticon, for now, has its place—but it’s a more appropriate theme for a science fiction novel than a good work of social science or philosophy. It serves as a powerful reminder of where a ubiquitous surveillance society could lead us, but not as a very good characterization of surveillance today.

The Slender Man, Legends and Cultural Anxieties

Surveillance is being called ubiquitous by most of the leading scholars who study the social, political, and cultural ramifications of surveillance technology. A focus that I have been studying and thinking about is how surveillance is understood by everyday people living everyday lives.

I do this through the lens of Folklore, the study of everyday life. Or the study of the Folk (lay-person). This is obviously problematic—as such a term equates everyday life with peasantry. So for the remainder of this post I will use the term vernacular performance (i.e. everyday performance).

I’ve written about this work in the past. One of the ways that we demonstrate our cultural anxieties and fears is through the collective performance of legend cycles. In this case, I am speaking about the boogieman of the Internet—the Slender Man.


What is a legend?

Legends are repetitive and variant, meaning people tell them over and over again, and as they are told and spread they change form while keeping a central theme. Legends are a performance between storyteller and audience, meaning that people perform legend cycles: a teller typically recounts a story to a listener or audience. This includes digital legends. Finally, legends are not constructed by the teller, but by the community. The interaction between the storyteller and the audience constructs the story and allows it to spread. It is a collective process.

The Slender Man is a creature born of the performative interactions of a group of users on the forum Something Awful. The Slender Man is a tall, monstrous figure, one that resembles a tall man in a black suit. He has no face and extraordinarily long arms. He is sometimes depicted with many moving tentacles. All of this, along with his many disproportions, gives him a Lovecraftian appearance. An eldritch monstrosity.

Cultural Monsters

As Tina Marie Boyer (2013) asserts in terms of the Slender Man, “a monster is a cultural construct” (246). And as such, understanding the ‘anatomy’ of a monster sheds light on the problems people face in their day-to-day existence.


What is the anatomy of the Slender Man? I decided to do some ‘fieldwork’—exploring many of the blogs/vlogs that contributed to its legendary constitution. I found three major themes: Surveillance, Social Control, and Secret Agencies. This returns us to the topic of this blog post: The Slender Man is a vernacular performance that demonstrates our collective anxieties of a culture that is under the constant gaze of massive and complicated networks of surveillance.

Surveillance

The Slender Man is known to watch its prey. It is rarely confrontational, though it seems to relish making its presence known. One scene that sticks out to me is from the YouTube series Marble Hornets—the main protagonist, after becoming increasingly paranoid about the faceless man in a business suit following him, began to leave his camera running while he slept—only to discover that the Slender Man watches from a crack in his door while he sleeps. The Slender Man watches, seemingly from everywhere—but even when it is seen, the Slender Man has no eyes to watch from. It is as if it sees everything from nowhere. The Slender Man appears and vanishes, seemingly at will, haunting victims with little to no motive. The Slender Man represents the phenomenon of ubiquitous surveillance in the virtual world. A world where anonymity and pseudonymity are quickly disappearing. A world where only the experts understand what to surveil and how to read the data such surveillance produces. And a world haunted by faceless watchers.

Social Control


The Slender Man also represents themes of social control. The most obvious instance of this is the ‘proxies’, otherwise known as the ‘hallowed’. These are people who have been overcome by the Slender Man’s will. In many instances, the telling of a Slender Man legend ends with the main protagonists going mad and disappearing. They are either killed by the Slender Man (or its minions), disappear from time and space and sometimes memory, or are turned into a proxy. This means they lose their minds and begin to do the bidding of the Slender Man. In the blog ‘Lost Within the Green Sky’, the main protagonist Danny describes it as a form of indoctrination that slowly drains the will from its victims. Even as a proxy, once their usefulness dries up, they are often killed. This theme is not surprising, as it emerges from a cultural context known for its pervasive ability to control through silent software mediators.

Secret Agencies

The Slender Man is also known as The Operator (signified by a circle with an X through it). This name, along with the black suit it wears, makes the Slender Man a clear reference to secret agents—those organizations that haunt the Internet, forcing those who wish to remain anonymous into the depths of TOR browsers and VPN applications. The Slender Man is representative of the NSA, FBI, CIA, CSIS, KGB, and other notorious spy agencies operating with little oversight and behind a secretive veil. They are just as faceless as the Slender Man. And just as cryptic. Few understand the significance of their presence. And those who come under their haunting gaze have quite a lot to fear.

More Research


Folklore is a small branch of the social sciences. There are few people who work beneath its flag. And fewer of those people study contemporary, digital folklore. However, this does not diminish its importance. Folklore offers us a lens to peer into how everyday people interpret the world through vernacular expression. It is an essential dimension of the surveillance studies canon. An understanding of how people interpret surveillance is essential if we are ever going to take action to educate people about its dangers.

‘Software Mediated Intimacy’ in Tinder: have we ever been in love?

Tinder has become an almost ubiquitous dating app, one that has generated quite a lot of controversy even as it sits on most of our mobile phones.

While users swipe left and right searching for people they might be interested in, questions of legitimacy and authenticity in love, intimacy, and dating arise. I was motivated to write this blog post after a debate with a great friend of mine about the authenticity of love on Tinder.

Questions arose for me: If love is a social/cultural construct, is it authentic? What does Tinder do to change how love manifests itself? What has changed that we can’t see right away?

These tensions hang about within an atmosphere of heavy technical mediation that silently organizes, classifies, and structures how users interact with each other.

In an article for Fast Company, Austin Carr had the wonderful opportunity to interview Tinder CEO Sean Rad about the algorithm that measures user desirability and matches users to their equivalents. This is the ‘Elo score’ program.

Tinder users do not have access to their desirability rating; however, this rating does predetermine who gets to see whom. It brings together a large and heterogeneous set of data on users in order to measure their desirability. Users only have access to viewing profiles that share a similar score.
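To make the general idea concrete, here is a minimal sketch of an Elo-style update, the rating family the name ‘Elo score’ points to. The K-factor, the assumption that a right-swipe counts as a ‘win’ for the profile being judged, and the matching window are all illustrative guesses; Tinder’s actual algorithm is a trade secret and is not reproduced here.

```python
# Minimal sketch of an Elo-style desirability score. The K-factor, the
# treatment of a right-swipe as a "win" for the swiped profile, and the
# matching window are assumptions for illustration; Tinder's real
# algorithm is a trade secret.

K = 32  # hypothetical sensitivity of each rating update

def expected(rating_a, rating_b):
    """Probability, under Elo, that profile A 'wins' against profile B."""
    return 1 / (1 + 10 ** ((rating_b - rating_a) / 400))

def update(swiped, swiper, liked):
    """Adjust the swiped profile's rating after a single swipe.

    A right-swipe (liked=True) from a highly rated swiper raises the
    swiped profile's score more than one from a low-rated swiper.
    """
    outcome = 1.0 if liked else 0.0
    return swiped + K * (outcome - expected(swiped, swiper))

def shown_to_each_other(rating_a, rating_b, window=100):
    """Only surface profiles whose scores sit within a similar band."""
    return abs(rating_a - rating_b) <= window

alice, bob = 1500, 1700
alice = update(alice, bob, liked=True)                # a like from a higher-rated user
print(round(alice), shown_to_each_other(alice, bob))  # -> 1524 False
```

Even in this toy version, the software quietly decides which pairs of people will ever see each other—which is precisely the kind of mediation at stake in what follows.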

Algorithms are active mediators in the construction of networks of intimacy. I would like to call this ‘software mediated intimacy’. Before we explore this concept we need to understand the basics and histories of two concepts: actor-networks and intimacy.

On actors, networks and the silent power of algorithms

Actor-networks are a set of conceptual tools originally devised and made popular by science and technology studies scholars Latour, Callon, and Law.

The most fundamental tenet of this school of thought, sometimes referred to as actor-network theory (ANT), is the principle of generalized symmetry. This concept holds that humans and nonhumans (atoms, systems, computers, and code) are equal in their ability to shape the intentions of actors in networks of associations.

An actor, which can be human and/or nonhuman, is made to act.

This act is always done in relation to other acts.

So actors, together acting, create actor-networks. Humongous, unpredictable webs of associations.

This model of understanding society, as complex, overlapping networks of humans and nonhumans, is very useful for understanding how actors shape each other over social media platforms.

There are countless nonhumans working in sprawling actor-networks that shape, structure, and mediate how we interact with others online. Most of this remains hidden within our electronic devices. Strings of code and wires affect us in ways that are sometimes more meaningful than our interactions with humans.

Some ANT scholars call this the black box: a vast actor-network becomes so stable that it is seen to act as one, becoming a node in other, more expansive actor-networks. Think about how many black boxes exist in the many components of your mobile devices.

I would go so far as to say that most Tinder users are unaware of how much their romances are mediated by algorithms. These algorithms are typically trade secrets and the musings of science and technology studies scholars, not quite specters in the imaginations of users.

On the social construction of love and intimacy

Love is not universal; it is a complex set of changing associations and psycho-socio-technical feelings that are tied to space, time, and historical circumstance.

Anthony Giddens, in his book The Transformation of Intimacy, demonstrates the social and cultural influences that actively construct how we understand, perceive, and pursue love and intimacy.

In the pre-modern past, relationships were forged through arranged marriages that were mainly tied to economic priorities. This is something that Foucault, in his work The History of Sexuality, called the deployment of alliance: the creation of social bonds and kinship alliances that shaped the distribution of wealth and ensured group and individual survival.

In a complicated shift that I am unable to detail here, the deployment of sexuality emerged and changed the very nature of the family. It also led to the love and intimacy that we know today: a social bond that is not necessarily tied to the family unit.

Giddens follows this line: as Western societies began to experience the characteristics of modernity, intimacy and love as we know them began to emerge alongside it. Romance emerged when the family institution of premodernity collapsed.

Why am I talking about this? I am drawing on the work of Giddens and Foucault to illustrate that love and intimacy are constantly shifting social and cultural constructs. Every generation experiences love and intimacy differently.

This is probably the reason why your parents scowl about your one night stands and the fact that you haven’t given them grandchildren yet.

Love and intimacy are going through a massive shift right now.

‘Software mediated intimacy’—algorithmic love

In his essay A Collective of Humans and Nonhumans, Latour speaks about our progression into modernity as a deepening intimacy between humans and nonhumans. An integration between humans and their technology. A folding of all sorts of intentions and agencies.

In this case, as humans become further enveloped in their associations with mobile devices and ubiquitous social media, most of our interactions become mediated through strings of code, complex algorithms, and digital hardware.

In the case of Tinder—the algorithms are mediating and structuring how we meet and communicate with people who may become love interests. Of course, these algorithms don’t exactly control such love. Human actors enter into associations with the Tinder actor-network with all kinds of intentions and expectations.

Some actors want a fling, a one night stand. Others might want a long-term relationship. Others may even just want attention or friendship.

So even though Tinder is sorting who gets to talk to whom, a heterogeneous and constantly shifting cluster of intentions shapes how Tinder-based relationships turn out.

And it’s not perfect. Love and intimacy falter and wilt just as much as they blossom.

But that’s not my point. ‘Software mediated intimacy’, like all of the forms before it, is another form of socially and culturally curated love and intimacy. So authenticity is not the primary question here. All forms of love emerge from some degree of construction or performance. There is no universal standard.

However, what is different, and what is often not seen, is the degree to which love can be engineered. Computer scientists and engineers, at the behest of large (or small) corporations, embed particular logics and intentions into the algorithms they construct.
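
As a purely hypothetical illustration of what ‘embedding intentions’ can look like in practice, the short Python sketch below ranks candidate profiles with a weighting that the platform, not its users, decides on. The visibility_score function, its weights, and the paid_boost parameter are all invented for this example; nothing here reflects Tinder’s or anyone else’s actual code.

# Invented example of a designer-chosen weighting; not any real platform's code.

def visibility_score(similarity, activity, paid_boost):
    """Rank a candidate profile for someone's queue."""
    # These weights are business decisions made by the platform, not the users.
    return 0.5 * similarity + 0.3 * activity + 0.2 * paid_boost

queue = sorted(
    [("free_user", visibility_score(0.9, 0.8, 0.0)),
     ("paying_user", visibility_score(0.7, 0.8, 1.0))],
    key=lambda pair: pair[1],
    reverse=True,
)
# The paying user outranks a closer match because of a weight the platform chose.
print(queue)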

As Dodge and Kitchin remind us, in their book Code/Space, this is not a perfect product of social control—but a constant problem to be shaped. Even so, it is disconcerting that so many users are being shaped by silent human and nonhuman mediators.

Tinder’s algorithm and its ‘Elo score’ are trade secrets. Not opened up to the public. Or the academy. So I am left scrambling for an answer that only raises more questions.

‘Software mediated intimacy’ can offer us a novel way to construct relationships, whether they are temporary or sustained. It is a form of social interaction that is just as authentic as prior forms of love and intimacy. However, the code, the algorithms, and the complexities of computer science and statistics make it overwhelmingly easy for corporations to shape users based on private interests.

We must learn how these processes work and how they might be democratized, so that the user may benefit as much as the capitalist who produces and hoards such software. We must embrace ‘software mediated intimacy’ in order to learn about it and master how it works, so we might sidestep the potential for precarious exploitation seeping through our engineered social bonds.

Note: This blog post will be followed by a more in-depth analysis of this understanding of Intimacy. I am seeking to expand this into a larger, theoretical project. Any comments, criticism, or thoughts would be incredibly useful.

Affinity and Algorithm: A sociological review of Robert Charles Wilson’s The Affinities

Preface: Read the book first 😉

The Affinities by Robert Charles Wilson is a work of science/speculative fiction about a man named John Fisk, who gets caught up in the whirlwind of a new, emerging social order.

The affinities, a set of twenty-two technologically and scientifically engineered social groups, were created by Dr. Meir Klein, a social psychologist who discovered the ‘socionome’ through a fictional scientific field called ‘teleodynamics’.

Klein names twenty-two affinity groups, and they are all reserved for those who are preselected to belong through rigorous psycho-socio-technical testing.

Neurological, psychological, and sociological data are collected from social actors and entered into a computer program whose advanced algorithm sorts candidates into different affinity groups based on essential qualities of the individual that the actor has no control over.
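
Since ‘teleodynamics’ and the ‘socionome’ are fictional, the Python sketch below is only a loose analogy for what such a sorting program might look like: a nearest-centroid rule that assigns a person’s trait vector to the closest affinity or turns them away as an outsider. Apart from Tau, the group names, trait dimensions, numbers, and cutoff are all placeholders I invented for illustration.

# 'Teleodynamics' is fiction; this is only a loose nearest-centroid analogy.
import math

# Invented trait profiles (e.g. sociability, risk tolerance, empathy).
AFFINITY_CENTROIDS = {
    "Tau": (0.8, 0.6, 0.9),
    "Het": (0.3, 0.9, 0.4),
    "Delt": (0.5, 0.2, 0.7),
}

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def assign(traits, centroids=AFFINITY_CENTROIDS, cutoff=0.5):
    """Return the closest affinity, or None if nothing fits (an 'outsider')."""
    name, dist = min(((n, distance(traits, c)) for n, c in centroids.items()),
                     key=lambda pair: pair[1])
    return name if dist <= cutoff else None

print(assign((0.75, 0.55, 0.85)))  # -> Tau
print(assign((0.0, 0.0, 0.0)))     # -> None: turned away at the door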


John Fisk is chosen for Tau—one of the largest affinities.

Just as a preface to this review: Wilson has embedded many layers of academic and creative work into this novel. I am only able to focus on one particular dimension: algorithmically engineered social groups.

Once a person joins an affinity, they get drawn in by a psycho-social force that creates bonds stronger than family and a fierce, wolf-pack loyalty.

The book begins by following John, who is suffering from some sort of ambient existential crisis. He does not belong to his family, his school, or his friendships. He lacks an essential quality that most humans long for: belongingness.

Once he joins Tau and attends his first meeting, he is immediately hurled into a cult-like loving embrace with the other Tau members. A sense of belonging emerges so strong that I felt myself becoming overwhelmed by nostalgia for something I’ve never experienced.

Here is the central problem: how do you engineer social relationships to address one of the most prevalent symptoms of late modernity, loneliness and anomie?


Wilson’s story explores the power of computer algorithms and social technology to augment and engineer human interaction. Such social engineering has the potential to create a perfect group dynamic powerful enough to accomplish anything, including bringing about a sense of purpose that dissolves the problem of social anomie.

Though ‘teleodynamics’ and the ‘socionome’ are works of fiction, similar algorithms already play a large role in our lives.

These are the algorithms that organize and structure how we use most social media applications. Facebook, Instagram, Twitter and Tinder utilize complex algorithms that mediate, and sometimes exert control, over our interactions with each other.

This phenomenon is ubiquitous. It’s everywhere, and it’s usually disguised and made invisible inside complicated electronic devices.

It’s easy to get bedazzled by the spectacle of affinity groups. The beginning of the story has us follow lonely John Fisk as he gets carried away by the social bonds created in Tau. He is no longer lonely. He has found a sense of belongingness that his broken family was unable to give him.

However, as the story progresses we get to see snippets of a dystopia burning away at the overall spectacle. Not everyone can get into an Affinity. After testing, many are turned away as their psycho-social profiles do not match any of the 22 affinity groups.


They are rendered outsiders. In a move reminiscent of eugenics, these outsiders are put into a seriously disadvantaged position as affinity groups ignore their concerns.

Twenty-two utopias are formed, and they close the door on anyone who doesn’t belong. Furthermore, the hyper-loyalty begins to create inter-affinity tensions as affinity groups push against each other for supremacy.

We are left with a sense of anomie and a lack of belongingness, the very issues that the affinity groups constructed by Klein were supposed to alleviate.

There are many layers of inter-personal and inter-group conflict that emerge out of these tensions. The affinities fight secret battles against each other, not so different from gangs; the politicians of the state oppose the fiery emergence of affinity-based governance; and those who don’t belong assert their right to belong.

The affinities change everything from the most micro social interactions to the most macro global politics.

However, once this new form of algorithmically engineered social group takes root in social and economic infrastructures, it is here to stay. Everything is different. Eventually the technology for testing people for affinity groups becomes affordable and public.

Coders and social scientists begin to experiment with alternative affinities, or at the very least with innovative ways of using algorithms to structure relationships. And of course, like everything on the Internet, an open-source version emerges to deal with all those who don’t belong, everyone left behind.

This project is led by an organization called New Socionome, a decentralized, open-source activist group trying to create access to affinities for everyone.

In the end, the social landscape is changed for good and it is uncertain what the future will look like. This book is a useful tool and a powerful story that engages with the developing tensions surrounding emerging algorithmic relationships that are silently shaping the lives of millions.


Wilson’s work also explores the existential issues of belongingness in a society where family bonds are becoming fractured and falling into a state of anomie. This is an existential angst that can be alleviated with technological assistance, though he is careful to portray that such assistance is not a perfect, utopic solution.

Science and speculative fiction have an immense power to explore issues between humans and technology. The Affinities does a great job at this. Oftentimes, algorithms are silent mediators of our communication, so silent that no one seems to notice their prevalence. It’s stories like this that draw attention to a problem that has existed for over a decade.