This distortion story focuses on Facebook’s strategies over the last decade to define, construct and manage deviant behaviors in order to reduce and eliminate them from the platform. Facebook enacts its power by engineering the social through four filtering mechanisms: architecture, algorithms, hidden workers and users – all trained to behave in the way Facebook configures as sociality. The two non-human filtering mechanisms show how specific features of the architecture and algorithms harmonize people’s behavior and their notions of time and space. The two human filtering mechanisms show how both employees and users are trained to perform repetitive actions meant to tune the algorithms. Together, these filters create a database containing all knowledge about users inside and outside the platform, and then render only what Facebook considers ‘social’ as possible options for living.
While some sections changed substantially, others changed less; the main work was putting all of these pieces together. The overwhelming amount of information Facebook has made available can give us some indication of how it operates. I then realized that analyzing the research its in-house researchers conduct could give me insight into their rationale, the tools they have available and what they focus on. I developed a method I call platform reverse engineering. I use ‘reverse engineering’ of software metaphorically: I read Facebook’s research articles to reveal software features, measuring and storing techniques, and design rationale (for example, how showing specific metrics is meant to drive more engagement, as Benjamin Grosser’s artwork below shows) – to analyze and identify the platform’s components and functions. I analyze these articles by searching for specific information that can reveal how the platform develops its functions, including its algorithms and interface design. Facebook’s research programme started in 2009, and its archive consists of over 500 articles (https://research.facebook.com/publications/). This archive can also shed light on the motives, interests and rationale behind the company’s actions.
One of the “good” things about the Cambridge Analytica scandal, first revealed in 2015 by The Guardian, was that it helped people understand the power Facebook has to shape sociality. The bad thing was that people associated the problem only with politics with a big P – the politics of governments and policy-making. What I try to show in this chapter, as others such as the journalist Julia Angwin have also emphasized, is that this applies to politics with a small p: our everyday lives.
In the new work I’m doing these days I tune in more closely to the ideology of personalization that Facebook constantly tries to sell us. Personalization is a complex interplay shaped by people’s individual online behavior and their social networks, attuned to both micro and macro behaviors. If we take anything from the Cambridge Analytica scandal, it is that individuals’ friends and networks matter – personalization depends on social networks. As Theresa Hong, who worked for Trump’s digital campaign Project Alamo, said in the Secrets of Silicon Valley documentary, Cambridge Analytica targeted people according to what they call a ‘universe’: when people last voted, who they voted for, what type of car they drive and, importantly, what kinds of things they look at on the internet. Such companies also examine emotional and personality traits, such as whether people are introverts, fearful or positive. This is precisely why I developed the concept of rhythmedia: as many companies show, they understand people as rhythms and orchestrate them accordingly.