“Rhythm is a dancer
It’s a soul companion
You can feel it everywhere
Lift your hands and voices
Free your mind and join us
You can feel it in the air
Let it control you, hold you, mold you
Not the old, the new, touch it, taste it
Free your soul, let it invade you
Got to be what you wanna
If the groove don’t get you, the rhyme flow’s gonna
I’m serious as cancer when I say
Rhythm is a dancer”
Rhythm is not found only in bad (but so bad it’s good!) 1990s electronic dance music. Rhythm can help us think about what happens when media companies reconfigure our environments to influence how we behave and, consequently, how we think. I use rhythmedia as a way to examine how people and territories are reconfigured, organised and disciplined in a recursive feedback loop. Rhythmedia is the way media practitioners conduct repetitious training on people by orchestrating the way they live in mediated territories. These practitioners direct how architectures change according to the knowledge they gain from (processed) listening to people’s behavior across multiple spaces. Both spatial and temporal orderings are in constant processes of (re)production, influenced by the inputs (knowledge) that processed listening provides. Such repetitive production of time and digital space is constrained by the media (measuring devices) and by the intentions of the media practitioners. Once this knowledge is produced (collected and recorded according to specific measuring units), it is ordered (categorized, filtered and presented) in a particular way, and that is where the concept of rhythm comes into play.
Again, it is important to emphasize that I do not mean only algorithmic ordering of time and space, but the ordering of any mediated territory. In the first story, about Bell Telephone, I show how the Noise Abatement Commission wanted to restructure New York City in a more efficient way. That meant enabling and legitimizing specific rhythms, such as cars and big retail stores, while making activities like street commerce illegal.
If rhythmedia is how people and things are ordered in a particular time and space, then personalization is a particular type of rhythmedia that social media companies push to us as the preferred way of living online. Taking Netflix as an example, the service says that among the important things it calculates to provide personalization are “the time of day you watch” and “how long you watch”. Like many other online publishers, advertising companies and services, Netflix conducts hundreds of experiments on us every day, trying to find the best type of rhythmedia:
Traditionally, we collect a batch of data on how our members use the service. Then we run a new machine learning algorithm on this batch of data. Next we test this new algorithm against the current production system through an A/B test. An A/B test helps us see if the new algorithm is better than our current production system by trying it out on a random subset of members. Members in group A get the current production experience while members in group B get the new algorithm. If members in group B have higher engagement with Netflix, then we roll out the new algorithm to the entire member population. (Netflix Technology Blog)
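The procedure Netflix describes can be sketched in a few lines of code. This is an illustrative sketch only: the function names, the 50/50 random split and the engagement-scoring functions are my assumptions, not Netflix’s actual implementation.

```python
import random

def ab_test(members, engagement_prod, engagement_new, seed=42):
    """Sketch of the A/B test Netflix describes: randomly split members
    into group A (current production) and group B (new algorithm), then
    compare mean engagement to decide which system wins. All names and
    the 50/50 split are hypothetical."""
    rng = random.Random(seed)
    group_a, group_b = [], []
    for member in members:
        (group_a if rng.random() < 0.5 else group_b).append(member)
    # Mean engagement per group, using the supplied scoring functions
    mean_a = sum(engagement_prod(m) for m in group_a) / len(group_a)
    mean_b = sum(engagement_new(m) for m in group_b) / len(group_b)
    # Roll out the new algorithm only if group B engaged more
    return "new" if mean_b > mean_a else "production"
```

The point of the sketch is that the “winner” is defined entirely by whatever the company chooses to count as engagement.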
Similarly, explaining to journalists how its algorithm works, Instagram says that several time-related signals determine what you see on the ‘feed’: recency (how long ago a post was shared), frequency (how often you open the app) and usage (how long you spend on the app). Time is important, but it is not neutral.
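These three signals can be read as a weighted scoring function. The sketch below is hypothetical: the weights, caps and normalisation are invented for illustration and do not reflect Instagram’s actual ranking.

```python
def feed_score(post_age_hours, opens_per_day, minutes_per_session,
               w_recency=0.5, w_frequency=0.3, w_usage=0.2):
    """Toy ranking score built from the three time signals Instagram
    names: recency of the post, how often the user opens the app, and
    how long their sessions last. All weights are invented."""
    recency = 1.0 / (1.0 + post_age_hours)     # newer posts score higher
    frequency = min(opens_per_day, 10) / 10    # capped and normalised to 0..1
    usage = min(minutes_per_session, 60) / 60  # capped and normalised to 0..1
    return w_recency * recency + w_frequency * frequency + w_usage * usage
```

Even in this toy version, the choice of weights encodes a politics of time: whoever sets them decides which rhythms of use are rewarded.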
The Facebook Immune System algorithm is a great way to show how rhythmedia works. Facebook listens to people’s behaviors, whether they are considered positive or negative: everything counts. In this way, the company establishes which types of rhythms can harm its business and should therefore be decreased, removed or filtered out as possible ways of behaving on the platform. The platform then orchestrates people into the desired rhythm, so although all actions count, only the behaviors that are valuable (such as more emotional or repetitive ones) will be ordered, whilst the ‘negative’ ones (such as creepers) will be filtered out. In this way, the company establishes what counts as engagement and which type of sociality is worth more.
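The listening-then-filtering logic described above can be illustrated with a short sketch. The scoring function and the zero threshold are my assumptions; the point is only that every event is recorded, yet only the “valuable” ones survive the ordering.

```python
def order_behaviors(events, value):
    """Rhythmedia-style ordering sketch: every event counts (is scored),
    but only those the platform values are kept, ranked most valuable
    first; 'negative' rhythms are filtered out. `value` is a hypothetical
    scoring function supplied by the platform."""
    scored = [(value(event), event) for event in events]    # everything counts
    kept = [event for score, event in scored if score > 0]  # negatives filtered out
    kept.sort(key=value, reverse=True)                      # most valuable first
    return kept
```

Here the filtering and the ranking are done by the same scoring function: to decide what is removed is already to decide what is amplified.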
We can see a more recent example with TikTok, where Joseph Cox, a journalist from Motherboard, revealed TikTok’s moderation guidelines. Here TikTok wanted to remove people and environments that deviate from the norm it had established: people deemed fat, ugly, old or wrinkled, and places such as slums, were all categorized as antisocial and filtered out of the platform.