complain, profit, repeat: how your tweets spawn political bots and line zukerbergs pockets
a social analysis of political bots & the affective economy
By now, you've probably seen programmers and hardware specialists explain how Twitter bots work. We are moving past the age of bot farms in countries with low minimum wages and few labour protections towards an age of AI-powered bots. An age of using large language models to imitate people to the political ends of their mysterious masters.
While plenty of hardware explanations can be found, you have to dig a little to develop a social analysis of this phenomenon—there is more to it than meets the eye. This is too often the case with my work in ethical technology; focusing too much on hardware explanations distracts people from some of the biggest issues with the technology.
So - how does your unpaid labour lead to the success of these bots? And what can ideas of deceptive design tell us about the potential for regulating these bots? Software and design are not just technical; they are deeply ideological. Data, too, is imbued with ideology. The simple answer of' bot farms chatGPT rich Russians' does not fully capture the ideological underpinnings of these bots.
the economy of affect
I have recommended Emily Bootle's "This is Not Who I Am" as an interesting exploration of performative authenticity on social media. The book calls into question a cultural value of honesty that demands social media users to share their most private thoughts online in the name of "authenticity". While the book is an excellent analysis of why and how we feel the need to be "authentic" online, it ignores the profit motives behind performative authenticity.
So, how does performative authenticity make people money? Through data. By creating vast amounts of data, we create the products social media companies sell. Our affective labour creates this data, which then makes profit. Typically, we think of this in relation to targeted advertising - but there are other uses. Our social media posts about politics and the mundanity of life create the data that trains these political bots. They eat up this data and regurgitate it whichever way their masters wish to develop and twist the political narrative.
This concept gives rise to what can be termed 'the economy of affect '. In this economy, everyone except us profits from our emotions and how we express them.
You create a social media post about something going wrong in your life. Someone comments that the same thing happened to them. The algorithm files this into something that can be used for targeted advertising. A couple of hours later, you get an advert for a product that seems to solve this problem. A couple of hours after that, a political bot program gobbles up your data and replaces a couple of words here and there, then convinces everyone that it is a real person supporting a Trump presidency.
Your phone is listening to you! Sort of. It's important to make a couple of disclaimers here - thinking that your phone is literally listening to you is conspiracy thinking and should not be tolerated. Naomi Klein's recent book "Doppelgänger" highlights the importance of constantly rejecting this thinking, particularly on the left. She explores this through the descent into madness of Naomi Wolf (The Beauty Myth), for whom she is frequently mistaken, and how these slight exaggerations can slowly snowball over time. As such, when discussing issues like this, I encourage everyone to be careful about their language and not oversimplify.
software engineering as social engineering
This is where we come to the idea of software engineering as social engineering.
Big tech's worker bees create the technology that encourages us to share, share, share. Then we share, and they sell the data to whichever nefarious actor is trying to rig elections worldwide. They use this data to feed the social bots and allow the highest bidder to steer the political conversation toward bringing them more money and power. This cycle perpetuates a dangerous and unethical system where our personal information is exploited for the benefit of a few at the expense of the many.
It's crucial to understand that software engineering isn't just about creating tools; it's about shaping society. The intentional use of software to influence our social and political lives is a stark reality. Political bots, for instance, are designed to widen the political gap and create a distorted image of the 'political Other ', perpetuating division and misrepresentation.
This manipulation of technology for power and profit not only impacts our political landscape but also undermines the trust and integrity of our social fabric. We must critically examine the role of technology in our lives and demand transparency and ethical practices from those who create and control it. We must strive for a technological landscape that fosters genuine communication, collaboration, and progress rather than perpetuating division and deceit.
The software engineering that enables this isn't just engineering software - it is engineering the social.
deceptive design
First of all - the design. Since these bots don't have a "designed interface", as we typically think of it, it is easy to believe that they aren't designed - just programmed. But a lot more design goes into AI bots than people realise.
When designing a chatbot, for example, you essentially go through a character design process, which you then hand off to the engineers so they can base the machine learning programs on your "character." Political bots are no different, but the design is much more sinister than designing a character for an online chatbot. Deceptive Design is any design which is explicitly created to deceive users.
Twitter bots are expressly designed to be secretive—to make you think they are real people. This is where Deceptive Design comes in—the bots are designed as I describe, but the character created for them is deceptive—one trying to trick you into thinking it is a real person.
"Deceptive design patterns" are considered as interfaces and user journeys implemented on social media platforms that attempt to influence users into making unintended, unwilling and potentially harmful decisions, often toward a decision that is against the users' best interests and in favour of the social media platforms interests, regarding the processing of their personal data. Deceptive design patterns aim to influence users' behaviour and can hinder their ability to protect their data and make conscious choices effectively. Data protection authorities are responsible for sanctioning the use of deceptive design patterns if these breach GDPR requirements" — EU GUIDELINES.
Most countries have laws against this kind of design. However, it is typically thought of in terms of typical persuasive design, like how difficult it can be to remove yourself from a subscription service. But these bots are no different! They are, however, slightly harder to spot. And social media companies don't have much incentive to do anything about this - these bots make their money, increasing rage bait and engagement. This is particularly true when the owners of these social media companies are entirely off their rockets (cough, Elon Musk, cough).
don't forget about the planet
Of course, in our current climate, discussing this without considering its impact on the environment would be irresponsible. It's easy to think about "the cloud" in which all this data is held as an actual cloud - something immaterial. But it isn't - it's all hosted on massive server farms that use a ridiculous amount of electricity. So, our stolen data is not only destroying sensible political conversation but also the very world in which we live.
Book Recommendations & Other Links
Reverse Engineering Social Media - Robert Gehl
This Is Not Who I Am: Our Authenticity Obsession - Emily Bootle
Doppelganger: A Trip Into the Mirror World - Naomi Klien
What if we went beyond “demanding transparency from those who control” the platforms, and instead create our own systems? After all, facebook will never become a benevolent company: it is only operated for shareholder profit. We should really be investing in alternatives like mastodon that are open source, transparent, and not for profit