in defence of naps
a ux perspective on screen time control apps & the reinforcement of regimes of self-control, and an exploration of re-evaluating how we rest as an alternative
Thank you for 100 subscribers! This newsletter has already brought me so much, and the fact that 100 people want to read my essays is crazy to me.
Also, warning: this essay gets quite leftist quite quickly. It does, however, present a solution you can enact by yourself, with only a spare afternoon and a pen and paper.
If you like what I write, please consider becoming a paid subscriber, so I have more time to spend on this newsletter! I work full time and freelance alongside this Substack, so that doesn’t leave much time for writing, but I’d love to change that.
Apps that limit screen time create and reinforce a regime of self-regulation that isn’t good for anyone. They stem from a desire for self-regulation rather than collective change. They serve an emotional purpose: making you feel like you are doing something about the parts of your life that feel out of control, without actually doing anything about them.
They aren’t meant to tackle your relationship with your devices but to make you feel good about yourself. According to society, your ability to resist those pesky little emotions that drive you to check your phone constantly is a virtue that places you above all others. But what if this isn’t the best solution? What if it’s a solution rooted in societally conditioned beliefs about the virtue of self-regulation?
The people who create these apps don’t want to “fix” your relationship with social media, because if they did, you wouldn’t need them and wouldn’t be paying them money. Screen time apps have zero motivation to help you achieve long-term behaviour change - but they do have every motivation to create an emotional reliance on them. They do this by making you feel good about yourself, reinforcing beliefs about the virtues of self-control.
Self-regulation under neoliberal capitalism has a long and well-tested history, and in computer science it is often represented through an overconfident belief in the computational theory of mind, which I have spoken about before. This is the (false) idea that our brains are essentially input-output machines that work like computers. If you believe this theory, then the idea of manipulating ourselves into spending less time on social media through yet more digital tools makes sense. But when you examine how these apps work and what their broader impacts are, you come to question this. What if:
These apps have more negative impacts than positive ones and do not induce the long-term behaviour change they claim to.
You don’t need to be miserable. There are better solutions.
This isn’t to say that we don’t have problematic relationships with our devices, but to point out that excessive self-regulation keeps us distracted from the real problems and can limit the possibility of collective change & a better digital future. Surely someone can come up with something better than the endless self-regulation that is thrust at us from all directions? Surely we can choose to be optimistic about technology, humanity and our collective future?
a ux analysis
And let’s be honest - they don’t even do a good job of keeping you off your phone. We’ve all downloaded one of these apps only to spend the day thinking about how much social media time we have left. The UX of these screen time apps tends to share a set of standard features: personalisation, feedback loops, non-intrusive reminders, and gamification. Analysing these from a leftist perspective, alongside an evidence-based look at how long-term behaviour change is actually induced, explains why they are ineffective.
Screen time apps often offer lots of personalisation. Ease of use and customisation provide a sense of individual empowerment, but they do so in the service of a form of self-surveillance dictated by capitalist demands for efficiency and productivity. Personalisation is also one of the mechanisms that lets us ignore the illusion of choice presented to us through marketing and other capitalist propaganda. People have come to expect personalisation - and, UX-wise, it is hard to get right. Screen time apps therefore tend to over-hype their personalisation features, so not only do they create a false sense of individual empowerment, they often don’t even work.
Feedback and alerts create feedback loops. Feedback loops are argued to create behaviour change by providing people with real-time reminders about their behaviour - in the case of screen time apps, reminders about time spent on social media. This, however, can reinforce the notion that time spent on leisure, social interaction, or digital engagement must be carefully regulated. Users are constantly reminded to optimise their behaviour, measured against standards of productivity that serve capitalist interests rather than genuine human needs. Non-intrusive reminders present an interesting contradiction: they aim to regulate behaviour without feeling oppressive. While they don’t directly interrupt, they still nudge individuals toward self-regulation, reinforcing the idea that users should continuously monitor and adjust their behaviour.
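To make the mechanism concrete, here is a minimal sketch of the kind of feedback loop described above - entirely hypothetical, not taken from any real app: usage is sampled, compared against a self-imposed daily limit, and a gentle nudge fires once a threshold is crossed, rather than anything being blocked.

```python
# Hypothetical sketch of a screen-time feedback loop: sample usage, compare it
# against a self-imposed daily limit, and emit a "non-intrusive" nudge once a
# threshold is crossed. Names and thresholds are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class UsageTracker:
    daily_limit_minutes: int
    minutes_used: float = 0.0
    nudged: bool = False

    def record(self, minutes: float) -> str | None:
        """Add newly observed screen time and return a nudge message, if due."""
        self.minutes_used += minutes
        fraction = self.minutes_used / self.daily_limit_minutes
        # "Non-intrusive": fire once, at 80% of the limit, instead of blocking.
        if fraction >= 0.8 and not self.nudged:
            self.nudged = True
            return (f"You've used {self.minutes_used:.0f} of your "
                    f"{self.daily_limit_minutes} minutes today.")
        return None


tracker = UsageTracker(daily_limit_minutes=60)
tracker.record(30)          # below the threshold, no nudge
print(tracker.record(20))   # crosses 80% of the limit, prints the reminder
```

Even in this toy version, the only thing the loop can do is prompt more self-monitoring: the user is always the one being measured.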
Many of these apps use gamification - users are enticed to keep using the app and to reach their screen time goals by watering a virtual plant or competing with their friends. This may seem innocent at first, but gamification and positive reinforcement introduce a dynamic where self-regulation is rewarded extrinsically. It turns the management of one’s alienation into a game, making the process of regulating oneself feel enjoyable or even necessary, which distracts from the more significant issues of alienation and exploitation. It is also simply not an effective method of motivation - many studies explore the limitations of this kind of extrinsic motivation in inducing behavioural change.
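Purely as an illustration of that dynamic (again hypothetical, not any particular app’s implementation), the reward structure usually boils down to something like this: meeting the daily goal grows the plant and extends a streak, and missing it withdraws the reward.

```python
# Hypothetical sketch of the gamification loop: an extrinsic reward (a virtual
# plant and a streak) that grows when the daily goal is met and is withdrawn
# the moment the user slips. All names and numbers are illustrative.
class VirtualPlant:
    def __init__(self) -> None:
        self.growth = 0   # how "healthy" the plant currently looks
        self.streak = 0   # consecutive days the goal was met

    def end_of_day(self, minutes_used: int, daily_limit: int) -> str:
        if minutes_used <= daily_limit:
            self.streak += 1
            self.growth += 1
            return f"Your plant grew! {self.streak}-day streak."
        # The reward is purely extrinsic: miss once and it resets.
        self.streak = 0
        self.growth = max(0, self.growth - 1)
        return "Your plant wilted. Streak reset to 0."


plant = VirtualPlant()
print(plant.end_of_day(minutes_used=45, daily_limit=60))   # rewarded
print(plant.end_of_day(minutes_used=90, daily_limit=60))   # punished
```

Nothing in that loop touches the reasons you picked up the phone; it only attaches a prize to the act of self-regulation.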
Observing how the main features of screen time apps reinforce self-regulation, alongside the evidence that they are not effective at producing long-term behaviour change, supports my conclusion that these apps aren’t really meant to improve your relationship with your devices. The evidence shows they don’t induce long-term behaviour change, and, as with many evidence-based observations about technology, a leftist analysis provides an explanation for why this is.
education & imagination: a way forward
My educational project, which spans this Substack, my TikTok account, speaking engagements & consultancy work, is rooted in the idea that the solution to this problem lies in understanding why you are drawn to your phone in the first place. That’s what I call social media literacy - media literacy that also includes education on the structure of social media platforms, how they are designed and how this impacts user behaviour, particularly in the case of young people. The solution here isn’t restriction; it is education.
I’ve discussed all this before, but it bears repeating in any discussion of the proliferation of screen time restriction apps.
This study suggests exactly this—that tools designed to improve agency on social media are far more effective than those that claim to manage screen time. The study discusses how habits that aim to reduce screen time tend to decrease in effectiveness over time, whether the habit is regularly deleting apps or using a screen time management tool. It also discusses how interventions designed to increase agency are more effective over time.
Reframing the role of rest in our lives is crucial to rethinking our relationship with our devices. Think about it - what are you looking for when you go on your phone to doomscroll? Typically, the answer is escape, rest, or some other way of switching off for a bit. If we oppose restrictive, neoliberal approaches to dealing with this, then we have to ask ourselves: what can we add to our lives that provides an alternative to what we seek through overusing social media? The key to understanding social media lies in understanding why we are drawn to it - and that involves learning about persuasive design and self-reflection.
The impacts of social media on mental health appear vague and are difficult to pin down. I completed my final-year university project on the impact of social media on teenagers and the potential of UX education as an intervention tool. During it, I spoke to several headteachers about their experiences with pupils’ use of social media. One thing that came up consistently was that they observed social media intensifying existing problems, such as anxiety, self-worth issues, attention span difficulties, and fallouts between pupils. Research suggests that social media amplifies teens’ fitness habits - those interested in sports and exercise benefit from related online content, while others typically don’t engage with it. My research suggests this could extend to unhealthy coping mechanisms, too. Teens prone to self-isolation when troubled find it further enabled by social media. Similarly, those inclined to bully others when upset find additional avenues online. This amplification effect could explain the vagueness in much of the research about social media’s impacts - if it is an amplification tool, its effects can vary widely from person to person.
So, we are drawn to social media because of its persuasive design, but its impact on each person varies depending on environmental and social factors, the individual’s personality and other potential stressors in their life. A better approach than simply restricting yourself is to reframe unhealthy coping mechanisms: first recognise that they do serve the purpose of meeting an unmet need, then identify what that need is. Applying this to social media, we realise that with such individualised impacts, personal reflection is one of the most essential parts of understanding your relationship with it. To do this, you have to ask yourself a couple of questions:
When I use social media & don’t feel good afterwards, what am I trying to fix by using it?
What can I add to my life that meets this unmet need?