Everybody is talking about how devices/social media/screens/the internet/smartphones are bad and cause harm. As you know, I think the subject is nuanced and that we will all do a better job of keeping ourselves and our loved ones safe if we understand a bit more about the specifics of the harm - what it could be and how it might arise. This newsletter highlights one such thing. Understanding this thing will help us be more intentional and more aware of what is happening when we (and our kids) spend time on the internet, in whichever way we do it. The thing is the algorithm and how it can interact with anxiety.
I’ve given up trying to integrate the two portions of this newsletter so please enjoy “Chatty Bit” followed by “Psychology Bit”.
Chatty Bit
An algorithm is a set of data-based rules that govern how information is filtered and selected and presented. Ish. I’m no computer expert1 so I’m not your best source of technical information about how an algorithm works. Forgive me, I am going to simplify something very complex by using the term “The Algorithm” to symbolise the nebulous, tentacled, personal-data-driven process by which things which you haven’t specifically asked to see are served up to you via the internet. It is “The Algorithm” that governs your targeted ads, chooses the ‘suggested for you’ things that appear midway on your social media feeds, and populates the For You Page, Home Page or Search pages on the various apps and browsers you use. If you are very computer savvy it might annoy you that I have oversimplified in this way. I’m also going to say anthropomorphic things like “the algorithm now thinks that ….” which technically is incorrect because algorithms can’t think2 so again, if this annoys you, you might need to go somewhere else. Please do feel free to let me know how annoying it is by commenting below, because the more you do that, the more the algorithm tells people about me3.
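For any readers who like to see things in code, here is a deliberately tiny toy sketch of the general idea of engagement-based ranking. Everything in it is invented for illustration (real recommender systems are vastly more complex, proprietary, and trained on far richer signals than a simple topic tally), but the feedback loop it shows is the point:

```python
# Toy sketch of engagement-based ranking. Entirely illustrative:
# the scoring rule and data here are invented, not how any real
# platform actually works.

def score(video, watch_history):
    """Guess how engaging this video will be for this user,
    based only on how long they've lingered on each topic before."""
    return sum(watch_history.get(topic, 0) for topic in video["topics"])

def recommend(feed, watch_history, n=3):
    """Serve the n videos most likely to keep the user watching.
    Note what is optimised for: watch time, not wellbeing."""
    return sorted(feed, key=lambda v: score(v, watch_history), reverse=True)[:n]

# A user who has lingered on health content...
history = {"health": 120, "dance": 30, "cats": 5}  # minutes per topic
feed = [
    {"title": "Six hidden heart attack signs", "topics": ["health"]},
    {"title": "New dance trend", "topics": ["dance"]},
    {"title": "Funny cat compilation", "topics": ["cats"]},
]

# ...gets served more health content, which earns more watch time,
# which raises the health score further: a feedback loop.
top = recommend(feed, history)
```

The thing to notice is the loop: what you watched decides what you're shown, and what you're shown decides what you watch next. That is all "The Algorithm" means in this newsletter.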
I generally focus on how the algorithm issue affects children because that is my job, but parents often say, after we have spoken about it, that they now think differently about their own use of the internet too. Recognising and understanding the insidious and impactful role of The Algorithm should be a key part of living alongside the internet, an important bit of 'road safety'.
I was thinking about this a bit more when my pal Gemma told me about an event called How To Cure Your Algorithm to which I popped along one evening a few weeks ago. The first bit of the event, ahead of the Q&A session I linked to, was an interactive theatre thing led by two very likeable and charismatic people and it involved us audience members participating in activities and discussion about device use and doomscrolling4.
I found it quite uncomfortable that so much of the chatter in our discussions seemed to cede to a sense of powerlessness when really… We’re adults! We can make good choices! I worry sometimes that we have rolled over and believed we are helpless and isn’t it terrible and oh-now-I-have-to-lock-my-phone-away-in-a-safe-please-will-somebody-sell-me-a-thing-that-blocks-my-internet-and-maybe-someone-else-can-sell-me-a-phone-that-looks-good-but-has-no-smart-features-until-someone-else-more-powerful-than-me-takes-the-internet-away-from-us-for-our-own-good. Did we forget that we can be in charge? That we can make thousands of tiny decisions every day about how we interact with the internet? It doesn’t solve everything, and we still need to fight some big fights, but we can and should make little choices that change things. Like “stop looking at that thing that upsets me”, “put my phone down, it is 11pm and I’m tired”, “drink some water5”. I think that a bit more knowledge makes us a lot more powerful. That there are things we can understand about ourselves and the internet. That you can, right now, look at your phone and tell yourself you are not in fact a big helpless baby but someone who can get wise and be wise.
The generally likeable vibe of the Interactive Theatre Thing was popped for me firstly by someone sternly mansplaining to me about how Big Tech don’t actually want us to stop doomscrolling hence it is addictive and we must keep scrolling. Then another man lectured me about dopamine and he confidently let me know that he had learnt all about it from YouTube. I politely kept my irony to myself but at that point I turned my brain off and decided just to nod and smile instead. So I can’t tell you what else happened, I’m afraid. Oops. I’m sure it was all good though.
Psychology Bit
When I have been with children who are particularly worried or sad, I have noticed something. Some of what they were seeing on their devices was adding to their sadness or worry in an insidious way. I noticed that when they explained more about their worried and sad feelings, they would mention things they had seen either on social media or on the internet in general. I’m not talking about the obvious stuff like violent or explicit material.
I’m talking about how their existing anxiety interacted with their algorithm to build and fix their worries into something overwhelming.
Here’s a highly fictional example6. Darcie is ten and everyone is worried about her because she seems very anxious, cannot sleep, and has lost her appetite. Darcie is allowed to use TikTok because she loves dancing and her parents understood she would like to see and learn all the TikTok dances. They trust her not to post her own videos and they check this by “following” her on their own TikTok accounts. Darcie has learned about inappropriate content at school and her parents know they can trust her not to do reckless things like give contact details to strangers or spend money on apps. They’ve set the parental control filters so she won’t be able to see bad language or violence or the like. So far so good.
Darcie’s grandad died last year shortly after being diagnosed with untreatable pancreatic cancer. The parent of a schoolmate died recently – Darcie isn’t sure how but thinks it may have been a heart attack. One of Darcie’s classmates recently told her about a scary-sounding disease spread by swimming pools and all the children in her class were thrilled/terrified about the idea it might be in the local pool. Darcie googled “can you catch hepatitis in a pool” and learned that it is possible. She googled “what is hepatitis” and was alarmed to see that it can be fatal, and early symptoms are loss of appetite and tummy pain. Darcie is only ten, does not understand relative risk, does not understand the great wealth of information about hepatitis and its treatment, and does not understand that there are hundreds of reasons why a child might have low appetite and a tummy ache. Darcie has a tummy ache now. She doesn’t feel like eating. Darcie is really worried she’s caught hepatitis. Darcie knows that diseases, like Grandad’s cancer, can be silent and unseen before suddenly striking you dead. Darcie doesn’t want to die. She searches for information about heart attacks too and finds it helpful that the familiar and comforting platform of TikTok has lots of videos for her about heart attacks. They are a bit easier to understand than the science website she looked at first. Did you know these six hidden signs you might be having a heart attack? Darcie does. In fact, she’s going to do a little bit more research by watching a few more of these videos…
The algorithm does not know or care that she is only ten. In fact, because she has TikTok, it thinks she is 13. When she is 15, will it think she is 18? What will it show her then? Now that she has told the algorithm she is not only interested in but captivated by health-related content, it comes thick and fast. When Darcie is using the internet for homework research, her targeted ads feature private medical clinics, health insurance, little graphics of adults clutching their chests. When Darcie opens up the browser, the news widget skews towards health stories, usually about cancers that started with one everyday symptom and were dismissed by doctors until they were terminal. Darcie can’t escape this. She’s only ten and she doesn’t know that this is a self-fulfilling prophecy, that it is only her internet that is like this. It seems that the world is full of deadly diseases, your body can’t be trusted, you must be vigilant at all times to how it feels and how it has changed.
The internet didn’t cause Darcie’s anxiety. Anxiety is more complex than that. However, it did provide some fuel and some fodder, and it formed a significant part of the maintenance cycle. Darcie’s treatment would need to help her (and her parents) understand this, and to think about the role of her research behaviours, and how the algorithm “assisted” her anxiety. If she were to continue to use her devices, she and her parents together would need to re-feed her algorithm with sports videos and cat videos and pop songs and baking recipes. They would all need to stay vigilant; not for sneaky heart attacks but for sneaky health information.
Whew. What a difficult situation. Would it be better to take Darcie’s devices away? It’s definitely one solution. However, whether you choose to ban devices or not, your children still need to learn about the algorithm7 because they are going to encounter it one day.
I recommend a three-pronged approach: educate, communicate, and act.
Educate about what the algorithm is and how it works; explain how content creators make money and why they are incentivised by the algorithm to exaggerate and misinform. Demonstrate how your targeted ads or Google shopping pages change in line with things you search for elsewhere. Depending on your settings, you might be able to nudge your algorithm to show things that correspond to private messages you have sent elsewhere. You can show them in real time how YouTube autoplays content designed to capture your attention according to what you just watched. You can chat about the pros and cons - when is such a personalised service useful, and when might it harm you? Have they noticed patterns in what gets recommended to them?
Communicate by talking about how your algorithm affects you, comment on it when you use your devices, ensure it forms part of what you talk about when you talk about anything online. Make sure your children understand that you will always be more pleased they’ve been honest with you than cross about what they’ve seen. Be curious about what they are seeing and why.
Act in collaboration with them by nudging the algorithm towards appropriate content: swap out or ‘like-bomb’ certain types of content, and change your settings around tracking, cookies and private browsing to limit how much information is collected - and of course ensure you have used as many parental controls as appropriate whilst understanding they don’t solve everything. Being able to see what your child has been searching for is helpful in understanding what the algorithm may be attempting to give them. In another newsletter, I’m going to write something more specifically about the algorithm as it relates to parenting content and how that affects parental mental health and behaviour, which will have more information about how to keep your secrets from the algorithm.
Some useful resources. Here is a webinar I did about this issue8. It is a couple of years old now so some specifics are a little dated (in a mere two years!) but the general gist is good, if you can stomach the amount of times I say ummmm. Here is a short video explainer about algorithms, suitable for children. Here is a nice summary page of tips for parents. This blog has got some nice explainers too, and a ton of links to other safer internet resources. I’ve tried to keep these recommendations trimmed to algorithm-specific content but I’ll make sure I link more resources in later newsletters that look at other aspects of social media/screens/devices/the internet. Below is a little screenshot from my webinar I linked above which sums up the algorithm stuff quite nicely.
Please add any other resources or tips in the comments, and let me know if there are specific issues you want me to cover. I’ve got drafts already on social media and parenting content, the other side of “smartphone addiction”, how to ‘fix it’ when your child has seen something bad, how to reduce device use, and more on specific risks and harms. And if no one sees this post, well we all know why don’t we9.
1. Really shocking I know.
2. YET!!!!
3. This is an ironic yet meaningful joke which underlines the fact that terrible and bad content gets outrage-fuelled promotion via The Algorithm. Hope you got it. Comment below if you did! Ha ha.
4. Obviously audience participation is v much not my comfort zone.
5. Ok I never do that one
6. As a practising psychologist I cannot and will not use details from real people. However, I can create fictional examples based on principles amalgamated over time.
7. They will of course be learning about what algorithms are in computer science, but you know I am talking about The Algorithm.
8. I can’t remember why but it is split into three short videos; this is a link to the first one.
9. (Big Tech)