Colman Noctor: Helping young people deal with online issues like bullying

Multiple factors make the online world a breeding ground for abhorrent behaviour
Body shaming, bullying, and racism are the most common forms of harm witnessed online, according to a new Spunout survey of young people in Ireland. Other harmful online exchanges included misogyny, homophobia, transphobia, fatphobia, and xenophobia.
While these findings are concerning, they are not surprising, as people say things online that they would never say face-to-face. This is partly due to a process known as the ‘online disinhibition effect’, described by US psychologist John Suler and other behavioural experts investigating online harassment behaviours such as trolling and cyberbullying.
Multiple factors make the online world a breeding ground for this abhorrent behaviour, the primary one being anonymity. When people believe they cannot be easily identified or held accountable, they feel freer to express negative or hostile thoughts. There is also ‘deindividuation’, or a loss of self-awareness and accountability, because people are hidden in a crowd.
In everyday social interactions, we rely heavily on nonverbal communication, such as body language and facial expressions, to gauge the impact of our words. These cues are largely absent online, as we cannot see the person on the receiving end of the communication, paving the way for us to misinterpret others or feel detached from the emotional impact of our comments. This ‘empathy gap’ makes it easier to dehumanise people in online communications. ‘Dissociative imagination’ can also occur when people see their online personas as separate from their ‘real’ selves, allowing them to act in ways they wouldn’t in person.
These attempts to explain the prolific nature of damaging online behaviour are by no means an excuse; people need to be held accountable for their actions. However, online platforms must also own up to their role as facilitators of this behaviour. Some social media platforms have features that amplify toxic behaviour, a dynamic often summed up as ‘enraging is engaging’. Algorithms prioritise content that triggers emotional responses, including anger, frustration, or outrage, because it leads to more engagement. For example, if you are looking at some pretty flowers in a park and a couple starts to argue loudly behind you, it is human nature that your attention will be diverted from the flowers to the heated argument. Social media companies recognise that attention is drawn towards conflict and use this knowledge to hack users’ attention.
Bystander intervention
Bullying is a significant public health concern, given its potential to damage the victim psychologically. Bystanders are commonly reported to be present 80-85% of the time when bullying occurs, whether in real life or online (on social media, by text message, in games, and on apps). Researchers say these bystanders can make a significant positive difference by defending the victim.
We typically expect those who witness others being mistreated to be upstanding and intervene on the side of the mistreated person. However, this is easier said than done, and the findings of the Spunout study confirm this concern. More than four in 10 respondents (42%) said they wouldn’t intervene when witnessing harmful behaviour online because they don’t know how to respond, while 42% said they worried they might become a target if they stood up to perpetrators of harm online.
When asked to describe the most prominent challenges young people face online, respondents cited exclusion from groups or group chats most often (48%), followed by mean or harassing public posts (23%) and mean or harassing DMs (18%). They also reported constant negative news, comparing themselves to others, misinformation, addiction to being online, and doom-scrolling as common challenges. I am unsurprised that these are the top three issues, as they are all-too-common themes in my therapy room.

Exclusion from groups or group chats, a seemingly innocuous teenage dynamic, can devastate some young people’s mood and self-worth. The impact of being excluded from a friend group is not to be underestimated. I have seen such incidents result in young people becoming severely depressed, highly anxious, and even engaging in self-harming behaviour. The problem of exclusion has become one of the most significant challenges to young people’s mental health and one of the most complex dynamics to manage. Identifying the perpetrators of exclusion, and even defining certain behaviours as exclusionary, is very difficult. We need to be creative and proactive about how best to respond to this phenomenon, given the concerning level of emotional upset it is causing.
Spunout’s study found that 65% of respondents reported feeling anxious or depressed, 42% felt embarrassed, and 40% felt unsafe when they received hateful comments online. When asked why they believed some people engaged in this vicious behaviour, they cited insecurity, the lack of consequences afforded by anonymity, jealousy, and boredom.
Empathy online
Spunout’s survey of 1,355 young people has informed the development of the new programme, which builds on the organisation’s existing work in digital citizenship and mental health.
‘Empathy Online’ is a new, free, self-directed digital learning programme designed to help young people combat harm in online spaces by building a toolkit of empathy-based skills and promoting bystander intervention.
The programme aims to help young people avoid contributing to online harm and to develop skills so they feel more confident in responding safely and proactively to harms they witness in digital spaces. It also offers valuable tools for understanding how algorithms and clickbait attempt to alter our behaviour, as well as guidance on responding to online harm and supporting others who are experiencing it.
We need initiatives like ‘Empathy Online’ to provide young people with the media literacy skills to navigate the online world. Much of our media literacy effort has focused on internet safety; we must extend it to give young people an understanding of the harmful psychological hacks of the online world and ways to reduce the inevitable emotional fallout from this activity.
The online world is becoming increasingly integrated into our lives. With the potential impact of generative AI (artificial intelligence) and the Metaverse coming downstream, we must provide the necessary support for young people to navigate the digital world. Delaying phone ownership until a child is old enough to manage the capabilities of a smart device and providing smartphone pouches in secondary schools are helpful steps.
We also need to educate and support children and parents so that they can better understand the psychological forces underlying the online world and be more aware of the dangers of occupying that space.
The casino-type strategies that online platforms use to hack our attention need to be explained and exposed so that young people know how they are being manipulated. Far too much control and influence has been placed in the hands of software engineers, allowing them free rein over what billions of people see and, consequently, how they feel.
Dr Colman Noctor is a child psychotherapist