Grief is a universal experience, yet the circumstances of loss shape the emotions and cognitive patterns that follow. Losing a friend or family member to illness, accident, or natural causes triggers profound sadness, nostalgia, and often a sense of unfinished business. Everyday examples make this plain. A friend dies unexpectedly at 25, leaving peers with shock, disbelief, and a flood of memories: moments shared, laughter that now feels frozen in time. Children losing ageing parents, or anyone losing a spouse late in life, experience sorrow and disruption, but within a framework of expectation: life has limits, and ageing is inevitable.
Contrast this with grief following the death of a child as a direct or indirect result of online platforms. Here the emotional landscape becomes tangled with anger, betrayal, and an almost obsessive need to assign responsibility. Social media has enabled environments where tragic events unfold in real time. A teenager is lured by an AI-generated “girlfriend” urging him to join her forever in the digital world; another takes part in a blackout challenge shared widely online. These deaths are preventable. They are not part of the natural progression of life. Parents of these children are confronted not just with loss, but with the knowledge that corporations, by design or negligence, allowed harmful content to remain accessible.
Psychologically, these two types of grief differ in key ways. Traditional bereavement triggers mourning mechanisms such as shock, denial, bargaining, depression, and eventual acceptance. Memories of the deceased are central, and social rituals such as funerals, commemorations, and shared storytelling help process the grief.
In contrast, grief compounded by platform-induced tragedy carries an overlay of moral outrage and distrust. Parents repeatedly examine the content and the context: “Why was this AI allowed to interact with my child?” “Why was this challenge trending?” “Why did my daughter take her own life because she got no likes on her profile?” There is a relentless focus on causality and agency, often exacerbated by media coverage and algorithmic amplification. Cognitive patterns include intrusive thoughts, rumination over platform policies, and sometimes a form of trauma known as secondary victimisation, in which the environment itself becomes a source of ongoing distress.
Legal Context: Negligence, Verdicts, and the Possibility of Manslaughter
Recent verdicts in the United States against Meta and YouTube demonstrate how the legal system is beginning to recognise the role of platform design in harm to minors. In the case K.G.M. v. Meta et al., a jury found both Meta and YouTube negligent because their platforms contributed to predictable mental health harms among children and adolescents. Meta was found 70 per cent responsible, YouTube 30 per cent responsible, and damages were awarded to the plaintiff. Courts found that the platforms failed to act on foreseeable risks, enabled addictive behaviours, and prioritised engagement over safety. In New Mexico, Meta was ordered to pay a civil penalty for misrepresenting safety measures and enabling exploitation of children.
These verdicts do not amount to criminal manslaughter, which requires proof that a company’s actions or omissions directly caused a death through recklessness or gross negligence. They are nonetheless significant because they establish that companies can be held legally accountable for preventable harm through civil negligence, which opens the door to stronger legal arguments and regulatory scrutiny in the future. For bereaved parents, these verdicts provide partial validation: they show that a jury recognises the role of corporate design and policy decisions in harm. At the same time, parents feel frustration that no criminal liability has yet been applied and that these verdicts do not fully address the scale of their personal loss.
Parents experiencing this grief often report that civil verdicts intensify the emotional burden. There is a sense of recognition, but also anger that platforms knowingly allowed dangerous content to persist. This intertwining of grief and perceived injustice adds layers of trauma. Parents must mourn their child while navigating the awareness that preventable harm occurred on platforms designed and maintained by large corporations.
Real examples illustrate why this matters. A teenager engages in a blackout challenge after encouragement from a private online group. The physical risk is clear, yet algorithmic visibility and peer reinforcement made participation appear normal or even aspirational: “record and share it, get likes, be popular”. In another case, AI chatbots posed as friends, repeatedly encouraging self-harm or dangerous behaviours under the guise of emotional support. In both instances, parents face sorrow for the loss, guilt over not foreseeing the risk, and outrage at corporate negligence. Civil verdicts against these companies validate the concern, but they do not fully resolve the grief or provide the criminal accountability that some families feel would match the severity of the loss.
While losing a friend or sibling is a tragedy, it is often framed as part of life’s unpredictable course. Parents losing a child due to online platforms experience grief compounded by systemic failure. The emotional spectrum stretches from sadness to moral outrage, from despair to hypervigilance. They face the dual task of grieving their child while wrestling with questions of accountability in the digital landscape, a burden absent in natural loss.
The intersection of grief and technology requires both societal reflection and individual support structures. Recognising these deaths as preventable emphasises the need for corporate responsibility, policy enforcement, and public awareness. Meanwhile, psychological interventions must address both traditional mourning and the complex trauma of betrayal and preventable loss.
The contrast is stark. One type of grief mourns life’s limits, the other mourns life cut short by human and technological choice. Understanding this distinction is crucial, not to diminish one over the other, but to respond appropriately to the unique needs of those affected.
In closing, this exploration of grief, responsibility, and preventable harm is written in memory of Jools Sweeney, who died on this day in 2022 in circumstances that forever changed his family’s life. His passing, like that of so many others whose deaths intersected with technology, reminds us that loss is not a theoretical concept but a lived reality with deep emotional, legal, and social consequences. Thinking of you Ellen – Jools must be so proud of you…