We’ve all been that child, shaking our head from side to side when asked if we ate the chocolate off the Christmas tree whilst not realising that the chocolate smears across our face told another tale…
We don’t expect young children to be truthful, especially when they think they are going to get into trouble. But the ability to grow, to learn when to admit we have made a mistake, and to accept the consequences, however harsh these may be, is something that society needs.
Today is Safer Internet Day 2025, when thoughts turn to all who have been affected by our digital society in a negative way, and in particular to those parents who will spend the rest of their lives mourning the loss of a child as a result of their online experience.
Those of you who have been following our groups on the harms to children will know of Ellen Roome’s campaigns and will have seen that last Friday she and three other parents filed a lawsuit against TikTok and its parent company ByteDance in the USA. We have posted links to interviews with these parents, but today it seems appropriate to highlight one in particular:
Scroll to 9:00 minutes in, where Hollie Dance explains that her son had three TikTok accounts, yet TikTok has told her that “nothing exists”, despite the fact that his friends can still see those accounts.
Social media companies may bend the truth, mislead, or omit details for several reasons—mainly to protect their business interests. Here’s why they might lie or misrepresent the facts:
- To Protect Their Profits 💰
Ad revenue is tied to user numbers, so they downplay issues like bots, spam, and fake accounts. Because their business model relies on engagement, even engagement from harmful or controversial content, they are extremely unlikely to remove accounts unless they are forced to. They may even claim to be “cracking down” on problems while doing the bare minimum, in order to keep advertisers happy and the tills turning over.
The crazy part is this: who in their right mind thinks that those who follow the Instasluts are going to be influenced to buy designer brands or fine gourmet specialities grown by virgins in the foothills of the Swiss Alps? I suspect they are more Primark and Asda…

- To Avoid Regulation & Fines ⚖️
Admitting to problems (e.g., harms to children, privacy violations, or failure to remove illegal content) could lead to government intervention or lawsuits. Some may even lobby against stricter laws whilst maintaining their PR claims to be supporting transparency and safety.

- To Maintain Public Trust & Image 🏆
Social media companies want to appear responsible while still benefiting from the engagement-driven algorithms that are vital to their profits. They frame their actions as “for the user’s benefit” (e.g., “We respect free speech” or “We promote community”), even when it’s about maximizing revenue. Suggested accounts to follow may be based solely on advertisers pushing their own agendas (yes, including their adult entertainment profiles). This can result in rules being enforced selectively, often protecting content creators, influencers, big accounts, or advertisers while banning smaller users for minor infractions.

- To Control the Narrative 📢
They use PR spin to downplay scandals, whether it’s data leaks, political bias, censorship, or failure to stop misinformation. More recently, Meta denied knowing that Instagram was harmful to teen mental health, until internal research leaks proved otherwise. You can listen to an interview with one of the more recent whistleblowers via the following link:
https://youtu.be/CllzqHT-CNU
Yes, life is never 100% safe, but in real-world society we have some reassurance that measures are in place to minimise our risk, and that there are decent people able to stand up for causes.
The internet needs policing in a way that removes harmful ideas, incorrect facts and any visuals that would not be allowed on our TV screens. We don’t need our kids to be given ideas that would never have occurred to them if they had been left to their own devices; yes, they will climb trees and annoy your neighbours by constantly kicking a football over the fence, but that is extremely unlikely to be life-threatening.
We don’t need an influencer telling us about the benefits of a beauty product they have never used, let alone researched to see whether its claims are backed by any proven trials. If someone wants to view porn, they know where to find it; we don’t need Instasluts suggested to us.
We are seeing increasing numbers of people becoming aware of the dangers the internet poses to us all, but many changes are still needed: a willingness to remove negative content from our screens, transparency about moderation processes, and an immediate response to harmful content.
And for the next billionaire wannabe, there is a niche for someone to create a social media platform that upholds standards, professional values and ethical content for its users.
Photo by Mediamodifier