SAWT BEIRUT INTERNATIONAL

10 December 2024, Tuesday

How social media platforms are ‘targeting’ children with dangerous content

Many administrations around the world are growing increasingly wary of the role played by social media giants like Facebook, Instagram and Twitter in the dissemination of content on their platforms, especially in terms of “targeted content”.

Targeted content may be defined as content created with a niche audience in mind to elicit a specific response. The easiest example? The little advertisements you see on Instagram that uncannily promote products you have been browsing on other portals. You become the target, and the purchase of the advertised product is the expected response. With each positive response to such targeted advertisements, the algorithms that pitch personalised content to users become more refined and powerful. While adults may be able to recognise that they are the targets of a marketing strategy, children may be inadvertently exposed to dangerous products and content.
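To make that feedback loop concrete, here is a minimal, purely illustrative sketch in Python of how engagement signals can narrow what a user is shown. The topic names, weights and update rule are hypothetical simplifications, not any platform’s actual recommender.

    import random

    # Toy engagement-feedback loop (illustrative only, not a real recommender).
    # Topic names and parameters are hypothetical.
    TOPICS = ["fitness", "gaming", "dieting", "fashion", "music"]

    def recommend(weights, explore=0.3):
        """Mostly show the highest-weighted topic; occasionally explore a random one."""
        if random.random() < explore:
            return random.choice(TOPICS)
        return max(weights, key=weights.get)

    def update(weights, topic, engaged, boost=0.3, decay=0.02):
        """Reinforce topics the user engages with; let all others drift down."""
        for t in weights:
            weights[t] *= (1 - decay)      # gentle decay toward neutrality
        if engaged:
            weights[topic] += boost        # each positive response sharpens targeting

    weights = {t: 1.0 for t in TOPICS}
    for _ in range(200):
        shown = recommend(weights)
        engaged = (shown == "dieting")     # simulate a user who only clicks diet content
        update(weights, shown, engaged)

    # With high probability the feed has converged on the engaged topic: "dieting"
    print(max(weights, key=weights.get))

Even this crude loop quickly locks onto whatever the user responds to; real systems use far more signals, which is the dynamic the researchers describe below.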

New research sheds light on the disturbing content targeted especially at children. According to researchers at “Revealing Reality”, who undertook the project, the accounts of minors are fed inappropriate material soon after joining a social media platform.

The researchers set up social media accounts that resembled the online avatars of children, based on information from kids aged between 13 and 17. They took into account the kinds of accounts generally followed by kids and the nature of content they would be expected to “like” across platforms.

What is “inappropriate material”, you wonder? In this case, it refers to images of self-harm, including razors and cuts.

Within hours of being created, these fake accounts representing kids were fed content about dieting and sexualisation. In addition, pornography and other explicit content was easily accessible through these accounts.

More worryingly, just hours after signing up, the accounts were approached by unknown adults, suggesting that part of this targeted content attempts to link kids to adults on social media. The experiment was commissioned by the 5Rights Foundation and the Children’s Commissioner for England, who are urging governments to formulate rules regulating the design and business models of online platforms.

Researchers believe such content is especially dangerous for kids grappling with body image issues, as they are constantly fed unrealistic ideas of an ideal body type.

Molly Russell, a 14-year-old girl, took her own life after viewing graphic self-harm and suicide-related content online. Her father, Ian Russell, told Sky News that social media companies prioritise profit over safety, and he is urging governments to bring online content in line with what kids are shown in the physical world.

Facebook, Instagram and TikTok were named explicitly in the report. In response, both Facebook (which also owns Instagram) and TikTok issued much the same stock message, claiming they have robust measures in place to keep kids safe on their platforms.

Source: The National News