Tech Coalition Launches “Lantern” Program to Combat Online Child Exploitation

 The Tech Coalition

In the ongoing fight against online child sexual exploitation and abuse (CSEA), The Tech Coalition, a consortium of tech companies, has unveiled a new initiative named Lantern. The program facilitates the sharing of crucial “signals” among online platforms to identify and address activities and accounts that violate policies against CSEA.


Participating platforms in Lantern, including Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch, can share signals containing information such as email addresses, usernames, and keywords associated with policy-violating accounts or the buying and selling of child sexual abuse material (CSAM). Other platforms can then utilize these signals to review and take appropriate action against policy violations on their own networks.
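The article describes signals only at a high level. As a purely hypothetical illustration (the field names and matching logic below are assumptions, not Lantern's actual schema or API), a shared signal and a receiving platform's review queue might be sketched like this:

```python
from dataclasses import dataclass

# Hypothetical sketch of a Lantern-style "signal" record. The fields
# (signal_type, value, source_platform) are illustrative assumptions,
# not the program's actual data format.
@dataclass(frozen=True)
class Signal:
    signal_type: str      # e.g. "email", "username", "keyword", "url"
    value: str            # the shared indicator itself
    source_platform: str  # platform that contributed the signal

def flag_for_review(signals, accounts):
    """Flag accounts whose email or username matches a shared signal.

    A match is only a lead for human review and further investigation,
    not proof of abuse.
    """
    indicators = {(s.signal_type, s.value.lower()) for s in signals}
    flagged = []
    for account in accounts:
        if (("email", account["email"].lower()) in indicators
                or ("username", account["username"].lower()) in indicators):
            flagged.append(account["id"])
    return flagged
```

In this sketch, a matched account is merely queued for review, mirroring the article's point that signals are clues rather than definitive evidence.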

It’s important to note that these signals don’t serve as definitive proof of abuse but rather act as valuable clues for further investigations and evidence for law enforcement. During a pilot program, Mega shared URLs via Lantern that Meta used to remove over 10,000 Facebook profiles, pages, and Instagram accounts.

The Tech Coalition emphasizes the collaborative nature of Lantern, recognizing that the complexity of child sexual abuse requires collective efforts. The signals shared through Lantern allow companies to piece together a fuller picture of harm and take coordinated action.

Once the initial group of participating companies has evaluated this “first phase” of Lantern, The Tech Coalition plans to expand the program’s reach by inviting additional platforms to join. The coalition also commits to covering Lantern in its annual transparency report and to providing participating companies with recommendations on incorporating their involvement into their own reporting.

Despite differing opinions on balancing online privacy and tackling CSEA, there is shared concern about the proliferation of child abuse material, both authentic and deepfaked, online. In 2022, the National Center for Missing and Exploited Children received over 32 million reports of CSAM.

A recent survey by RAINN and YouGov revealed that 82% of parents believe the tech industry, especially social media companies, should take more significant steps to protect children from online sexual abuse and exploitation. This public sentiment has prompted action from lawmakers, with attorneys general from all 50 U.S. states and four territories urging Congress to address AI-enabled CSAM. Additionally, the European Union has proposed mandating tech companies to scan for CSAM while identifying and reporting grooming activities targeting children on their platforms.
