Content notification on discussions of grooming, sexual exploitation, coercion, and suicide.
Discord has announced an assortment of new teen safety tools and features for its Family Center, a hub primarily aimed at guardians that launched in 2023. The news comes as the company faces multiple lawsuits regarding teens who allegedly suffered grooming and sexual exploitation on the platform, as well as in Roblox.
In a blog post that went live yesterday, the company stated that the update is meant to help guardians stay informed and play a more active role in their teens’ online experiences, while also giving teens autonomy to shape their “digital environment.”
The features, which will roll out over the next week, allow guardians more visibility over weekly activity, including all purchases made, total minutes spent on voice and video calls, and the top five users and servers a teen most frequently messaged and called in the last week.
In addition, if a teen submits a report on another user or piece of content to Discord, they’ll have the option to notify their guardian. If they do, the guardian will receive an email with the notice, but this won’t include the details of the report. Similarly, while guardians can now filter who can DM a teen and whether sensitive content is filtered, the exact content of conversations isn’t shared.
The new teen safety tools roll out amid lawsuits alleging Discord is unsafe for underage users
The blog post includes bulleted lists aimed at both teens and guardians, with the former noting that “your voice matters” and that “privacy is protected.” For guardians, Discord advises they “play a more active role in safety,” to “nurture trust,” and “start better conversations” with teens. “We’re focused on making it easier for you to offer guidance, not surveillance,” the announcement reads.
Back in April, Roblox Corporation and Discord were sued over allegations that each company facilitated the sexual harassment and exploitation of a minor across both Roblox and Discord. The lawsuit was filed in California on behalf of an underage plaintiff referred to as ‘Jane Doe’, taking aim at both platforms for allegedly enabling an accused sexual predator to harass and exploit Doe during interactions on each.
The accused individual allegedly targeted the teen through two games within the Roblox ecosystem, sending her messages through the “whisper” function even though her Roblox account “clearly displayed” she was a minor. Over time, the plaintiff was convinced to share her Discord username with the accused predator, leading to conversations on the popular chat platform where she was then coerced into sending sexually explicit images. These events took place over the course of six months.
At the time, Discord declined to comment directly on the case, but addressed it in a message sent via a spokesperson. “Discord is deeply committed to the safety of our users and we take decisive actions when we detect violations of our policies, including removing content, banning users, shutting down servers, and engaging with law enforcement,” the company said in a statement.
In September, as reported by NBC News, the mother of a 15-year-old California boy who took his own life sued Roblox and Discord over his death, alleging her son was groomed and coerced into sending explicit images on the apps.
The lawsuit was filed by Rebecca Dallas in San Francisco County Superior Court, accusing the companies of “recklessly and deceptively operating their business in a way that led to the sexual exploitation and suicide” of her son, Ethan Dallas. Ethan started playing Roblox at the age of 9 with his parents’ approval and with parental controls in place. The alleged targeting by an “adult sex predator” who posed as a child in Roblox began when he was 12, attorneys for Rebecca Dallas said in a statement to NBC at the time.
Conversations “gradually escalated to sexual topics and explicit exchanges,” and Ethan was allegedly encouraged to turn off parental controls and move conversations to Discord, the lawyers said. On Discord, the man “increasingly demanded explicit photographs and videos,” and threatened to post or share the images. Ethan complied out of fear, the complaint says.
This lawsuit accuses Roblox and Discord of wrongful death, fraudulent concealment and misrepresentation, negligent misrepresentation, and strict liability. It argues that if Roblox and Discord had taken steps to screen users before allowing them to register on the apps, or implemented age and identity verification and other safety measures, the aforementioned events wouldn’t have transpired.
Regarding Discord specifically, the suit said the platform is “overflowing with sexually explicit images and videos involving children, including anime and child sex abuse material.”
A spokesperson told NBC that the platform is “deeply committed to safety,” and that it wouldn’t comment on legal matters. “We require all users to be at least 13 to use our platform. We use a combination of advanced technology and trained safety teams to proactively find and remove content that violates our policies,” the spokesperson said. “We maintain strong systems to prevent the spread of sexual exploitation and grooming on our platform and also work with other technology companies and safety organizations to improve online safety across the internet.”
While the new features and moderation tools are helpful on paper, Discord emphasizes throughout the announcement that communication between teens and guardians is the key element. But as seen with regulations like the UK Online Safety Act, people will always try to bypass verification systems. Less than 24 hours after the act came into effect in July, PC Gamer reported that some people were able to bypass Discord’s age verification by using Death Stranding 2’s Photo Mode to show the face of Sam Porter Bridges to the camera.