Instagram to alert parents when teens search for info on suicide or self-harm
Meta-owned Instagram will soon alert parents if their teenage child uses the app to search for content related to suicide or self-harm, the technology company’s latest effort to shore up safety features as it faces scrutiny over how social media affects young people.
Meta said that, starting next week, parents who use Instagram’s supervision tools will get a message via email, text or WhatsApp, as well as an in-app notification, if a teen repeatedly searches for certain terms related to self-harm or suicide within a short time span.
The company said the message will inform parents that their teen repeatedly searched for suicide or self-harm content and will offer resources on how to approach sensitive conversations about mental health.
“The vast majority of teens do not try to search for suicide and self-harm content on Instagram, and when they do, our policy is to block these searches, instead directing them to resources and helplines that can offer support,” the company said Thursday in a news release.
Meta did not specify how many searches will prompt a parental alert, noting only that “we chose a threshold that requires a few searches within a short period of time, while still erring on the side of caution.”
The new safeguard will initially roll out in the U.S., the U.K., Australia and Canada before being deployed in other regions later this year, according to Meta.
In October of last year, Meta also introduced age-based content restrictions that block users under 18 from seeing search results for certain terms, such as “alcohol” or “gore.” At the time, Meta said it already shielded teens from search results related to suicide, self-harm and eating disorders.
Meta and YouTube trial
Meta’s new safety features come amid an ongoing trial in Los Angeles over whether its platforms, along with Alphabet-owned YouTube, are deliberately designed to addict young users. Meta CEO Mark Zuckerberg last week faced questioning about Instagram’s young users and Meta’s efforts to boost engagement.
Instagram specifies that users must be at least 13 years old to sign up for its app. At trial, however, Zuckerberg conceded that the rule is hard to enforce because users sometimes lie about their age. To verify users’ age, Instagram asks them to submit details such as their birthday, photo identification and a video.