As Congress once again considers legislation that would regulate how kids can use tech platforms, Meta is making a renewed push to advertise its safety features for teens.
Last week, the company started running ads promoting Instagram Teen Accounts, a product it introduced last fall. The update makes accounts belonging to users under the age of 18 private by default, requires users under the age of 16 to get parental approval to change their privacy settings, and blocks messages from accounts teens aren’t following on the platform.
In the 30-second spot, Meta tells viewers—presumably parents—that “you’ve always looked out for them, we’re here to do it with you,” over a scene of a mother watching her son cross the street. The ads are running on linear TV, podcasts, digital display, and out-of-home inventory.
The campaign first started running in September, but was brought back last week to coincide with an announcement that Meta would begin partnering with schools to root out cyberbullying.
“Teen Accounts were designed with parents’ biggest concerns in mind and have built-in protections that limit who can contact teens and the content they see,” Meta spokesperson Liza Crenshaw told Marketing Brew in an email. “Our ad campaign aims to raise awareness of these changes and help parents understand how to manage their teens’ time on Instagram in a way that works best for their family.”
The campaign and the new safety features come as Meta has faced increased scrutiny for how teens engage with its platforms, especially Instagram. In 2021, whistleblower Frances Haugen, a former product manager at Facebook, leaked internal documents to the Wall Street Journal that showed that the company had prioritized user engagement over safety and knew that Instagram was especially harmful to teen girls.
Congress has repeatedly considered legislation that would regulate how children and teens use online platforms. This week, Punchbowl News reported that Kentucky Rep. Brett Guthrie said the Kids Online Safety Act (KOSA) would pass this year. If passed, KOSA would require platforms to take “reasonable care” to mitigate harms to minors, like cyberbullying.
When asked for a statement about KOSA, Crenshaw directed Marketing Brew to a 2023 Medium post written by Antigone Davis, VP and global head of safety at Meta, which calls for “federal legislation that requires app stores to get parents’ approval whenever their teens under 16 download apps.”
“The best way to help support parents and young people is a simple, industry-wide solution where all apps are held to the same, consistent standard,” Davis wrote.
Some child safety advocates claim Meta’s tools are too little, too late. Josh Golin, executive director of the children’s advocacy group Fairplay, said safeguards like Teen Accounts “could have and should have been implemented years ago.”
“Instagram’s so-called Teen Accounts and Meta’s massive publicity campaign to promote them are just a blatant attempt to stave off legislation,” Golin told us in an email. “The Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act will require CEOs like Mark Zuckerberg to ensure their platforms are safe and privacy-protective for young people at all times, not just when it’s politically expedient. Parents need KOSA and COPPA 2.0, not Instagram’s PR campaign.”