Today’s Solutions: January 12, 2025

BY THE OPTIMIST DAILY EDITORIAL TEAM

In an era where social media often sets the tone for beauty standards, TikTok’s decision to restrict beauty filters for under-18s signals a significant shift. Teenagers will soon be unable to use filters that artificially enlarge eyes, plump lips, or smooth skin, such as the popular “Bold Glamour” filter. This move, revealed at TikTok’s safety summit in Dublin, follows rising concern about how such filters affect mental health, particularly among young girls.

According to experts, the move is long overdue, as the polished images these filters provide have led some teenagers to consider their true faces unattractive. “We’re seeing a rise in anxiety and falling self-esteem tied to the unrealistic beauty standards these filters promote,” TikTok acknowledged during the forum. While playful effects such as bunny ears or dog noses remain unaffected, the platform believes the restriction will relieve pressure on young people to conform to unrealistic standards.

However, the success of this endeavor is dependent on users disclosing their true ages—a substantial obstacle given that many teenagers avoid restrictions by fabricating their birthdates.

Strengthened age restrictions: a necessary move

In addition to the beauty filter limitations, TikTok is stepping up its efforts to prevent users under the age of 13 from joining the platform. By the end of the year, TikTok intends to test automated machine-learning techniques designed to identify underage users more effectively.

Chloe Setter, TikTok’s lead on child safety public policy, said, “We’re hoping that this will give us the ability to detect and remove more and more quickly.” TikTok already eliminates an estimated 20 million underage accounts per quarter around the world, but regulators such as Ofcom remain wary of the platform’s ability to adequately enforce age limitations.

Between June 2022 and March 2023, roughly one percent of TikTok’s monthly UK user base was deleted for being underage. As new restrictions under the UK’s Online Safety Act take effect, platforms such as TikTok are under increased pressure to maintain compliance or face large fines. “It can obviously be annoying for some young people,” Setter said, referring to cases when users were incorrectly blocked, “but we’re taking a safety-first approach.”

A new era of online safety regulations

TikTok’s adjustments are part of a larger wave of safety precautions being implemented across social media platforms ahead of tougher laws. The UK’s Online Safety Act, which takes effect next summer, will require “highly effective” age verification measures to keep users under the age of 13 off such platforms. The stakes are enormous, with significant penalties for companies that fail to meet compliance rules.

Other platforms are making similar changes. Roblox, a gaming platform with 90 million daily users, recently implemented limits for its youngest players, safeguarding children from violent and explicit content. Meanwhile, Instagram introduced “teen accounts” for under-18s, giving parents more control over their children’s actions, including overnight usage limits.

Andy Burrows, CEO of the Molly Rose Foundation, which focuses on suicide prevention, stated that these adjustments are not wholly altruistic. “It will not escape anyone’s attention that these shifts are being announced largely to comply with EU and UK regulation,” he added. “This makes the case for more ambitious regulation, not less.”

Calls for more accountability

While TikTok’s actions appear positive, critics contend that they fall short of solving the platform’s fundamental systemic concerns. Burrows has called for greater information about how TikTok’s age verification mechanisms would function and whether they will effectively lower the number of under-13s on the platform.

“TikTok should act quickly to fix the systemic weaknesses in its design that allow a torrent of harmful content to be algorithmically recommended to young people aged 13 or over,” Burrows said.

The NSPCC agreed, characterizing TikTok’s activities as “encouraging” but “just the tip of the iceberg.” Richard Collard, the NSPCC’s assistant head of policy for child safety online, emphasized the need for all social media sites to take action. “Ofcom and the government also have an important role to play in compelling tech bosses to deliver age-appropriate experiences for all their users,” Collard declared.

As the regulatory landscape develops, platforms must strike a balance between user experience and strong safety measures. While TikTok’s actions represent progress, they also illustrate the difficulty of protecting young people in the digital world.
