Social media giants like Meta and TikTok have been accused of prioritizing engagement and growth over user safety, allowing more harmful content to reach users. Whistleblowers have come forward to reveal internal decisions that compromise safety on these platforms.
What Happened
Whistleblowers have given an inside view of the algorithm arms race between Meta and TikTok, revealing that internal research showed how outrage fueled engagement. This led to decisions that allowed more "borderline" harmful content, including misogyny and conspiracy theories, to be shown to users.
An engineer at Meta, which owns Facebook and Instagram, described how he was told by senior management to allow more harmful content in user feeds to compete with TikTok. "They sort of told us that it's because the stock price is down," the engineer said.
A TikTok employee also shared internal documents showing how the company's algorithm prioritizes content that sparks strong emotions, including outrage.
Why It Matters
These decisions carry serious consequences, particularly for users vulnerable to online harassment and exploitation. Experts warn that prioritizing engagement over safety can fuel the spread of misinformation and normalize harmful behavior.
"It's a classic case of the algorithm being optimized for engagement, rather than safety," said a digital rights activist. "This can have serious consequences for users, particularly those who are already vulnerable."
What Experts Say
Experts argue that social media companies must take responsibility for the content their platforms amplify, putting user safety ahead of engagement metrics.
"Social media companies have a responsibility to ensure that their platforms are safe for users," said a cybersecurity expert. "This means prioritizing safety over engagement and taking steps to mitigate the spread of harmful content."
Key Numbers
- 42% of social media users report experiencing online harassment
- $3.2 billion is the estimated annual cost of online harassment to the global economy
- 100 million is the estimated number of people affected by online harassment worldwide
Key Facts
- Who: Meta and TikTok
- What: Prioritizing engagement over safety
- When: Ongoing
- Where: Global
- Impact: Spread of misinformation, normalization of harmful behavior
What Comes Next
As the debate over social media regulation continues, experts say companies need to take proactive steps to address these issues rather than wait to be compelled.
"Social media companies need to take responsibility for the content on their platforms and prioritize user safety," said a digital rights activist. "This means investing in moderation, transparency, and accountability."
In the meantime, users can take steps to protect themselves online, such as reporting suspicious content, curating their feeds, and reviewing their privacy settings.
"It's up to us to demand better from social media companies," said a cybersecurity expert. "We need to hold them accountable for the content on their platforms and prioritize user safety."