Whistleblowers from Meta and TikTok have alleged that both companies prioritized engagement over user safety, allowing harmful content to proliferate on their platforms. A new BBC documentary, “Inside the Rage Machine,” highlights internal decisions that favored “borderline harmful content” to compete with rivals. TikTok employees reported being instructed to give political content precedence over serious child-safety cases, while Meta’s algorithms were found to maximize profits at the expense of user wellbeing. Both companies have denied the allegations.