
WORLD

BBC: Social media algorithms "prioritised outrage over safety"

17 March 2026 01:24

Internal research at Meta and TikTok revealed how social media platforms amplified harmful content to boost engagement, whistleblowers told the BBC, raising fresh concerns over the safety of millions of users.

More than a dozen insiders said the companies knowingly allowed content that included misogyny, conspiracy theories, and other borderline harmful material into users’ feeds, prioritising profits and growth over wellbeing. An engineer at Meta, which owns Facebook and Instagram, said he had been told by senior management to allow more “borderline” harmful content in users’ feeds to compete with TikTok.

“They sort of told us that it's because the stock price is down,” he said.

TikTok employees also gave the BBC rare access to internal dashboards showing how user complaints were handled. One trust and safety team member, identified as Nick, said management instructed staff to prioritise political cases over harmful posts affecting children.

“If you're feeling guilty on a daily basis because of what you're instructed to do, at some point you can decide, should I say something?” he said. Nick advised parents: “Delete it, keep them as far away as possible from the app for as long as possible.”

The whistleblowers’ testimony is featured in the BBC documentary Inside the Rage Machine, offering a detailed look at how social media companies responded to TikTok’s explosive growth. Matt Motyl, a senior researcher at Meta, said Instagram Reels was launched in 2020 without sufficient safeguards.

Internal research shared with the BBC found that Reels had “significantly higher prevalence of bullying and harassment, hate speech, and violence or incitement than elsewhere on Instagram.”

Motyl described a “common trade-off between protecting people from harmful content and engagement,” adding: “Meta's products are used by north of three billion people and the more time they can keep you on there, the more ads they sell, the more money they make. But it's very important that they get this stuff right, because when they don't, really bad things happen.”

Internal documents seen by the BBC revealed that Facebook was aware its algorithms promoted content that angered users and drove engagement.

“Given the disproportionate engagement, our algorithms presume that users like that content and want more of it,” one study said.

Another internal analysis warned that the algorithm offered content creators a “path that maximizes profits at the expense of their audience's wellbeing” and that “the current set of financial incentives our algorithms create does not appear to be aligned with our mission.”

TikTok’s Ruofan Ding, a former machine-learning engineer, described the algorithm as a “black box” and said content was treated as “just an ID, a different number,” with safety teams responsible for removing harmful posts. But as TikTok adjusted its recommendation system almost weekly to boost engagement, Ding said he saw increasing amounts of “borderline” content.

The BBC also spoke to teenagers affected by these systems. One, Calum, now 19, said he had been “radicalised by algorithm” from the age of 14, consuming content that fuelled racist and misogynistic views. UK counter-terror police report a “normalisation” of antisemitic, racist, violent, and far-right posts in recent months.

Meta and TikTok have denied wrongdoing.

Meta said: “Any suggestion that we deliberately amplify harmful content for financial gain is wrong,” while TikTok called the claims “fabricated” and highlighted its safety features for young users.

By Sabina Mammadli

Caliber.Az
