Meta and TikTok let harmful content rise after evidence that outrage drove engagement, say whistleblowers
Social media giants made decisions that allowed more harmful content onto people's feeds after internal research into their algorithms showed how outrage fuelled engagement, whistleblowers told the ...