Companies allowed more harmful content on users' feeds, knowing their algorithms ran on outrage, the BBC hears.
Whistleblowers alleged that social media giants including TikTok and Meta allowed harmful content to circulate on their platforms, the BBC reported.
In a battle for attention, companies took risks with safety on issues including violence, sexual blackmail and terrorism.
Meta plans to test X's Community Notes algorithm to crowdsource fact-checks that will appear across Facebook, Instagram, and Threads. In a blog post, Meta said the testing in the US would begin ...
Meta has decided to let Threads users make custom tweaks to its all-important algorithm, but don't expect your preferences to stick, and do expect to bring your best manners. ... The social ...