Whistleblowers told the BBC that Meta and TikTok prioritised user engagement over safety, allowing harmful content on their platforms. The algorithm race led to increased harassment and ...
Social media giants made decisions which allowed more harmful content on people's feeds, after internal research into their ...
In a battle for attention, companies took risks with safety on issues including violence, sexual blackmail and terrorism.
Meta plans to test out X’s algorithm for Community Notes to crowdsource fact-checks that will appear across Facebook, Instagram, and Threads. In a blog post, Meta said the testing in the US would begin ...
The deals are worth tens of billions of dollars each and strengthen each chipmaker's position with the hyperscaler.