Revelations from whistleblower Frances Haugen, first reported in an investigative series and then presented in congressional testimony, show that the company was aware of the harm it was causing.
Growing concerns about misinformation, emotional manipulation and psychological harm came to a head this year when Haugen released internal company documents showing that the company's own research confirmed these harms.

The Conversation gathered four articles from our archives that delve into research explaining Meta's problematic behavior. At the root of Meta's harmfulness is its set of algorithms: the rules the company uses to choose what content you see.
One result is that low-quality information that gets an initial boost can garner more attention than it otherwise deserves. Worse, this dynamic can be gamed by people aiming to spread misinformation. “People aiming to manipulate the information market have created fake accounts, like trolls and social bots, and organized fake networks,” Menczer wrote. “They have flooded the network to create the appearance that a conspiracy theory or a political candidate is popular, tricking both platform algorithms and people’s cognitive biases at once.”
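The dynamic Menczer describes can be sketched as a toy simulation. This is a minimal, hypothetical model of engagement-weighted ranking, not Meta's actual algorithm: posts are ranked by likes, users only see and engage with the top of the feed, and one post gets an artificial head start from fake accounts. The feedback loop preserves that unearned lead.

```python
def rank_feed(posts):
    """Engagement-weighted ranking: the most-liked posts surface first.
    A toy stand-in for popularity-based feed algorithms (hypothetical)."""
    return sorted(posts, key=lambda p: p["likes"], reverse=True)

def simulate(posts, rounds=300, attention_span=3):
    """Each round, users view only the top `attention_span` posts and
    engage with what they see, regardless of quality. Likes feed back
    into the ranking, so early popularity compounds."""
    for _ in range(rounds):
        for post in rank_feed(posts)[:attention_span]:
            post["likes"] += 1
    return rank_feed(posts)

# Ten posts of identical quality; one is boosted by fake accounts
# before any organic users arrive.
posts = [{"id": i, "likes": 0} for i in range(10)]
posts[7]["likes"] = 50  # bot-inflated head start

final = simulate(posts)
# The boosted post never leaves the top of the feed, so it keeps
# accumulating engagement that equally good posts never get.
```

Because attention only flows to what is already ranked highly, the boosted post ends the simulation on top with the largest like count, while seven of the ten equally good posts never receive a single like: the "appearance of popularity" becomes real popularity.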
(Read more: "Facebook whistleblower Frances Haugen testified that the company's algorithms are dangerous — here's how they can manipulate you")