How social media companies are preparing for misinformation after Election Day


False claims and threats about the 2024 election are expected to get worse after Election Day. How are social media companies preparing?

  • 📰 WBUR

Computer monitors and a laptop display the X, formerly known as Twitter, sign-in page, July 24, 2023, in Belgrade, Serbia. The secretaries of state from Michigan, Minnesota, New Mexico, Pennsylvania and Washington are urging Elon Musk to fix an AI chatbot on his social media platform X, saying in a letter Monday, Aug. 5, 2024, that it has spread election misinformation.

And what it looked like was that these groups, these actors were really trying to harness, to take advantage of the mainstreaming of anti-government sentiment. That's come in part due to the prosecutions of January 6th rioters and the indictments of Donald Trump, and Facebook is really important to these groups.

CHAKRABARTI: Okay. So these are then attempts, whether nascent or not, to actually physically organize post the election, depending on the outcome. And Meta's defense when we spoke to them about this was, Oh, AP3 could mean anything. It doesn't necessarily mean something nefarious, but in one of those examples, the AP3 page they generated was for a shooting range in New Mexico that the group frequents.

OWEN: That's right. I think that especially now that Elon Musk is at the helm of X, formerly Twitter, which I also have trouble remembering, and he's campaigning for Trump, the guardrails are really off.

Have you found that there's reason to be concerned about the amplification of such misinformation and distrust in government, even in the moments after the election? And perhaps most concerningly, in one memo that we viewed, analysts say that they are dealing with what they call a threat gap, since extremists have moved a lot of their communications to encrypted platforms. They're still using Facebook and X, and that's really important for their reach and recruitment, but in terms of their internal chats, we're just not seeing everything we once did, assuming that there is more to see, because they are using encrypted platforms.

And this is a ripe environment for the rapid and perhaps dangerous spread of misinformation about Trump this election. So what are social media companies doing about it? That's what we're going to find out now. And joining us is Yael Eisenstat. She's senior policy fellow at Cybersecurity for Democracy, a research center based out of New York University and Northeastern University.

CHAKRABARTI: Okay.

11 of those 13 social media companies did not respond, and two, Discord and TikTok, declined our interview request. TikTok did send us a statement, though, saying, quote:

EISENSTAT: So I'm actually really surprised that so many companies ignored this request. Ironically, many who study these platforms believe TikTok actually enforces election integrity policies better than most of these companies. They may have other reasons to; they're under scrutiny for other issues here in the U.S. But they do seem to have stricter policies. They don't allow political advertising, for example.

There's still a business incentive here to play all sides, whoever ends up winning the election, but I do think they have a broader responsibility to the public to tell us, despite everything you've heard, despite the fact that we have rolled back a lot of our policies. Let me just give you a really quick example: 2020 election denialism and election lies used to be prohibited on the major platforms for a time.

For as long as they've been around. That might be a harsh assessment, but it does not surprise me that at this critical point, they don't want to take any chances and actually talk about admitting that they have some kind of responsibility here, in terms of helping to ensure faith in the electoral process of the United States.

They're not going to want to speak too publicly about the fact that they are actively playing a role, with some of their tools, in fanning the flames and recommending content, in letting militias and people who are very openly talking about election denialism organize on their platforms, and that they haven't fixed that problem from four years ago.

But what it means is I was working on the political advertising integrity policies while the civic integrity team was working more on sort of the newsfeed and the organic content. So similar work, just different parts of the business. And that is one of the key questions. We've seen that they've had mass layoffs across the industry.

CHAKRABARTI: Well, Yael, hang on here for just a second because I want to bring Cynthia Miller-Idriss into the conversation now. She's founding director of the Polarization and Extremism Research and Innovation Lab, also known as PERIL, at American University, and is a professor at American's School of Public Affairs and School of Education.

CYNTHIA MILLER-IDRISS: Thanks. Hi, Meghna.

So it becomes a whack-a-mole situation anyway, instead of considering the possibility that they might also have an obligation to prevent people from being persuaded by that propaganda in the first place.

And I think that's a terrible mistake, and I think the social media companies are making the same mistake. We've seen in the public health world, with things like diabetes or cardiac disease, that you can invest in public health approaches upstream, 20 years earlier, to teach people about the choices they could make for healthier eating habits or exercise habits that reduce their incidence of disease.

EISENSTAT: Yeah. I could not agree more that just focusing on content moderation, what to take down, what to leave up, is, I agree, the lowest hanging fruit. I do think it's important, especially if you say you have rules around disinformation, around election lies or whatever your rules are, you should actually live up to those rules, and that should be the bare minimum.
