Did Facebook fuel political polarization during the 2020 election? It’s complicated.

Getty Images | Aurich Lawson

Over the past several years, there have been growing concerns about the influence of social media on fostering political polarization in the US, with significant implications for democracy. But it's unclear whether our online "echo chambers" are the driving factor behind that polarization or whether social media merely reflects (and arguably amplifies) divisions that already exist. Several intervention strategies have been proposed to reduce polarization and the spread of misinformation on social media, but it's equally unclear how effective they would be at addressing the problem.

The US 2020 Facebook and Instagram Election Study is a joint collaboration between a group of independent external academics from several institutions and Meta, the parent company of Facebook and Instagram. The project is designed to explore these and other related questions about the role of social media in democracy in the context of the 2020 US election. It's also a first in terms of the degree of transparency and independence that Meta has granted to academic researchers. Now we have the first results from this unusual collaboration, detailed in four separate papers, the first round of over a dozen studies stemming from the project.

Three of the papers were published in a special issue of the journal Science. The first paper investigated how exposure to political news content on Facebook was segregated ideologically. The second paper delved into the effects of a reverse-chronological feed versus an algorithmic one. The third paper examined the effects of exposure to reshared content on Facebook. And the fourth paper, published in Nature, explored the extent to which social media "echo chambers" contribute to increased polarization and hostility.

"We find that algorithms are extremely influential in people's on-platform experiences, and there is significant ideological segregation in political news exposure," Natalie Jomini Stroud of the University of Texas at Austin, co-academic research lead for the project along with New York University's Joshua Tucker, said during a press briefing. "We also find that popular proposals to change social media algorithms did not sway political attitudes."

Ideological segregation

Let's start with the question of whether or not Facebook enables more ideological segregation in users' consumption of political news. Sandra Gonzalez-Bailon of the University of Pennsylvania and her co-authors looked at the behavior of 208 million Facebook users between September 2020 and February 2021. For privacy reasons, they didn't look at individual-level data, per Gonzalez-Bailon, focusing solely on aggregated measures of audience behavior and audience composition. So the URLs they analyzed had been posted by users more than 100 times.

The results: Conservatives and liberals do indeed see and engage with different sets of political news, a sign of strong ideological separation. That segregation is even more pronounced when political news is posted by pages or groups rather than by individuals. "In other words, pages and groups contribute much more to segregation than users," said Gonzalez-Bailon. Furthermore, politically conservative users are much more segregated and are exposed to much more information on Facebook than liberal users; there were far more political news URLs seen exclusively by conservatives than URLs seen exclusively by liberals.

Finally, the vast majority of political news that Meta's third-party fact-checking program rated as false was viewed by conservatives rather than liberals. That said, those false ratings amounted to a mere 0.2 percent, on average, of the full volume of content on Facebook. And political news in general accounts for just 3 percent of all posts shared on Facebook, so it's not even remotely the most popular type of content. "This segregation is the result of a complex interaction between algorithmic forms of curation and social forms of curation, and these feedback loops are very difficult to disentangle with observational data," said Gonzalez-Bailon of the study's findings.
