Facebook misinformation gets SIX TIMES more attention than factual posts, according to NYU researchers
- The study looked at thousands of right and left-leaning news publishers
- It found that those peddling fake news got far more traction than factual sources
- Conservative outlets are more likely to share misleading news, however
- Laura Edelson says 40 percent of far-right sources promote misinformation
- She led the team of NYU and Université Grenoble Alpes researchers
- Edelson says she was cut off from Facebook data last month after the company claimed her research could put it in violation of FTC rules
- The FTC denied this, suggesting Facebook may be trying to shield itself from criticism
- 31 percent of people get their COVID-19 news from Facebook, one group found
Far-left and far-right misinformation on Facebook gets six times more likes and shares than factual posts, a study of thousands of pages has revealed – after the social media giant banned the researchers from the platform.
The peer-reviewed study by researchers at New York University and the Université Grenoble Alpes in France combed through 2,551 Facebook pages from August 2020 to January 2021.
It found that misinformation on both sides of the political spectrum spreads faster than facts from authoritative sources like the World Health Organization and corporate news outlets like CNN.
The study was released after Facebook shut down the personal accounts of the researchers, who had been collecting data for a separate study on political ads.
Facebook claimed that the researchers were ‘using unauthorized means to access and collect data’ in violation of their terms and a 2019 data-privacy settlement with the Federal Trade Commission.
But Samuel Levine, acting director of the FTC’s consumer protection bureau, hit back, saying a consent decree allows Facebook to create exceptions to data collection restrictions ‘for good-faith research in the public interest.’
The academics say the company is attempting to exert control on research that paints it in a negative light.
The NYU researchers found that 40 percent of far-right sources and 10 percent of center or left-leaning sources promote misinformation.
They also said misinformation accounts for 68 percent of engagement with far-right sources, compared to just 36 percent for far-left sources.
The study is likely to fuel claims that Facebook has widened the political divide in the US by reinforcing users’ pre-existing views and sectioning them off into silos.
Company founder Mark Zuckerberg has appeared before Congress numerous times, testifying on issues of privacy and abuse of data.
The research ‘helps add to the growing body of evidence that, despite a variety of mitigation efforts, misinformation has found a comfortable home — and an engaged audience — on Facebook,’ Rebekah Tromble told the Washington Post.
Tromble, who reviewed the NYU study’s findings, is director of the Institute for Data, Democracy and Politics at George Washington University.
The study divided news publishers by their political leaning based on information from NewsGuard and Media Bias/Fact Check.
It found that posts on Facebook pages for left-leaning sites like Occupy Democrats and right-leaning sources like Dan Bongino and Breitbart are equally likely to travel farther than posts from more centrist sources.
‘What we do find is that these ecosystems are just fundamentally different,’ NYU researcher Laura Edelson told CNN. ‘The far-right media ecosystem has a much higher share of sources of misinformation – 40 percent in fact.’
The findings have huge implications during the COVID-19 pandemic. Almost a third of people, 31 percent, get their news about coronavirus from Facebook, according to the Covid States Project.
In response to criticism over fake news during the 2016 election, Facebook modified its news feed to feature more posts from family and friends and fewer from pages and companies.
In July, President Joe Biden said that social media companies that allow the spread of misinformation are ‘killing people.’
Facebook disputes the findings, claiming that pages, which are public and can be ‘liked’ by many users, represent a small part of activity on the social media platform.
‘This report looks mostly at how people engage with content, which should not be confused with how many people actually see it on Facebook,’ said Facebook spokesman Joe Osborne.
‘When you look at the content that gets the most reach across Facebook, it is not at all like what this study suggests.’
He said the company has 80 fact checkers covering over 60 languages to label and reduce the spread of fake news.
Edelson, who conducted the misinformation study, was cut off from Facebook last month after the company argued that her data collection for a different study, on misinformation in political ads, could put it in violation of a 2019 US Federal Trade Commission privacy settlement.
‘This latest action by Facebook to cut off an outside group’s transparency efforts – efforts that have repeatedly facilitated revelations of ads violating Facebook’s terms of service, ads for frauds and predatory financial schemes, and political ads that were improperly omitted from Facebook’s lackluster Ad Library – is deeply concerning,’ said Senator Mark Warner, a Democrat from Virginia, in a statement.
The FTC denied Facebook’s claim.
‘The FTC received no notice that Facebook would be publicly invoking our consent decree to justify terminating academic research earlier this week,’ wrote Sam Levine, the FTC’s acting director of the Bureau of Consumer Protection, in a letter to Facebook.
‘While it is not our role to resolve individual disputes between Facebook and third parties, we hope that the company is not invoking privacy – much less the FTC consent order – as a pretext to advance other aims,’ Levine wrote.
Conservatives have often complained that big social media platforms have silenced their views. During the 2016 election, Facebook seriously considered removing then-candidate Donald Trump’s video calling for a Muslim ban, according to the New York Times.
After Trump pulled off a stunning victory, Facebook’s security engineers presented a study on how fake news spread so rapidly on the platform.
Facebook’s Vice President of Global Public Policy Joel Kaplan argued that shutting down the pages would disproportionately impact conservatives, according to the Washington Post.
Trump was eventually banned from both Facebook and Twitter after he called on his supporters to march to the US Capitol on January 6, where they broke into the building in a riot that killed five people.