Abstract
In recent years, there has been widespread concern that misinformation on social media is damaging societies and democratic institutions. In response, social media platforms have announced actions to limit the spread of false content. We measure trends in the diffusion of content from 570 fake news websites and 10,240 fake news stories on Facebook and Twitter between January 2015 and July 2018. User interactions with false content rose steadily on both Facebook and Twitter through the end of 2016. Since then, however, interactions with false content have fallen sharply on Facebook while continuing to rise on Twitter, with the ratio of Facebook engagements to Twitter shares decreasing by 60 percent. In comparison, interactions with other news, business, or culture sites have followed similar trends on both platforms. Our results suggest that Facebook’s efforts to limit the diffusion of misinformation after the 2016 election may have had a meaningful impact.
Reference
Hunt Allcott, Matthew Gentzkow, and Chuan Yu, “Trends in the Diffusion of Misinformation on Social Media”, TNIT working paper, October 2018.
Published in
TNIT working paper, October 2018