"People often criticize social media for facilitating the spread of misinformation. In this book, we define and describe the value of observed correction, which occurs when direct public corrections of misinformation are witnessed by others. We offer evidence that observed correction gives people a more accurate understanding of the topic, especially when they remember the corrections. We describe how lots of people - social media users, public health experts, and fact checkers among them - are conflicted or constrained correctors. They think correction is valuable and want to do it well, even as they raise real concerns about the risks and downsides of doing so. We demonstrate that simple messages addressing these concerns can make people more willing to respond to misinformation. Mitigating other concerns will require real changes to the very structure of social media and society, and for that reason we need everyone to work together to make observed correction more impactful. Experts should contributeby creating accessible curated evidence (ACE) to facilitate user corrections and correcting publicly to build social norms around responding to misinformation. Platforms should promote corrections and take action against toxic behaviors. Users should feel empowered to correct misinformation when they feel comfortable and confident. Fundamentally, observed correction is an important tool in the fight against misinformation because it is effective and can be scalable if more people are willing to do it"-- Provided by publisher.
While many solutions have been proposed to combat misinformation on social media, most are ineffective, expensive, or unable to work at scale. What if social media users could help mitigate the misinformation they're also responsible for proliferating?
In Observed Correction, Leticia Bode and Emily K. Vraga consider both the power of and the barriers to "observed correction"--users witnessing other users correct misinformation on social media. Bode and Vraga argue that when people view others directly and publicly correcting misinformation on social media, their understanding of the topic becomes more accurate. Yet, while many members of the public value correction, Bode and Vraga find that they are often reluctant to correct misinformation they see on social media. This same reluctance to correct is seen among expert fact checkers and health communicators, compounded by the constraints of limited resources and competing priorities. To empower people to respond to misinformation, Bode and Vraga offer a set of practical recommendations for how observed correction can be implemented. In some cases, simple messages addressing concerns can increase users' willingness to respond to misinformation. In other cases, they argue, platforms will need to promote corrections and protect the correctors, while experts can contribute by creating accessible curated evidence (ACE) to facilitate user corrections and build social norms around responding to misinformation.
Drawing on eleven experiments, seven surveys, and dozens of interviews with social media users, health professionals, fact checkers, and platform employees about their efforts to curb misinformation online, Bode and Vraga make the case that observed correction is an effective and scalable tool in the fight against bad content on the Internet.