Psychologists complete study with 951 participants – spread of misinformation effectively contained
In addition to the “Like” button, social media platforms should add two more buttons to curb the spread of misinformation. This is the proposal of Tali Sharot of the Max Planck UCL Centre for Computational Psychiatry and Ageing Research (https://www.mps-ucl-centre.mpg.de), based at University College London (https://www.ucl.ac.uk), together with colleagues from the Massachusetts Institute of Technology (https://www.mit.edu). The buttons, they suggest, should be labelled “trust” and “distrust”.
Fighting fakes is insufficient
“In recent years, the prevalence of fake news has skyrocketed, contributing to the polarisation of the political sphere and influencing people’s beliefs on everything from vaccine safety to climate change to tolerance of diversity. Existing ways to combat this, such as flagging erroneous posts, have had limited impact,” says Sharot.
One reason misinformation spreads so quickly is that posts are rewarded with “likes” and “shares”, while there is no incentive to share only what is true. To test their hypothesis about the effect of the additional buttons, Sharot and her team built a simulated social media platform that 951 study participants used across six experiments.
On the platform, users could “like” and share posts, half of which were true and half false, and could additionally respond with “trust” or “distrust”. The result: participants used the “trust” and “distrust” buttons more often than “like”. They also posted more true than false information in order to earn “trust” reactions from other users, and shared more true than false posts.
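To make the four-reaction setup concrete, the following is a minimal TypeScript sketch of how such posts and reactions might be represented and tallied. It is purely illustrative and not the researchers’ actual experiment code; the names (`Post`, `Reaction`, `tallyReactions`) are hypothetical.

```typescript
// Illustrative sketch only - not the study's experiment code.
type Reaction = "like" | "share" | "trust" | "distrust";

interface Post {
  id: string;
  content: string;
  isTrue: boolean;       // ground-truth label, known to the experimenters
  reactions: Reaction[]; // reactions received from other users
}

// Count each reaction type for a post, e.g. to compare how often
// true vs. false posts earn "trust".
function tallyReactions(post: Post): Record<Reaction, number> {
  const counts: Record<Reaction, number> = { like: 0, share: 0, trust: 0, distrust: 0 };
  for (const r of post.reactions) counts[r] += 1;
  return counts;
}

// Example: a true post rewarded mostly with "trust"
const post: Post = {
  id: "p1",
  content: "Vaccines undergo multi-phase clinical trials.",
  isTrue: true,
  reactions: ["trust", "trust", "like", "share"],
};
console.log(tallyReactions(post)); // { like: 1, share: 1, trust: 2, distrust: 0 }
```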
New buttons easy to integrate
“Buttons that indicate the trustworthiness of info could be easily integrated into existing social media platforms. Our results suggest that this would result in less misinformation being spread without reducing user engagement. While it is difficult to predict how this would play out in the real world with a broader range of influences, given the serious risks of online misinformation, this could be a valuable addition to ongoing efforts to combat fake news,” says Laura Globig, a doctoral researcher in Sharot’s team.
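As a rough illustration of how little plumbing such an integration might need, here is a hypothetical TypeScript sketch of a reaction schema extended with the two new buttons, plus one possible way to turn the votes into a trustworthiness signal. Every name and the scoring formula are assumptions made for illustration; they are not part of the study or of any real platform’s API.

```typescript
// Hypothetical sketch - names and formula are assumptions, not a real API.
type ReactionCounts = {
  like: number;
  trust: number;
  distrust: number;
};

// One possible trustworthiness signal: the share of "trust" among all
// trust/distrust votes, smoothed so that posts with few votes stay neutral.
function trustScore(c: ReactionCounts): number {
  const votes = c.trust + c.distrust;
  return (c.trust + 1) / (votes + 2); // Laplace smoothing; 0.5 = neutral
}

// A feed ranker could then down-weight posts the crowd distrusts:
function rankWeight(c: ReactionCounts): number {
  return (1 + c.like) * trustScore(c);
}

console.log(trustScore({ like: 10, trust: 8, distrust: 1 })); // ≈ 0.82
console.log(trustScore({ like: 10, trust: 0, distrust: 0 })); // 0.5 (no signal yet)
```

The smoothing step reflects the caveat in the quote above: with no votes, a post is treated as neutral rather than trusted or distrusted, so the signal only kicks in as real-world feedback accumulates.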