Helping online users to recognize false scientific content by applying professional fact-checker techniques: that is the goal of a study conducted by a group of researchers from Vita-Salute San Raffaele University and published in Scientific Reports.
The research was coordinated by Carlo Martini, associate professor of philosophy and member of the Research Center in Experimental and Applied Epistemology (CRESA) directed by Professor Matteo Motterlini.
The study involved over 5,000 participants, and its results suggest simple strategies that social media platforms could adopt to help users recognize scientific disinformation, such as a modest financial incentive and a pop-up window reminding users to apply basic fact-checking techniques before sharing content.
The three golden rules of fact-checking
The experiment conducted by UniSR researchers is part of a Horizon 2020 project called PERITIA, whose goal is to better understand how the public defines and identifies "experts", distinguishing reliable from unreliable sources, and thereby to help combat disinformation, especially online.
Folco Panizza, first author of the study, which he conducted during his postdoctoral fellowship at Vita-Salute San Raffaele University, explains:
We have chosen to focus our attention on one of the most studied approaches to combat online disinformation: fact-checking, a set of rules that allow us to verify the correctness and reliability of news reports.
There are three basic rules for checking online information: searching for the same news story on different websites and comparing them, a practice known as "lateral reading"; checking the source (its political and ideological identity, its interests); and applying what is known as "click restraint", that is, refraining from immediately clicking on the first results a search engine presents during any online search (including searches related to the first two points).
Unfortunately, despite their simplicity, fact-checking techniques are less widely known and applied than they should be. That is why it is essential to understand what could nudge users to adopt them more often, helping to create a cleaner information ecosystem.
The online simulation experiments
To observe users' natural behavior when encountering news content online, the researchers developed a highly faithful simulation of Facebook that reproduced its graphics and interactivity.
Within this realistic simulation, participants were presented with a series of articles actually published online, covering a wide range of scientific topics. Some were scientifically reliable and factual; others were false and misleading.
The users involved in the experiment, over 5,000 UK citizens in total, were encouraged to apply fact-checking techniques in two ways: by promising them a financial reward, albeit a symbolic one, and by reminding them, via a pop-up window, how to check the accuracy of the information presented. The experiment showed that both methods are effective and that the best result is obtained by combining them.
Scientific disinformation contributes to a climate of distrust between science and society, fuelling controversy over fundamental issues such as vaccines, the adoption of measures to tackle climate change, and health and social policies.
– says Carlo Martini, who in PERITIA leads the working group “Behavioral Tools for Building Trust” and who coordinated the study published in Scientific Reports.
This is why it is important to find new solutions to combat fake news and to teach users to check sources correctly. In today's digital environment, only by collectively sharing both accountability and control mechanisms will we be able to successfully reduce online disinformation.