Melisa Basol on Gates Cambridge: Inoculating Against Fake News

last modified Dec 11, 2018 10:27 AM

This is a Gates Cambridge article.

The original article can be found on https://www.gatescambridge.org/news/inoculating-people-against-fake-news

 

Inoculating people against fake news

Melisa Basol's research investigates how to counter the misinformation spread in fake news about immigrants.

The events of 2016 prompted much reflection on the role of fake news in the UK and US votes, particularly in shaping and manipulating attitudes to immigration. Melisa Basol [2018] was just finishing her undergraduate degree in Psychology at Aberystwyth University at the time. There she had become deeply interested in decision-making processes, including their heuristics and biases.

While considering what to do next, she came across a paper by Dr Sander van der Linden on inoculating public opinion against misinformation about climate change. She started thinking about how effective inoculation theory might be in the context of Brexit and, specifically, in relation to misinformation about immigration. She wrote to Dr van der Linden with her proposal and soon began her MPhil in Social and Developmental Psychology at the University of Cambridge.

For her MPhil she examined attitudes towards immigration: whether they could be changed, and whether people could be protected from misinformation claiming that immigrants posed an economic threat to the UK.

For her PhD, for which she received the Gates Cambridge Scholarship, Melisa has been drilling down into inoculation theory. The theory dates back to the 1960s and has its origins in Cold War fears of propaganda. The idea was that propaganda spreads like a virus, and that the way to counter it and build attitudinal resistance was to create a vaccine: exposure to a weakened form of the virus, along with enough information to protect a person against the real misinformation.

The fake news virus

Melisa says that it is very difficult to fight back against individual fake news articles once they have been released. “Fake news is like a virus,” she says. “It spreads fast and deeply. It jumps from host to host and is very difficult to dismantle once the ideas are out there. At the Social Decision-Making research lab, we have been working on something called pre-bunking. The idea is to equip individuals with the skills they need to detect misinformation.”

As a member of the Social Decision-Making Research Lab at Cambridge’s Department of Psychology, she has been working on plans to generalise a ‘bad news’ game created by Jon Roozenbeek and Sander van der Linden. It is an online choice-based game that encourages players to walk in the shoes of a “fake news tycoon”. Players learn about the six most common strategies used to produce and spread fake news.

The lab has just won a £50K WhatsApp Research Award for Social Science and Misinformation to have the game ready for use before next year’s Indian elections. They are trialling it on a demographically representative sample of people in India to make sure the intervention works. The aim is to investigate the psychological mechanisms behind the spread of misinformation on messaging applications such as WhatsApp in non-Western countries. The researchers hope to further explore the extent and longevity of the attitudinal resistance conferred by choice-based interventions.

The award came about because WhatsApp is concerned about the sharing of misinformation on its platform in India, which has resulted in violent attacks. The company has been looking not only to make such sharing more difficult, but also to develop a model that can be trusted.

In addition to working on this project, Melisa is investigating cross-protection, the extent to which attitudinal resistance to one issue can be extended to linked topics. She is also investigating how inoculations against misinformation can themselves be spread, in the hope that, by outpacing the transmission of misinformation, “herd immunity” can be established.

Picture credit: Nick Saffell