How people get sucked into misinformation rabbit holes – and how to get them out
As misinformation and radicalisation rise, it’s tempting to look for something to blame: the internet, social media personalities, sensationalised political campaigns, religion, or conspiracy theories. And once we’ve settled on a cause, solutions usually follow: do more fact-checking, regulate advertising, ban YouTubers deemed to have “gone too far”.
However, if these strategies were the whole answer, we should already be seeing fewer people drawn into fringe communities and beliefs, and less misinformation circulating online. We’re not.
In new research published in the Journal of Sociology, we and our colleagues found radicalisation is a process of increasingly intense stages, and only a small number of people progress to the point where they commit violent acts.
Our work shows the misinformation radicalisation process is a pathway driven by human emotions rather than the information itself – and this understanding may be a first step in finding solutions.
A feeling of control
We analysed dozens of public statements, from newspapers and online sources, in which formerly radicalised people described their experiences. We identified different levels of intensity in misinformation and its online communities, each associated with common recurring behaviours.
In the early stages, we found people either encountered misinformation about an anxiety-inducing topic through algorithms or friends, or they went looking for an explanation for something that gave them a “bad feeling”.
*Research assistant, University of Technology Sydney
**Associate Professor in Behavioral Data Science, University of Technology Sydney
Source: The Conversation, 23 February 2024, 11.02am AEDT (under Creative Commons licence)