‘Technology ensures that we’re all served our own personalised news cycle. As a result, we only get to hear opinions that correspond to our own, and society polarises.’ Or so the oft-heard theory goes. In practice, however, this doesn’t seem to be true, or at least not for the average Dutch person. Yet according to communication scientist Judith Möller, the influence of so-called filter bubbles could indeed be stronger among groups with radical opinions.
First of all, we need to differentiate between the so-called echo chamber and the filter bubble. You take your place in an echo chamber voluntarily (in the form of a forum, say, or a Facebook or WhatsApp group), meaning you surround yourself with people who tend towards the same opinion as yourself. ‘Call it the modern form of compartmentalisation’, says Möller, who recently received a Veni grant for her research. ‘People have always had the tendency to surround themselves with like-minded people, and that’s no different on social media.’
Using various news sources in parallel prevents a filter bubble
In a filter bubble, you are presented only with news and opinions that match you as an individual, on the basis of algorithms and without your being aware of the process. This bubble is said to be polarising society: everyone is constantly exposed to ‘their own truth’, while other news gets filtered out. But Möller says there is no evidence to support this, at least in the Netherlands. ‘We use various news sources in parallel – not only Facebook and Twitter, but also radio, television and newspapers – so we run little risk of ending up in a filter bubble. Besides, the amount of “news” on an average Facebook timeline is less than 5%. Moreover, it turns out that many people on social media are actually more likely to encounter news that they normally wouldn’t read or seek out, so that’s almost a bubble in reverse.’
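To make the filtering mechanism concrete, here is a deliberately minimal sketch in Python of pre-selected personalisation: items are scored against a user’s inferred interests, and low-scoring items are dropped before the feed is shown. The topics, weights and threshold are all invented for illustration; no real platform’s recommender is this simple.

```python
# Hypothetical sketch of pre-selected personalisation.
# All topics, weights and the threshold are invented for illustration.

user_interests = {"politics": 0.9, "sports": 0.1, "climate": 0.4}

articles = [
    {"title": "Election debate recap", "topics": {"politics": 1.0}},
    {"title": "Cup final report", "topics": {"sports": 1.0}},
    {"title": "New emissions targets", "topics": {"climate": 0.7, "politics": 0.3}},
]

def score(article, interests):
    """Dot product of an article's topic mix with the user's interest weights."""
    return sum(weight * interests.get(topic, 0.0)
               for topic, weight in article["topics"].items())

THRESHOLD = 0.3  # items scoring below this never reach the user

feed = sorted(
    (a for a in articles if score(a, user_interests) >= THRESHOLD),
    key=lambda a: score(a, user_interests),
    reverse=True,
)

for a in feed:
    print(a["title"], round(score(a, user_interests), 2))
# "Cup final report" is silently filtered out: the user never learns it existed.
```

The point of the sketch is the silent drop at the threshold: the user sees a ranked feed but gets no signal about what was removed, which is what distinguishes a filter bubble from a self-chosen echo chamber.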
Bubbles at the fringes of the opinion spectrum
Nonetheless, a great deal of money is being invested in the use of algorithms and artificial intelligence, for instance during election periods. Möller: ‘So there must be something in it. My theory is that filter bubbles do indeed exist, but that we’re looking for them in the wrong place. We shouldn’t look at the mainstream, but at groups with radical and/or divergent opinions who don’t fit into the “centre”. This is where we see the formation of “fringe bubbles”, as I call them – filters at the edges of the opinion spectrum.’
From spiral of silence to spiral of noise
As one example, the researcher cites the anti-vaccination movement. ‘Previously, this group was confronted with the “spiral of silence”: if you said in public, for instance to friends or family, that you were sceptical about vaccination, you wouldn’t get a positive response. And so, you’d keep quiet about it. But this group found each other on social media, and as a consequence of filter technology, the proponents of this view encountered the “spiral of noise”: suddenly it seems as if a huge number of people agree with you.’
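The shift from silence to noise can be illustrated with a toy simulation: how widespread a fringe opinion appears depends on whether a holder of that opinion samples voices at random or through a feed that prefers like-minded accounts. The population size, 5% fringe share and 0.9 affinity below are pure assumptions for illustration, not figures from Möller’s research.

```python
# Toy illustration of the "spiral of noise": the same 5% minority opinion
# looks marginal in a random sample but dominant in a similarity-filtered
# feed. All parameters are assumptions chosen purely for illustration.
import random

random.seed(42)

POPULATION = 10_000
FRINGE_SHARE = 0.05  # 5% of the population holds the fringe opinion

# True = holds the fringe opinion, False = does not.
population = [random.random() < FRINGE_SHARE for _ in range(POPULATION)]

def random_feed(k=100):
    """Unfiltered: k voices drawn uniformly from the population."""
    return random.sample(population, k)

def filtered_feed(user_is_fringe, k=100, affinity=0.9):
    """Filtered: each shown voice matches the user's view with probability `affinity`."""
    return [user_is_fringe if random.random() < affinity else not user_is_fringe
            for _ in range(k)]

unfiltered = random_feed()
filtered = filtered_feed(user_is_fringe=True)

print(f"random sample: {sum(unfiltered) / len(unfiltered):.0%} seem to agree")
print(f"filtered feed: {sum(filtered) / len(filtered):.0%} seem to agree")
```

Run as written, the random sample shows agreement near the true 5%, while the filtered feed shows agreement near 90% – the apparent majority that turns a spiral of silence into a spiral of noise.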
The news value of radical and divergent opinions
And so, it can happen that people with fringe, radical or divergent opinions suddenly become very vocal and visible. ‘Then they become newsworthy, they appear in normal news media and hence are able to address a wider public. The fringe bubble shifts towards the centre. This has been the case with the anti-vaccination movement, the climate sceptics and the yellow vests, but it also happened with the group who opposed the Dutch Intelligence and Security Services Act – no-one was interested initially, but in the end, it became major news and it even resulted in a referendum.’
Consequences can be both positive and negative
‘In my research, I aim to seek out divergent opinions like these, and then to determine how algorithms influence radical groups, to what extent filter bubbles exist, and why groups with radical opinions ultimately do or don’t manage to appear in news media.’
The consequences of these processes can be both positive and negative, believes Möller. ‘Some people claim that this attention leads people from the “centre” to feel attracted to the fringe areas of society, in turn leading to more extreme opinions and a reduction in social cohesion, which is certainly possible. On the other hand, this process also brings advantages: after all, in a democracy we also need to listen to minority opinions.’
Source: Do algorithms make us even more radical? – University of Amsterdam
To find out how researchers track the filter bubble, read the fbtrex report (pdf)
Personalisation algorithms and elections: Breaking free of the filter bubble
Beyond the filter bubble: Concepts, myths, evidence and issues for future debates
Filter bubbles in the Netherlands?
Some fear that personalised communication can lead to information cocoons or filter bubbles. For instance, a personalised news website could give more prominence to conservative or liberal media items, based on the (assumed) political interests of the user. As a result, users may encounter only a limited range of political ideas. We synthesise empirical research on the extent and effects of self-selected personalisation, where people actively choose which content they receive, and pre-selected personalisation, where algorithms personalise content for users without any deliberate user choice. We conclude that at present there is little empirical evidence that warrants any worries about filter bubbles.
Should We Worry about Filter Bubbles?
Pop the filter bubble: Exposure Diversity as a Design Principle for Search and Social Media