Is The Algorithm The New Gatekeeper Of Content?

14/11/2016

Carat Sydney’s Head of Digital, Fiona Harrop, examines the flip side of our ever-increasing levels of content personalisation, and the role it plays in ‘attitude isolation’.


The increasing personalisation of content on the internet, and our ability to harness it, is becoming the holy grail of marketing. For marketers, this personalisation has huge benefits, driving more effective and efficient communications.

There is, however, another side to this personalisation that we as marketers have a duty to consider. The internet is perceived as an open, democratic source of information and an enabler of diversity. The reality is less idealistic: the algorithms used by multiple platforms mean we are increasingly steered towards topics that reflect our own ideologies.

This leaves a gap, as we are no longer exposed to points of view that differ from our own. In marketing, we use this personalised approach to steer people towards products we know they will like, or believe they will like, based on life stage, demographic, behaviour and geographic location.

A number of terms have been coined for this personalisation, including ‘social media bubble’, ‘filter bubble’ and ‘echo chamber’.

Researchers from Boston University have described an ‘echo chamber’ as “like-minded people who share controversial theories, biased views, and selective news.” The information is repeated back and ends up being believed as fact. The term is now also being applied to the social media space.

Facebook is one such ‘echo chamber’, browsed daily by 1.13 billion people (11 million in Australia). Its news feed algorithm means that the news and views we see come from the people we follow, limiting our exposure to those with similar viewpoints.

I experienced this first-hand with Brexit, where the majority of content in my news feed, and the opinions of the people I followed on Twitter, were all ‘vote remain’. Had my reading been limited to social media platforms, I would have formed the false impression that this was the majority viewpoint. That was obviously far from the case, with the final vote being to leave.
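The feedback loop at work here can be caricatured in a few lines of Python. This is a toy sketch, not any platform’s real ranking system: it simply scores posts by how often the reader has already engaged with each topic, which is enough to show a ‘remain’-heavy history crowding ‘leave’ posts out of the feed.

```python
# Toy illustration only -- not Facebook's actual algorithm.
# A feed ranker that favours topics the user has engaged with before.
from collections import Counter

def rank_feed(posts, engagement_history, top_n=3):
    """Return the top_n posts, scored by past engagement with each topic."""
    affinity = Counter(engagement_history)  # missing topics score 0
    return sorted(posts, key=lambda p: affinity[p["topic"]], reverse=True)[:top_n]

posts = [
    {"id": 1, "topic": "remain"},
    {"id": 2, "topic": "leave"},
    {"id": 3, "topic": "remain"},
    {"id": 4, "topic": "leave"},
    {"id": 5, "topic": "remain"},
]

# A history skewed towards 'remain' pushes 'remain' posts to the top,
# so the feed over-represents the view the reader already holds.
history = ["remain", "remain", "leave"]
feed = rank_feed(posts, history)
print([p["topic"] for p in feed])  # ['remain', 'remain', 'remain']
```

Each round of engagement feeds back into the history, so the skew compounds: the more ‘remain’ content you click, the more ‘remain’ content you are served.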

This insular consumption of information and sharing of views is leading to ‘attitude polarisation’. Cass Sunstein, a Harvard law professor, has been studying these effects, highlighting how this narrow view of information in the social space leads to polarisation of opinion.

Surrounding ourselves with people of a similar attitude, or reading only similar opinions in the social space, simply makes that opinion seem more valid in our minds.

Taking this a step further, could it be influencing people to become even more extreme in their views on a particular issue?

Is it easier to become more radicalised?

Polarisation has always existed, but is new media accelerating the way this information spreads?

Looking at this from a political standpoint, Facebook likes and shares are now how voters communicate their political views. Those views are then reinforced as the content served to them skews towards these behaviours.

This removes the opportunity for them to be exposed to differing opinions and information; instead, they continue to be served content that backs their existing beliefs.

In January 2016, 44 per cent of US adults reported having learned about the 2016 presidential election from a social media channel, outpacing both local and national print newspapers. Following the candidates on those channels gave a view of the campaign through a narrow window.

CNN tagged Donald Trump as the first ‘social media president’. He strategically gained traction through social media and syndicated news networks such as Breitbart and Infowars, rather than trying to win ground with the media elite; a method now being closely studied by politicians.

Editors were the original gatekeepers of content; technology is now taking that role more effectively. If algorithms continue to curate in this way, society is not only at risk of bias towards particular views, but the continued personalisation of media will be detrimental to diversity of thought.

We have a responsibility to ensure this evolves so that we get a balanced diet of content and information, not just reinforcement of our own beliefs and our social circle’s opinions and perspectives. As more channels become personalised, we need to address this before we reach the tipping point of a homogenous society.

As Eli Pariser summarised: “if algorithms are going to curate the world for us, then… we need to make sure that they also show us things that are uncomfortable or challenging or important”.

We have a stake in the way this plays out, and the ability to take control back from the new gatekeeper of diversity of thought. We can actively seek out other views and opinions by clicking on links that share diverse perspectives, following new people in the social sphere and opening up our news feeds to a wider array of topics. Don’t limit yourself.

*This article originally appeared on B&T.
