Published Mar 29 2018

Facebook, Cambridge Analytica and commercialising humanity

With close to two billion users worldwide, Facebook is one of the most common experiences that humans can share. Why has it become such an integral part of so many people’s lives in such a short time (since 2004)?

Firstly, it taps into what makes us human – our inherent sociality – by enabling us to keep in touch with friends, acquaintances and social circles that matter to us.

Secondly, the advent of the smartphone (since about 2007) has made social media mobile, accessible at almost any time and in almost any place.

An intrinsic part of our sociality is being aware of what others are doing and how it can affect us. Who is up and who is down? What can we do to attract valorising attention, and what should we do to avoid unwanted attention? How do others around us relate to us and to others, and what behaviour is most likely to benefit our own interests?

In effect, being social opens us up to surveillance, and this is where Facebook excels. Through our own desire to see and be seen, it watches us and uses the resulting stream of data points to sell targeted advertising to clients worldwide.

Facebook is brokering our sociality and commercialising our humanity.

Just delete it?

What can we do about this? Can we simply delete our accounts and opt out? Well, yes, that is an option available to all users.

However, in practice many would feel socially excluded, missing out on events and shared experiences among those that matter to them – this is a strong card that Facebook holds.

Furthermore, businesses and organisations realise that customers and stakeholders expect them to have a Facebook account and – especially for smaller groups and businesses – Facebook may be vital to their operations.

Essentially, Facebook has us in a bind.

In addition, the delete option may be a privilege more easily exercised by richer people in richer countries. In some developing countries, Facebook’s ‘Free Basics’ partnership with telecoms corporations means that the cheapest mobile phone packages include unlimited data for Facebook and a selection of other websites (for instance, Wikipedia).

For some, Facebook is the internet – surveys in countries such as Indonesia and Nigeria a few years ago found more people saying they used Facebook than saying they accessed the internet.

Just regulate?

If we can’t stop using Facebook, maybe we can stop ‘bad actors’ using it?

Facebook’s main defence is that it does not provide content, merely acting as a conduit. But this is somewhat disingenuous given its ability to control the content we see. In declaring Facebook’s new mission to be giving “people the power to build community and bring the world closer together,” Mark Zuckerberg also explained that it had worked on artificial intelligence adaptations that resulted in a 50 per cent increase in people joining “meaningful communities”.

Cambridge Analytica is a political data firm at the centre of a political storm right now, accused of harvesting raw data from 50 million Facebook users without their permission. Its apparent ability to use profile information and psychometric data to target specific groups with suggestive advertising leverages the same datasets in much the same way that Facebook itself does.

But while Facebook’s aim of connecting users in groups that are meaningful to them may be laudable, the concerns with Cambridge Analytica are its mercenary motivations and the potential impact on democratic processes.

The immediate solution may be to censure Cambridge Analytica and prevent its parent company, SCL Group, and its proxies, from using Facebook, but what guarantee do we have of Facebook’s motivations in the future?

Facebook’s institutional priorities lie with its shareholders, so what would prevent it from deciding to offer ‘mass influencing’ services to clients in the future?

Same issue, different time

The questions faced now echo those that were raised regarding the influence of various mass media in the 20th century.

The solution then was mostly to adhere to the convention of the ‘right to reply’ and to use public broadcasting to provide some form of legislated level playing field for access to the media by different political parties.

But these depended on the centralised nature of relatively few mass media outlets and distribution channels. The challenge of participatory networked media, where all users can produce content, raises new questions that cannot be addressed by these solutions.

Rather, the solutions lie in understanding and legislating the instructions that animate the quasi-neural networks of data points contained in the biggest databases of human behaviour in history.

These are the algorithms, and the next step will be to consider how to balance the rights of corporations that conceal and exploit these algorithms against the right of society to understand how these corporations are filtering and influencing online interactions.

Julian Hopkins was a communications lecturer with Monash Malaysia until 2018.

About the Authors

  • Monash University

    Monash is one of Australia's leading universities and ranks among the world's top 100. We help change lives through research and education.

    Monash academics are leaders in their fields and our research centres are tackling some of the world's biggest problems. Monash's teaching and learning community is one of the most vibrant in Australia. In addition, Monash has a collection of satellite campuses all over the world.