First Brexit. Then Trump. Were they related? Were “dark forces” working to manipulate the population at large to achieve their sinister results? A few months ago, I would have thought this perspective quite paranoid. However, after following Carole Cadwalladr’s shocking and surprising investigative reporting into the hidden machinations of invisible algorithms, Google searches and social media manipulation – it doesn’t seem like such a paranoid fantasy after all.
Starting by looking at Google, Cadwalladr exposed how its opaque algorithms were raising antisemitism and Holocaust denial to the top of the results page when users searched related terms. Later, she drew attention to the “dark money” that threatens the integrity of our elections, highlighting an electoral regulatory system that simply isn’t up to the task. Most frighteningly and comprehensively, Cadwalladr’s feature of May 7th in The Observer, The Great British Brexit Robbery: how our democracy was hijacked, exposed a whole series of connected individuals backed by big money with the singular hope of turning things their way – not least Brexit-backing millionaire Arron Banks and creepy alt-right advisor to Trump, Steve Bannon. (Graphic below from The Observer.)
I will have to leave the investigative reporting in the hands of professionals like Cadwalladr (and thank goodness for her and the likes of the still-free press). As a psychotherapist and researcher, I have naturally been interested in how this operates on our psychologies. Can we really be manipulated so easily?
Much of this came to light with Motherboard’s article in January entitled The Data that Turned the World Upside Down, an article that spurred me to write a commentary in i-News entitled The Brexit and Trump Victories Were Won Using Psychological Warfare. Motherboard (reprinting and re-reporting the story published earlier by the Swiss magazine Das Magazin) took us step by step through the gory details of how much of ourselves we give away simply by “liking” a variety of things on Facebook.
Just 68 likes can predict your skin colour, sexual orientation, and political affiliation to an accuracy of between 85% and 95%.
70 likes paints as clear a picture of you as what your friends know about you; 150 likes, what your parents know; 300 likes, what your partner knows. Note, these are just likes, not the content of your posts.
By correlating psychometric tests with people’s behaviour on social media sites like Facebook, researchers are able to “know you” – or at least pinpoint you on a number of crucial personality attributes. In effect, they have turned Facebook into a giant psychometric test, and they then use the data it generates to target you: manipulating algorithms, directing content (real or fake), and siloing you into very specific demographics in order to change your mind.
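The mechanics described above can be sketched in code. The example below is a hypothetical illustration with entirely synthetic data: a simple logistic regression standing in for whatever proprietary models these firms actually use, predicting a made-up binary trait from a vector of 68 page “likes” (the figure quoted above). The variable names and the planted relationship between likes and the trait are my own assumptions, not anything from the studies themselves.

```python
# Hypothetical sketch: predicting a binary trait from page "likes".
# All data here is synthetic; the point is only to show the shape of
# the technique, not to reproduce any real model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_pages = 500, 68            # 68 likes, per the figure above

# Each row is one user; 1 means they "liked" that page.
likes = rng.integers(0, 2, size=(n_users, n_pages))

# Invent a ground truth: the trait depends on a hidden mix of pages.
hidden_weights = rng.normal(size=n_pages)
trait = (likes @ hidden_weights + rng.normal(scale=0.5, size=n_users)) > 0

# Fit a classifier on likes alone and check how well it recovers the trait.
model = LogisticRegression(max_iter=1000).fit(likes, trait)
accuracy = model.score(likes, trait)
print(f"in-sample accuracy from likes alone: {accuracy:.2f}")
```

Because the synthetic trait is planted as a linear function of the likes, the model recovers it easily; the real-world finding is that genuine traits turn out to be almost as recoverable.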
It’s fair to ask just how vulnerable we are to having our minds changed in a direction we don’t want them to go. The scary thing is, not all of our minds need to change. As Cadwalladr pointed out, the Brexit vote came down to about 600,000 people – just 1% of the population. Only that one per cent needed to tip in order for things to shift. In the Trump election, only a few tens of thousands of votes really mattered, in just a couple of states in play. Cadwalladr notes:
“It’s not a stretch to believe that a member of the global 1% found a way to influence this crucial 1% of British voters”
The way in which these “dark forces” hope to influence those on the fence is as fascinating as it is darkly manipulative. If you have a look at this video of Cambridge Analytica’s Alexander Nix, you can see how a precise understanding of an individual’s personality profile can open a way straight into some of their core decision-making areas. If you skip ahead to four minutes, you can see how two different personality types are approached differently to achieve the same end: one angle appeals to fear and insecurity, the other to pride and tradition.
The degree of intentional manipulation deployed by these agencies is abhorrent, not least because they operate outside of any agreement or consent – from the way the personal information is acquired in the first place right through to how news stories or advertisements are delivered so precisely to these individuals.
We should be pretty worried. Mostly, our worries should concern the “black box” algorithms that are used across social media and what is done with our personal data – particularly data with the capacity to identify vulnerable parts of our personality. Secondly, we should be worried about the purposeful use of this data by individuals and organisations in an effort to shift the public mind through disinformation and psychological warfare. And we should be worried that the checks and balances we have built to deal with such potential are well behind the curve when it comes to the current state of the media landscape.
We should be worried about the ways in which we are siloed into filter bubbles, where we surround ourselves with the people and media outlets we’re comfortable with and distrust information from the other side. This is now exacerbated both by innocent algorithms that show us what we want to see and by pernicious algorithms purposefully engineered to manipulate us into certain points of view – a manipulation that involves the creation of fake news to shift public opinion while sowing mistrust in the basic information establishment at the same time.
So can these algorithms really “know” us as individuals? The good news is that I believe they don’t.
Big data is interested in the what, not the why. It is interested in the many, not the individual.
Much of the recent news about how vulnerable we are to being known by social media and other means arose from a scientific paper intriguingly entitled Computer-based personality judgments are more accurate than those made by humans. Let’s just say the authors of this paper may have been more interested in marketing it than in the veracity of that claim. First of all, while the computer judgements were indeed slightly better than the human ones (r = .56 for computer “guesses” and r = .49 for human “guesses”), this difference is statistically significant but really not overly meaningful. This is particularly the case when we learn that the humans and computers were guessing people’s positions on a 10-item self-report version of the five factor model psychometric test – a measure that, when done properly, has 300 items! The authors note the study’s limits in their own words:
“Nevertheless, human perceptions have the advantage of being flexible and able to capture many subconscious cues unavailable to machines. Because the Big Five personality traits only represent some aspects of human personality, human judgments might still be better at describing other traits that require subtle cognition or that are less evident in digital behavior. Our study is limited in that human judges could only describe the participants using a 10-item-long questionnaire on the Big Five traits. In reality, they might have more knowledge than what was assessed in the questionnaire.”
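One way to see how modest the gap between those two correlations is in practice: squaring a correlation coefficient gives the proportion of variance in the self-report scores that each kind of judge actually accounts for. A quick back-of-the-envelope check:

```python
# The paper's headline comparison, restated as variance explained (r squared).
r_computer, r_human = 0.56, 0.49

print(f"computer: {r_computer**2:.0%} of variance explained")  # 31%
print(f"human:    {r_human**2:.0%} of variance explained")     # 24%
```

In other words, even the “more accurate” computer judgements leave roughly two-thirds of the variance in people’s self-reported personalities unaccounted for.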
As of today, my conclusion is that human beings are far too complex to be “known” by computer algorithms. However, these algorithms do come into their own when it comes to large numbers of people, and more and more they are likely to be utilised in broad social brainwashing campaigns. I believe we have already seen this in the Trump and Brexit campaigns, which I believe would not have concluded the way they did without these creepy interventions.
So what can we do about it? Predictably, I think part of the solution is self-knowledge – individuals who are self-aware and reflexive are less likely to fall hook, line, and sinker for social manipulation campaigns like these. However, we cannot rely on the entire population having spent enough time in psychoanalysis or meditation to be immune to such psychological assaults. Our systems also have to be better prepared – the electoral board, for example – and our social networks have to take more responsibility too. As always, as Ms. Cadwalladr is doing, we must “follow the money” and hold people to account.
There’s also a big education job to do. There are inflection points along this system whereby we can intervene:
Self-knowledge and reflexivity can help us become more solid in our responses to that which affects us from the outside. This is the first point: to have a better relationship between the conscious self and the unconscious. Secondly, we can be more thoughtful about what we engage with through technology and how we engage with others across it. The technology itself must get with the programme, and social media giants like Facebook have a lot of work to do. Lastly, the way in which we perceive and engage with others over technology has to be reassessed with a critical eye – choices have to be made to pop our filter bubbles, listen to other perspectives, and rid ourselves of the “fake news” that enables irrational appeals to the darker parts of the self. In short, there is a lot of work to be done.
I will be speaking on this topic at Stillpoint Spaces London on June 22nd. Join us to expand the conversation!
More from me in my book, The Psychodynamics of Social Networking: connected-up instantaneous culture and the self.
You may also wish to join my Author Page on Facebook to keep up with my latest blog posts, events, and news about psychology, social media, and technology.
– Aaron Balick