In The Filter Bubble (Viking, 2011), self-described 'progressive geek' Eli Pariser asks some pertinent questions about changes that have crept up on us while we frittered away the hours laughing at cats that look like Hitler and at people falling over on YouTube. If anything, our access to other influences online is being subtly restricted. Most of us assume, for instance, that when we google a term we all see the same results. Yet since December 2009, Google has personalised our searches. Based on what it knows about us from our digital footprint (which, thanks to fiendishly clever algorithms, is far more than we could imagine), Google puts at the top of our search those websites its computers predict we will find most interesting. In other words, there is no standard Google any more. The site has gone post-modern, if you like, creating subjective rather than objective results. Pariser cites the BP oil spill in the Gulf of Mexico in 2010, when two friends googled BP: one opening results page had nothing about the spill, while the other had nothing but links about it. And these friends were, culturally and politically, very similar people.
Welcome to the era of personalisation.
This subtle but profound change in the way Google presents information when we search for it is only one way in which the corporate world is finessing our online experience, marketing its products and presenting to us only the information its impersonal algorithms think we will be interested in. This can produce beneficent outcomes, saving us from trawling endlessly to get to what we want. Yet it has darker implications. The information washed up on the shore in front of us like detritus largely reinforces existing identities, confirming us in our preferences and prejudices. We are less exposed to what might challenge and enlarge our understanding of the world. With today's internet it is as if we had picked up a telescope for the first time, marvelled at what it could magnify, then decided to look through the other end instead. This has worrying implications for how we are challenged in our ignorance – an imperative of the Gospel. It also suggests a future in which people are confirmed in what they already believe rather than stimulated by alternative ideas – a priority of a healthy democracy.
Allied to these concerns is the stunning development in the data marketers have pieced together about us from our digital lives, like aircraft investigators painstakingly rebuilding the shattered fragments of a crashed plane. The results are similarly disjointed, because impersonal programmes lacking human empathy and intuition cannot do justice to the complexity of a human life. Yet this is how we are being silently judged. Every 'Like' on a Facebook page, every page lingered over in a Kindle book, is processed to build up a picture of who we are.
Some people take a relaxed view of this, because they feel they have nothing to hide. Others are frightened, sure that unseen forces will undermine their lives. Melvin Kranzberg, a professor of the history of technology, observed: 'technology is neither good nor bad; nor is it neutral'. While digital utopians are too blasé about how personal information is processed, and digital dystopians perhaps too paranoid, there are grounds for unease. In communist East Germany, the Stasi would have killed for this information. Many other unreformed regimes are working very effectively to mine this trove. In the end we can only be comfortable with the omniscience of the God whom Psalm 139 extols as the one who has 'searched me and known me'.
We really have no idea what the digital revolution will eventually do with the information we thought was personal. While our media have been full of phone hacking, the silent deep-sea trawl of who we are continues unremarked.