Mass surveillance is fundamental threat to human rights says European report (Harding, 2015).
Millions stolen as hackers hit banks (Yadron & Glazer, 2015).
Spyware and smartphones: how abusive men track their partners (Williams, 2015).
This short essay began as a series of reviews for the UK journal Futures. By the time I’d finished, however, it was too long for a reviews section and too uneven to qualify as an essay. So I’ve decided to publish a re-worked version here in three parts. The first takes the view that there’s more to ‘technology’ than a focus on mere ‘stuff.’ It also argues that technical innovations are being actively ‘pushed’ and marketed to us, with little evidence of any ‘pull’ or real demand. It then considers two books that, for very different reasons, failed to impress: Big Data and Who Owns the Future? These act as a prologue to parts two and three, which explore more interesting and productive territory.
Headlines such as those given above suggest that all is not well in the exploding digital realm. In fact, the idealistic hopes of early pioneers and freedom-loving ‘netizens’ have largely dissipated along with anodyne visions such as the majestic ‘information superhighway.’ In place of these over-optimistic projections there’s a growing sense of uncertainty and even of disillusion. The reasons are not hard to find – but then neither are they particularly obvious. The digital realm is not easily grasped or understood. For most people it is a powerful but obscure realm – a kind of ‘nowhere’ or ‘shadow place’ – that lies somewhere beyond direct human sense or control. Yet those with privileged access appear to have almost unlimited influence both for good and ill. Unknown, intangible entities can reach out and destroy centrifuges in a distant country, disrupt civil infrastructure, threaten a Hollywood studio with bankruptcy and empty anyone’s bank account apparently at will. Women have been harmed by ex-partners who’ve tracked their movements and conversations via smartphones. And this leaves aside a host of phishing attempts, scams, identity theft and other on-line abuses.
From an everyday point of view the complex of technical arrangements called ‘the Internet’ now seems to offer both dangers and opportunities in equal measure. So what responses can or should be undertaken? Quite obviously there is a huge and growing literature, so any overview can only sample this vast, complex and evolving area of concern. Yet that’s surely better than not making the attempt at all, since it’s clear that society has become enmeshed in a challenging situation that requires focused attention and a range of carefully crafted responses.
Technology – not merely ‘stuff’
One place to begin is with a key insight that emerged several decades ago from an STS (Science, Technology & Society) perspective – namely that it’s not helpful to think, speak or write about ‘technology’ as if it were merely composed of physical objects. It is, of course, the material existence of a technology that presents itself to our most obvious and external senses. But taken alone such a view reifies what ‘technology’ actually is – the product of long-term social, cultural and economic processes. Hence, many of the most significant characteristics of any particular technology are effectively invisible – both to the naked eye and the unprepared mind. They are not found by examining the ‘things’ that stand before us but by teasing out the patterns inherent in the causative relationships that brought them into being and maintain them over time. Thus to say anything of value about ‘the IT revolution’ or ‘the Internet’ requires that we consider particular items, or suites of technology, in relation to the wider contexts that produced them. That’s where the fun begins because as soon as you look ‘beneath the surface’ of social reality you find powerfully contested dynamics just about everywhere.
It’s no accident that the dozen or so powerful IT-based mega-corporations, whose economic power and reach exceed those of many nations, all sprang from a very specific cultural milieu now known as Silicon Valley. As such they are emergent from a particular worldview and express the values contained within it. It’s here that a critique of technology can begin. It soon becomes clear that the present forms of neoliberal techno-capitalism embody certain inherent features that need to be acknowledged. They are essential to its operation despite being widely obscured, denied or minimised by promoters and beneficiaries. Nevertheless, such characteristics actively shape and condition everything that’s designed, marketed and sold. These hidden ‘drivers’ include the need to ‘free’ markets from effective oversight and government regulation, the pursuit of growth as an unquestioned goal, viewing the natural world as merely a set of resources for human use, promoting diminished views of human beings (as mere consumers or unthinking pawns) and, finally, the concentration of wealth into the hands of ever fewer individuals and groups. This constellation of values and beliefs helps to sustain an economic system fraught with danger and dysfunction that makes less sense with each passing year (Ehrlich & Ehrlich, 2013; Klein, 2014).
The crucial thing to note is that this particular worldview flourished over the very years when it became crystal clear that humanity needed to strike out in a completely different direction. We know this because the evidence is finally in that the present system is on a no-win collision course with humanity and, indeed, the planet itself (Higgs, 2014). It’s no longer possible to deny that the direction we should be collectively pursuing is one that moves decisively away from the diminished rationality of ‘the market’ and its impossible addiction to endless growth as defined during the industrial period. This is not to deny that genuinely innovative, useful and worthwhile uses of IT have emerged. Rather, the view here is that the ‘IT revolution’ has to a large extent been undermined and misdirected by corporatist ideology such that, instead of leading to a ‘better world,’ it further inscribes our collective slide toward civilisational collapse and the dystopian futures that implies (Floyd & Slaughter, 2014).
‘Push’ (not ‘pull’) with ambiguous results
Two other aspects of technical change need to be briefly mentioned. One is that new technologies are, on the whole, seldom sought by anyone representing the general public. Rather, ‘demand’ is created and imposed by powerful organisations through pervasive and relentless marketing along with their sheer financial and economic power. One is reminded here of the aphorism credited to Donella Meadows, who suggested that ‘you don’t have to spend millions of dollars advertising something unless its worth is in doubt.’ Few stand back to question the fact that corporate interests assume they know what’s best for everyone. They are, for example, currently working to persuade us that an ‘Internet of things’ is a ‘really good idea.’ Yet, in the conditions outlined above, any new technology, or suite of them, cannot but be fundamentally ambiguous. So while they are introduced with positive – even showy – fanfares and the repeated enumeration of benefits, there are always hidden dangers and unexpected costs. The latter tend to appear, however, through the social experience of using and applying the new means over time. Most parents of teenage children know very well what this means, as do the ‘lonely hearts’ who look for love on the Internet and end up losing their reputation, their savings or even their life. The Achilles’ heel of the ‘Internet of things’ is simply that it’s one thing to connect millions of devices together but another entirely to secure them.
In a sane and genuinely open world all new technologies would be subjected to rigorous questioning and testing before they were widely applied. Indeed, that was a central purpose of the US Office of Technology Assessment (OTA), which, in its brief lifetime, was established to advise Congress on exactly these matters (Blair, 2013). Nowadays the all-powerful ‘private sector’ in the US has comprehensively seen off this kind of initiative. Yet this has not occurred without cost. We can imagine, for example, what might have occurred if, instead of repealing the Glass-Steagall Act (thereby abolishing the separation between high street banking and high-risk speculative gaming), the US government had put in place a high-powered group to investigate the implications of high-risk speculative credit-default swaps and the like. The Global Financial Crisis (GFC) would certainly have been less serious or possibly averted altogether. But no such attempt was made. Warnings were ignored and taxpayers around the world ended up footing an outrageously expensive bill. While other attempts to institutionalise technology assessment have occurred in a few places, such arrangements unfortunately still remain uncommon (Sclove, 2010).
The upshot is that societies continue to be reinvented wholesale as waves of change (including those generated by new technologies) continue to impact upon them. The right questions about what this means and what needs to be done are neither being asked widely enough nor taken seriously by decision makers at any level. Consequently the resulting distortions and dangers of this ever more risky trajectory remain opaque and poorly appreciated by most people. The review of recent sources in this and the following sections considers how some thoughtful people have responded to these issues.
Big data, small vision
Mayer-Schonberger and Cukier’s book Big Data (Mayer-Schonberger & Cukier, 2013) is sub-titled ‘A revolution that will transform how we live, work and think.’ But the irony – to say nothing of the threat – in this escapes them entirely. The bulk of the book is devoted to arguing that ‘big data’ provides new insights into many otherwise elusive phenomena and in so doing creates new sources of value. The authors demolish some fantasies (for example, that the emergence of IT can be equated with the ‘end of theory’) but concentrate on positive uses of big data. These include the ability to predict the emergence of epidemics and the prevention of aircraft breakdowns through real-time engine monitoring. But they consistently fail to separate what they consider to be ‘good for business’ from what may or may not be good for everyone else. Hence the underlying theme, perhaps, can be summarised as ‘jump aboard or be left behind.’
While some limited acknowledgements are given of ways that previous long-standing occupations and professions have been undermined, the wider social and economic costs are overlooked. There’s a brief section on risks and strategies to minimise them. Yet no attention whatsoever is given to evaluating the culture and worldview from which these changes spring. Nor is there any attempt to consider or evaluate their future implications. Rather these powerful background factors are taken as given and hence remain invisible throughout. As such the book demonstrates a familiar preoccupation with how ‘technology’ will help us to ‘create the future’ along with a strong sense of blinkered optimism.
Lanier’s Who Owns The Future? (Lanier, 2012) is a very different matter. It is a sometimes brilliant, often idiosyncratic but finally disappointing work, which I currently regard as a missed opportunity. As a long-time inhabitant of Silicon Valley, Lanier was involved in some of the early stages of the IT revolution. Yet over time he became uncomfortable with the growing power of a few dominant actors and with rising social inequality. He coined the resonant term ‘siren servers’ to draw attention to a humiliating new reality: the process whereby everyone using the new systems is forced to yield personal information of inherent value in a one-way flow to those who own and operate them. So far, so good.
His solution, in part, was to establish what he calls the principle of ‘provenance.’ That is, to monetise flows of micro-value that would enable individuals to share in the new wealth. Yet this notion is put forward without carefully examining some serious drawbacks – such as the perverse incentives (e.g. ‘badging’, ‘nudging’ and ‘gamification’ – see Morozov 2013, in part three) that are already evident in this fast moving domain. Then, while there’s a good deal of knowledge and passion driving the book, the pro-technology bias vitiates what might have been a more penetrating analysis. The author’s close association with Silicon Valley is clearly evident, as is his inability to tease out some of the ideological ramifications. Such oversights made the book less helpful than it might otherwise have been and I had to put it aside several times. This was less due to inherent difficulty than to the fact that it was so idiosyncratic – brilliant one page, obscure the next. In the end, I was unable to finish it. It didn’t help that I was reading it as an eBook on a tablet. Despite considerable effort I found it impossible to sustain a ‘conversation’ with a book that I could not touch, inscribe or question as I went. Thus one of the costs of eBooks is the distance they create between the reader and the work. So this might be an item to re-visit in future by way of a hard copy edition!
Part two of this series looks at ‘reform and renewal’ and ‘the dark side’ (criminal uses of the Internet) and will appear in a couple of weeks’ time. Part three is called ‘interrogating net delusions.’ It focuses on Evgeny Morozov and his masterly work To Save Everything, Click Here. Comments via email are welcome.
Blair, P. D. (2013), Congress’s Own Think Tank: Learning from the Legacy of the Office of Technology Assessment (1972–1995), Palgrave Macmillan, New York.
Ehrlich, P. R. & Ehrlich, A. H. (2013), Can a collapse of global civilisation be avoided? Proceedings of the Royal Society B: Biological Sciences, 280, 20122845, 9 January.
Floyd, J. & Slaughter, R. (eds) (2014), Descent Pathways, Guest Editorial, Foresight, 16, 6, pp. 485-495. https://foresightinternational.com.au/wp-content/uploads/2015/04/FloydSlaughter-Descent-Pathways-editorial-Final_011014.pdf
Harding, L. (2015), Mass surveillance is fundamental threat to human rights, says European report, Guardian, 27th January.
Higgs, K. (2014), Collision Course: Endless Growth on a Finite Planet, MIT Press, Cambridge, Mass., USA.
Lanier, J. (2012), Who Owns the Future? Simon & Schuster, New York.
Mayer-Schonberger, V. & Cukier, K. (2013), Big Data: A Revolution That Will Transform How We Live, Work and Think, John Murray, London.
Morozov, E. (2013), To Save Everything, Click Here, Penguin, London.
Sclove, R. (2010), Reinventing Technology Assessment: A 21st Century Model, Science and Technology Innovation Program, Woodrow Wilson International Center for Scholars, Washington, DC.
Williams, R. (2015), Spyware and smartphones: how abusive men track their partners, Guardian, 26th January.
Yadron, D. & Glazer, E. (2015), Millions stolen as hackers hit banks, The Australian, 17th February.