Book Review: Shoshana Zuboff, The Age of Surveillance Capitalism, Profile Books, 2019
I am writing this review in two parts.
The first part is my own relatively short summary of the argument of Zuboff’s very large book. This is what is included here. Most of the argument is Zuboff’s.
The second part is a supplementary text exploring the implications and impact of the bio-physical issues that underpin “surveillance capitalism” – its energy intensity, its carbon footprint, the fact that it is trying to sell a consumption lifestyle that is faltering because of resource constraints, pollution and debt, and the health and environmental consequences of radio-frequency electromagnetic fields. All of these are entirely neglected by Zuboff, yet all are important to the prospects for “surveillance capitalism”.
Part One
Few books have had as great an impact on me as this one. It took me five days to read and revealed a world that was completely unexpected and quite new. Zuboff is aware that what she describes is novel and unprecedented, unlike anything that has gone before – and that is what makes us vulnerable. She uses the analogy of the indigenous peoples of the Caribbean first meeting the Spaniards.
“When the Tainos of the pre-Columbian Caribbean islands first laid eyes on the sweating bearded Spanish soldiers trudging across the sand in their brocade and armour, how could they possibly have recognised the meaning and portent of that moment? Unable to imagine their own destruction, they reckoned that these strange creatures were gods and welcomed them with intricate rituals of hospitality”.
In an analogy that will soon become clear, we likewise imagine that the flood of computer and smartphone apps and social media platforms is there, free for our use, courtesy of Google, Facebook, Amazon and the other surveillance capitalists, but we do not realise how it comes about that we can get all these things for free. What are these companies selling that brings them billions?
The answer is that it is a mass of information about us that is on sale – information about our experiences, our lives and those of our friends, colleagues, associates and communities. We imagine we are searching with Google but do not see the extent to which Google is researching us – constructing models of our lives as “life patterns” for targeted marketing.
The lives that we thought of as private are being turned into data about us – the questions we ask the search engines, where we are and have been according to the mobile phone and geolocation data we generate, the text we machine translate. There is information about us from conversations “overheard” by “Cortana” and “Alexa”, and snippets of conversation overheard by smart TVs – “please pass the salt, we’re out of laundry detergent, I’m pregnant, let’s buy a new car, we’re going to the movies now, I have a rare disease, she wants a divorce, he needs a new lunch box”. These snippets of communication overheard by smart machines are sent back to companies that lead in voice recognition and sold on to third parties for whatever use they have for the information – including companies wanting to market to us, but also insurance companies that want to assess us for risk, the security forces of the state, and political parties interested in us as political beings.
This book is full of examples of how far things have already gone and how much further they are planned to go. For example, the American Journal of Medicine published a study in 2016 which looked into Android-based diabetes apps – examining 211 apps and randomly sampling 65 of them for close analysis of their data transmission practices.
The researchers identified a good deal that users were unaware was happening, including apps that delete or modify information (64%); read phone status and identity (31%); gather location data (27%); view wi-fi connections (12%); and activate the camera to access photos and videos (11%). Between 4% and 6% of the apps went even further: reading contact lists, calling phone numbers found in the device, modifying contacts, reading call logs, and activating the microphone to record speech. Of the 211 apps, 81% did not have privacy policies and, of these, 76% shared sensitive information with third parties. Even among those that did have privacy policies, 79% shared data. As Zuboff argues, what are called “privacy policies” should more appropriately be called “surveillance policies”.
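To make concrete the kind of audit the researchers carried out, here is a minimal sketch of my own (not the study’s actual methodology or tooling) showing how one might flag the sensitive Android permissions mentioned above in an app’s manifest. The file path and the watchlist of permissions are assumptions for illustration only.

```python
# Illustrative sketch only: flag sensitive permissions requested in an Android manifest.
# Not the study's tooling; the watchlist and file path are my own assumptions.
import xml.etree.ElementTree as ET

ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

# Permissions roughly corresponding to the behaviours the study describes
WATCHLIST = {
    "android.permission.READ_PHONE_STATE": "read phone status and identity",
    "android.permission.ACCESS_FINE_LOCATION": "gather location data",
    "android.permission.ACCESS_WIFI_STATE": "view wi-fi connections",
    "android.permission.CAMERA": "activate the camera",
    "android.permission.READ_CONTACTS": "read contact lists",
    "android.permission.WRITE_CONTACTS": "modify contacts",
    "android.permission.CALL_PHONE": "call phone numbers",
    "android.permission.READ_CALL_LOG": "read call logs",
    "android.permission.RECORD_AUDIO": "activate the microphone",
}

def flag_permissions(manifest_path: str) -> list[str]:
    """Return descriptions of watchlisted permissions requested by the app."""
    root = ET.parse(manifest_path).getroot()
    requested = {
        elem.get(f"{ANDROID_NS}name")
        for elem in root.iter("uses-permission")
    }
    return sorted(WATCHLIST[p] for p in requested if p in WATCHLIST)

if __name__ == "__main__":
    # Hypothetical manifest extracted from an app package
    for behaviour in flag_permissions("AndroidManifest.xml"):
        print("App can:", behaviour)
```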
One can go on and on. Take, for example, the technologies that create smart clothing with sensors woven into the fibres, able to interpret body movements and gestures – thus enabling the interpretation of emotional reactions, alongside the technologies for recognising faces and emotional expressions. The resulting “emotional analytics” is not made by the wearer for their own purposes but by unseen organisations for surveillance and marketing purposes.
For years non-human wild animals have had sensors and transmitters attached to them to see what they are up to and how they live. Because the animals are unaware that they are being observed, scientists get a genuine picture of their life and behaviour in the wild. Now an analogous process is being applied to us humans too. Just as it is important that non-human animals are unaware that they are being observed, so the study of humans is being done as far as possible below the level of their awareness.
All of this private human experience, transformed into data, is then simply declared to be the property of the covert observers. The business model of surveillance capitalists like Google and Facebook is thus based on stripping those whose private experience has been appropriated of the right to decide for themselves what can be done with information they thought was private to them. The aim of the surveillance capitalists is to sell this stolen information about private lives to third parties for money.
Zuboff describes the aim as being to create and sell “prediction products” to the companies who want to market to you – or perhaps to your insurer, health service provider, or police and security services. You can then, for example, be targeted with what are predicted to be appropriate marketing messages for the right products, at the right time and in the right places.
Advertising is thus no longer generic but personalised to you – delivered just at the moment you are passing the shop, with the geolocation switched on in your smart phone. The “smart” in your smart phone is not a reference to how clever the phone is at serving your needs. It is smart in its services to the surveillance and marketing sector.
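To illustrate the mechanism being described – this is my own toy sketch, not any company’s actual system – a geofence check of the kind that could trigger an offer as you pass a shop might look something like the following. The coordinates, radius and message are invented for the example.

```python
# Toy illustration of a geofenced ad trigger; coordinates, radius and message are invented.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6_371_000  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

SHOP = (53.4808, -2.2426)   # hypothetical shop location
TRIGGER_RADIUS_M = 50       # fire when the phone is within 50 metres

def maybe_push_ad(phone_lat, phone_lon, predicted_interest):
    """Send a 'personalised' offer when the phone enters the shop's geofence."""
    if haversine_m(phone_lat, phone_lon, *SHOP) <= TRIGGER_RADIUS_M:
        print(f"Push notification: 20% off {predicted_interest} - you're right outside!")

maybe_push_ad(53.4807, -2.2427, "running shoes")
```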
To make it “smarter” still, the direction of development is to deploy technologies of behaviour modification – with an increasing range of types of influence available for “nudges” that are ideally not immediately obvious to you.
One such example is the use of mass games like Pokémon Go. In this case the “game within the game” (the game that is not obvious to ordinary players) is about delivering you to bars and restaurants that are part of the market being serviced. In other words, the bars and restaurants are Pokémon destinations to which people are herded, with the covert agenda of increasing the takings of businesses which pay the surveillance capitalists for the footfall.
The development of the Internet of Things in one’s own home provides, or will provide, further examples, as a growing number of household objects become “smart”. This means that the household objects have sensors and transmitters connecting them to the organisations that brought them to market. The transmitters report back on how and when the objects are used, and the aim is an increasing ability to actively engage with the individuals or households that have acquired them.
“We are learning how to write the music, and then we let the music make them dance”, an Internet of Things software developer explains, adding:
“We can engineer the context around a particular behaviour and force change that way. Context aware data allow us to tie together your emotions, your cognitive functions, your vital signs. Etcetera. We can know if you shouldn’t be driving and we can shut your car down. We can tell the fridge “Hey lock up because he should not be eating” or tell the TV to shut off and make you get some sleep, or the chair to start shaking because you shouldn’t be sitting so long, or the faucet to turn on because you need to drink some water.”
All of which to me sounds like living in an automated psychiatric ward, in which it is not psychiatric staff who know what is best for you but a variety of algorithms of ideal behaviour. In this respect Zuboff’s text suggests that ideal behaviour would be perfectly predictable – with a set of algorithms to keep any anomalous behaviour in check. In this kind of world the insurance company will not have to worry, because all the anomalies of ordinary living will arise as problems for you, not for the company. An example would be the car that switches off automatically, giving the insurance company the certainty that you will never drive it dangerously.
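To show how mundane the underlying logic of such an “automated ward” would be, here is a minimal sketch of a context-aware rule engine of the kind the developer’s quote imagines. This is purely my own illustration: the sensor readings, thresholds and device commands are all invented, not drawn from Zuboff or any real product.

```python
# Minimal sketch of a context-aware "behaviour enforcement" rule engine.
# Sensor names, thresholds and device commands are invented for illustration.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]   # inspects the context (sensor readings)
    action: str                         # command sent to a "smart" device

RULES = [
    Rule("no driving when drowsy",
         lambda ctx: ctx["eyelid_closure_pct"] > 70,
         "car.disable_ignition()"),
    Rule("lock fridge after calorie budget",
         lambda ctx: ctx["calories_today"] > 2500,
         "fridge.lock()"),
    Rule("force sleep",
         lambda ctx: ctx["hour"] >= 23 and ctx["screen_on"],
         "tv.power_off()"),
]

def enforce(context: dict) -> list[str]:
    """Return the device commands whose conditions the current context satisfies."""
    return [rule.action for rule in RULES if rule.condition(context)]

# A hypothetical moment in the monitored household
context = {"eyelid_closure_pct": 80, "calories_today": 2600, "hour": 23, "screen_on": True}
for command in enforce(context):
    print("Issuing:", command)
```

The point of the sketch is that nothing in such a system can represent the anomalies of real life – the child in the car, the blizzard, the hospital visit – which is exactly the objection Zuboff raises next.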
What makes this so crackpot is that in real life anomalous events are inevitable and ubiquitous – yet the working assumption is that surveillance capitalists and their customers can automate life in such a way that certainty prevails, just as in the operation of a computer programme. For example, Zuboff takes to task Hal Varian, a senior economist at Google, who declares that it is a lot easier for insurers when they are able to instruct the vehicular monitoring system not to allow an insured car to be started, and to signal where it can be picked up, in the types of new contractual systems possible with the internet of things. Zuboff responds:
“A lot easier for whom? … In Varian’s scenario, what happens to the driver? What if there is a child in the car? Or a blizzard? Or a train to catch? Or a day care centre drop-off on the way to work? A mother on life support in a hospital still miles away? A son waiting to be picked up at school?” (p. 218)
So far I have tried to give a thumbnail sketch of the practical issues described in this huge book, but I have not covered many of its conceptual issues. For example, underpinning Zuboff’s book is a philosophical analysis of the features of totalitarianism and of what makes surveillance capitalism different from 20th century types of totalitarianism. She is at pains to describe it as “Big Other” rather than “Big Brother”.
In George Orwell’s description of 20th century totalitarianism, Big Brother did not want abject submission but “conversion” – hearts and souls captured and reshaped to passionately serve fascist or communist movements.
An antidote to this was written after World War Two by the behaviourist psychologist B. F. Skinner in his book Walden Two. For Skinner the new utopia would not use force – what it needed was adequate behavioural engineering. In Skinner’s lifetime the technology for this did not exist.
In today’s world the instruments are now to hand – in digital smart machines and “big data”. These instruments are owned by surveillance capital, which is developing its tools out of view of those on whom they are being imposed. Following in Skinner’s footsteps are other academics with similar philosophies; Zuboff writes about the career and ideas of Alex Pentland as an example.
In Pentland’s utopia, getting everyone to co-operate requires “social network incentives” which instrumentalise social pressure, especially among people with strong ties.
“….Pentland subscribes to the label Homo imitans to convey that it is mimicry, not empathy, and certainly not politics, which defines human existence. The term itself derives from studies of infant learning, but for Pentland it is a fitting explanation of all human behaviour all of the time: an assertion, like Skinner’s, that control always rests with society. ‘The largest single factor driving adoption of new behaviours’, he writes, ‘is the behaviour of peers’. Because we are born to imitate one another, Pentland argues, the whole species is attuned to social pressure as an efficient means of behavioural modification.”
“Like Skinner before him, Pentland does not shrink from the idea that in the data-driven society computation can reveal the truth of what is correct. A new social class of tuners exercises perpetual vigilance to cure human nature of its weaknesses by ensuring the population are tuned, herded and conditioned to produce the most efficient behaviours. ‘The tools of social network incentives’ are all that are required to establish new norms of behaviour, rather than relying on regulatory penalties and market competition.” (p. 438)
In this “hive” regime individuality is unwanted, as it creates friction that sucks energy away from “collaboration”, “harmony” and “integration”. According to Pentland, “It is time that we dropped the fiction of individuals as the unit of rationality and recognised that our rationality is largely determined by the surrounding social fabric….” (quoted on p. 439)
Zuboff cites studies of the effect of Facebook on the psychological development of adolescents to show where Pentland’s ideas in fact lead. For most of human history, young people growing up in small communities would compare themselves with a small number of young people most like themselves, with whom they were in direct contact. Comparison with a small number of peers was not a threat to their mental well-being.
After the introduction of TV, and now social media, it is very different. Although the contacts (“Facebook friends”) are less likely to involve real-life encounters, there is a huge intensity, density and pervasiveness to the processes of social comparison, often focused on consumption events seen in photos or video clips. Young people strive to inflate their profiles, in which biographical information, photos and updates are crafted to appear ever more marvellous in anticipation of the stakes for popularity, self-worth and happiness. The result is a great deal of unhappiness and mental health problems, in which self-esteem is perceived as dependent on the reactions of others (indicated through the “like” button).
In short, through using digital platforms whose own agenda is selling and consumption, a whole generation is having its psychological development screwed up. Pentland’s “social pressure” takes the form of “I want to be like you”, but the risks of difference and exclusion in the Facebook world of competitive consumption and status displays threaten negative social consequences. This does indeed lead to an inability to develop a strong sense of individuality, but not in a way that makes young people happier. Young people develop a consciousness of themselves as an “it” that others see – as a market object and status competitor.
What Zuboff is describing here is what has been identified by psychologists like Erich Fromm and, more recently, by Oliver James in his book on Affluenza. Facebook and internet platforms like it are mass incubators to turn small children into “marketing personalities” – insecure people who are materialistic, conformist, comparing themselves obsessively and enviously with others, publicising, promoting and marketing themselves….
Zuboff is indignant and outraged – it is inevitable that one is outraged by any form of totalitarianism. She counterposes another sense of what it means to be human: the will to will. That is, put another way, the will to shape one’s own future according to interests that are genuinely one’s own and not given to one. Her writing of this huge book is an example of what she means. One must be the author of one’s own destiny.
And for this one must have the ability to be on one’s own – independent of what others think or believe – and the moral courage to stick to one’s own plans and ideas however many “likes” there are, even when there are none. To add my own comment in support of Zuboff, I am reminded of a remark by Jeffrey Masson, who wrote in a book called “Against Therapy” that “When we read almost any modern autobiography, we see that what was most painful was living in a reality that others did not see or would not acknowledge or did not care about.” And yet this is what real creativity can entail. If we are preoccupied with “social pressure”, what happens is that we abandon unpalatable truths for popularity – where popularity means this year’s narratives or designs created by and for the currently most powerful actors.
Part Two
The second part is a text designed not so much as a criticism of Zuboff’s arguments as a supplement to her text. This is because the issues that she describes have bio-physical dimensions which she has not described – or perhaps even noticed.
That is to say, the digital surveillance system she describes is embodied and embedded in the operation of a physical infrastructure of computers, smart phones, mobile phone masts and data centres – all of which are consuming an increasing amount of energy, emitting a large volume of carbon emissions and generating an increasing amount of radio-frequency electromagnetic fields.
These physical dimensions are energy- and carbon-intensive, contributing to climate change and energy depletion. They are also major sources of radio-frequency electromagnetic pollution, with increasing evidence of health effects on plants, animals and people. To this should be added the fact that the increased consumption the surveillance sector strives to motivate is an ultimately futile agenda – because the consumption sector is destined to contract as the energy to create consumption goods becomes more expensive and erratically intermittent when based on wind, solar and wave power. (The idea that smart technologies will be able to stabilise intermittency is a fantasy, and I will show why.)
The developments that Zuboff so eloquently writes about are thus going to run up against not just mental health concerns and concerns about the violation of privacy, corporate arrogance and overreach, but also physical health concerns and management problems, as the “smart technology sector” collides with the limited carrying capacity of the planet, manifested in crises of economic, public health, environmental and energy management. These intersections are the subject of the supplementary part two.