Susan Greenfield: Computers may be altering our brains

The cyber-world is offering an unprecedented environment; the brain may be adapting. We should try to foresee these changes, both positive and negative

Friday 12 August 2011 00:00 BST


Not so long ago, the term "climate change" meant little to most people: now it is recognised by almost everyone as an umbrella term encompassing a wide variety of topics from carbon sequestration, to alternative energy, to water supplies. Some feel we're doomed, others that the problems are exaggerated, and still others that science can help. Far less acknowledged is an equally unprecedented phenomenon characterising the 21st century which, just like climate change, poses a diverse range of questions. "Mind change" is an appropriately neutral, umbrella concept encompassing the diverse issues of whether and how modern technologies may be changing the functional state of the human brain, both for good and bad.

Yet the mere mention of this possibility has provoked hostility, misrepresentation, and oversimplification of the case. When I suggested in a recent interview that possible connections between an obsessive cyber-life and a decline in empathy should be investigated, my comments drew a barrage of criticism. Most baffling, however, is a refusal to enter discussion. "She thinks that computers can rewire our brains without our permission", was one blogger's salvo, obviously unappreciative of how the brain, especially the human brain, interacts with the environment, while also implying a mind-brain dualism which has long been overtaken by neuroscience.

The wonderful thing about being born a human being is that although we arrive with pretty much all the neurons we will ever have, it is the growth of connections between those brain cells that accounts for the brain's development after birth. We human beings don't run particularly fast, nor see particularly well, and we are not particularly strong compared to others in the animal kingdom: but we have the superlative talent to adapt to whatever environment we encounter. Hence we occupy more ecological niches than any other species on the planet.

The ability to personalise our brain in response to environment and individual experience is known as "plasticity". As we make our unique way through life, we develop our own particular perspectives through the connections between our brain cells, driven and shaped by our specific experiences: it is these connections which are dismantled in Alzheimer's disease, and which normally enable us to associate people, actions and objects within the sequence of episodes that amount to our life-story. Our brain is in constant two-way dialogue with the outside world, shaping and reshaping our unique neuronal configurations into an individual "mind".

The rationale behind mind change therefore runs as follows: the human brain will adapt to whatever environment impinges on it; the cyber-world of the 21st century is offering an unprecedented environment; therefore the brain may be adapting in unprecedented ways. We should try to foresee what these changes might be, both positive and negative: only then can we minimise the threats and harness the opportunities.

Rather than engage with this fascinating challenge, many, including some leading academics, prefer blanket denial, seeking recourse in the mantra: "there's no evidence". Yet first, the sheer plasticity of the brain, well established and widely documented, surely requires some consideration at least as "proof" that we are not as inviolate to external inputs as we might presume. Second, evidence does exist from a range of studies: see, for example, the summaries in Nicholas Carr's The Shallows, Richard Watson's Future Minds, and Sherry Turkle's Alone Together. Third, it seems that social trends could indeed be leaving their long-term mark on the brain: a recent review by Bavelier and colleagues in the journal Neuron discusses possible links between prolonged time spent in the cyber-world and violence, addiction and attentional problems. Further reports of long-term effects are appearing: for example, a relationship between internet addiction and physical brain changes (Kai Yuan et al., PLoS One, 2011), and a decline in empathy over the last 30 years which has accelerated in the past decade (Scientific American, December 2010).

Agreed, even a few swallows don't make a summer, and few scientific papers are viewed unanimously as conclusive: it is normal practice to carry out more research, and for interpretations to be revised as results accumulate. Disagreement is part of science, but flat refusal to debate is not. Part of the problem here is that the hypothesis, that of cyber-induced long-term changes in the brain, is not readily amenable to a definitive litmus test. What kind of evidence might one hope for, within a short window of time, that could conclusively demonstrate long-term transformations in empathy, understanding, identity and risk-taking?

Here are just three examples of the many and diverse questions which I, as a 21st-century citizen, would like to see explored. Could sustained and often obsessive game-playing, in which actions have no consequences, enhance recklessness in real life? How can we convert the information provided by search engines into knowledge and understanding? (As Google's former CEO Eric Schmidt remarked: "I worry that the level of interrupt, the sort of overwhelming rapidity of information... is in fact affecting cognition. It is affecting deeper thinking.") How can young people develop empathy if they conduct relationships via a medium which gives them no opportunity to gain full experience of eye contact, to interpret tone of voice and body language, and to learn how and when to give and receive hugs? This is where experts in autistic spectrum disorders, and autistic-like behaviours, could provide a really valuable perspective.

This century is like no other in being dominated by powerful, all-pervasive technologies. What an irony if such technologies, whilst enabling us to live longer lives, diminished our individual human potential at the very time we had an unprecedented opportunity to express it. Alternatively, an era could dawn in which each individual human mind was stretched, stimulated and fulfilled as never before. We need to draw on the collective expertise of the scientific disciplines, educationalists, the media, policy-makers and, above all, the general public. It would be a better use of time than internecine wrangling.

Baroness Greenfield is a senior research fellow in pharmacology at the University of Oxford
