Zappiens unreads

Are our minds being dumbed down by digits – or set free by unreading?

Frank Furedi notes that it has become common to deplore a growing tendency to inattention. In fact, he says, this kind of complaint goes back to the eighteenth century. Very early on, the failure to concentrate was treated as a moral failing rather than a simple inability; Furedi links this with the idea that attention to proper authority is regarded as a duty, so that inattention amounts to disobedience or disrespect. What has changed more recently, he suggests, is that while inattention was originally regarded as an exceptional problem, it is now seen as our normal and inevitable state: an attitude that can lead to fatalism.

The advent of digital technology has surely affected our view. Since the turn of the century or earlier there have been warnings that constant use of computers, and especially of the Internet, would change the way our brains worked and would damage us intellectually, if not morally. Various kinds of damage have been foreseen: shortened attention spans, weakened memory, dependence on images, loss of concentration, failing analytical skills, and an inability to pull the torrent of snippets into meaningful structures. ‘Digital natives’ might be fluent in social media and habituated to their own strange new world, but there would be a price to pay. The emergence of Homo Zappiens has been presented as cause for concern, not celebration.

Equally, there have been those who suggest that the warnings are overstated. It would, they say, actually be strange and somewhat disappointing if study habits remained exactly the same after the advent of an instant, universal reference tool; the brain would not be the highly plastic entity we know it to be if it didn’t change its behaviour when presented with the deep interactivity that computers offer; and really it’s time we stopped being surprised that changes in the behaviour of the mind show up as detectable physical changes in the brain.

In many respects, moreover, people are still the same, aren’t they? Nothing much has changed. More undergraduates than ever cope with what is still a pretty traditional education. Young people have not started to find the literature of the dark ages before the 1980s incomprehensible, have they? We may feel at times that contemporary films are dumbed down, but don’t we remember some outstandingly witless stuff from the 1970s and earlier? Furedi seems to doubt that all is well; in fact, he says, undergraduate courses are changing, and are under pressure to change more to accommodate the flighty habits of modern youth who apparently cannot be expected to read whole books. Academics are being urged to pre-digest their courses into sets of easy snippets.

Moreover, a very respectable recent survey of research found that some of the alleged negative effects are well evidenced.

 Growing up with Internet technologies, “Digital Natives” gravitate toward “shallow” information processing behaviors characterized by rapid attention shifting and reduced deliberations. They engage in increased multitasking behaviors that are linked to increased distractibility and poor executive control abilities. Digital natives also exhibit higher prevalence of Internet-related addictive behaviors that reflect altered reward-processing and self-control mechanisms.

So what are we to make of it all? Myself, I take the long view; not just looking back to the early 1700s but also glancing back several thousand years. The human brain has reshaped its modus operandi several times with the arrival of symbols and languages, but the most notable revolution was surely the advent of reading. Our brains have not had time to evolve special capacities for the fairly recent skill of reading, yet it has become almost universal, regarded as an accomplishment almost as natural as walking. It is taken for granted in modern cities – which, increasingly, is where we all live – that everyone can read. Surely this achievement required a corresponding change in our ability to concentrate?

We are by nature inattentive animals; like all primates we cannot rest easy – as a well-fed lion would do – but have to keep looking for new stimuli to feed our oversized brains. Learning to read, though, and truly absorbing a text, requires steady focus on an essentially linear development of ideas. Now some will point out that even with a large tome, we can skip; inattentive modern students may highlight only the odd significant passage for re-reading, as though Plato need really only have written fifty sentences; some courteously self-effacing modern authors tell you which chapters of their work you can ignore if you’re already expert on A, or don’t like formulae, or are only really interested in B. True; but to me those are just the exceptions that highlight the existence of the rule that proper books require concentration.

No wonder, then, that inattention first started to be seriously stigmatised in the eighteenth century, just when we were beginning to get serious about literacy; the same period when a whole new population of literate women became the readership that made the modern novel viable.

Might it not be that new technology is now simply returning us to our natural fidgety state, freed from the discipline of the long, fixed text? Now we can find whatever nugget of information we want without trawling through thousands of words; we can follow eccentric tracks through the intellectual realm like an excitable dog looking for rabbits. This may have its downside, but it has some promising upsides too: we save a lot of time, and we stand a far better chance of pulling together echoes and correspondences from unconnected matters, a kind of synergy which may sometimes be highly productive. Even those old lengthy tomes are now far more easily accessible than they ever were before. The truth is, we hardly know yet where instant unlimited access and high levels of interactivity will take us; but perhaps unreading, shedding some old habits, will be more a liberation than a limitation.

But now I have hit a thousand words, so I’d better shut up.

Footnote Consciousness

I see that while I was away the Internet has been getting a certain amount of stick over the way it allegedly alters our mental processes for the worse. Some of this dialogue apparently stems from a two-year-old piece by Nicholas Carr, now developed into a book. Most of the criticisms seem to be from people who have experienced two main problems: they’re finding that they have a reduced attention span, and they’re also suffering from a failing memory. They attribute these problems to Internet use – but I wonder whether they have made sufficient allowance for the fact that both can also be the result of simple ageing.

I think it’s true that if you don’t use your memory, it gets worse, so it’s superficially plausible that relying on the Internet could have a bad effect; but I don’t think I find myself using the Internet for things I would otherwise have learnt by heart, while I certainly have begun forgetting things I knew quite well before the Internet was invented. So far as my attention span is concerned, it has certainly waned steadily over the whole course of my life: when I was four or five I could spend a long time just examining the patterns made by the grain in a piece of wood (mind you, in those days we had interesting wood, not like the bland stuff they produce these days…). I regret this to some extent, but in another way I don’t regret it at all, because I think it is partly a result of mental improvement, the fruit of accumulated experience. I can tell more quickly now when something is not going to be worth pursuing, and I am less bothered about dropping it quickly. Nowadays, I don’t feel at all guilty about dropping a book after chapter one if reading it looks like a mistake, whereas twenty years ago I would no more have stopped reading a book once started than I would have got up and left a dinner party half-way through. I know now that life is too short.

But there are deeper criticisms of the malign effects of the Internet. Jaron Lanier, in an NYT piece (he too has written a book about it; it’s interesting that both he and Carr, in spite of the alleged waning of attention spans, still thought this quixotic ‘book’ business worthwhile rather than just tweeting their thoughts), suggests that we are increasingly deferring to computers, encouraged by inflated claims made for various pieces of software. This has a serious moral dimension – if we see computers as people, we may be led to see people as mere machines – but it also undermines original creativity, a point developed more in the book (OK, I skimmed a few summaries). We start to see mashups and compilations as more valuable than new work generated from scratch. Perhaps worst of all, we may end up letting stupid algorithms make actual decisions for us.

The first of these points is one that has been made before, and I believe it underlies many people’s aversion to the whole idea of AI. I think it’s undeniable that software producers are gravely inclined to overstate what their programs do, speaking of relatively simple data manipulation as though it involved genuine understanding and originality. But I don’t think that has really devalued our conception of humanity – not yet, anyway. Unless and until someone produces a machine which they claim is a conscious being, that remains a danger rather than a current problem. I don’t think we’re really in danger of delegating important decisions either; letting a computer suggest a track or a book is akin to random browsing of shelves; Lanier himself notes that even the advocates of computers don’t allow the machines to design their products or run their companies.

There’s certainly something in the point about creativity. Hypertext encourages quotation, and I suspect that this has had an influence: an apposite quote is a frequent and respected way of contributing to discussions on popular forums and blogs, to an extent that would seem almost donnish if the quotes weren’t typically from Star Wars or The Simpsons rather than Shakespeare. It must surely be the case that sometimes on the Web people use text quotes, photoshopped images and so on when they would otherwise have chosen their own words or drawn their own pictures; but mostly the copied stuff is extra. It’s a bit like photography: when people could take photographs, they made fewer engravings and oil paintings, but mainly they made many more pictures (and let’s be honest – some of those uncreated paintings and engravings were no loss).

There are deeper issues still: has the Web influenced the way we perceive the world? I strongly suspect that films have to some degree influenced the way I see the world and represent my own life to myself. I can’t be the only person who has sometimes felt an irresistible urge to do a reaction shot for the benefit of a non-existent audience (one day it may exist, if CCTV continues to spread). In one way I think the Internet may have a more pervasive effect. I remarked that the Internet is quotation-driven; but it doesn’t just quote, it comments. You could say, I think, that the essence of Web culture is to display something (text, picture, video) and provide comment in parallel (I suppose I’m exemplifying this as I describe it). I suspect that as time goes on, reality will come to seem to us like the thing presented and our thoughts like the comments. Our consciousness may end up seeming like a set of lengthy footnotes. Perhaps David Foster Wallace was way ahead of us.