Introducing Stories from the Ivory Tower: a commissioned series of articles that draw their inspiration from academia. First up – Matt Harnett examines books, bits, and what the internet is doing to our memory.


When I was twenty-five, I began to suspect I wasn’t as smart as I’d been at twenty. It was a few tiny things. Where a couple of repetitions had once been enough to commit a phone number or some lines of text to memory, it now took a dozen, two dozen. I didn’t feel as sharp as I remembered being, as quick to learn new things. I’d formerly spend hours at a time reading novels or studying, but now I couldn’t keep still for more than a few minutes. My mind wandered and I became trivially distracted, looped like an ancient scratched record, skimming the same paragraph over and over.

I panicked. Maybe, I thought, I was drinking too much. Maybe I was depressed, or stuck in a rut, or otherwise rudderless. Possibly it was early onset dementia, or some godawful neuro-whatever disease. Maybe it was nothing except a potent mix of ageing and hypochondria, and this was just the beginning of a dimly terrifying descent into premature senescence.

It could’ve been any of those things, but it was definitely something else as well. This slow muddling is happening to you, too, whether you notice it or not. The internet and digital culture weren’t conspiring to make me dumb – not exactly – but they were subtly changing the way I put together my thoughts, quietly tweaking the structure of my brain.

Our brains haven’t changed much in the 40,000 years since we stopped living in caves. Anatomically, you can’t tell the difference between the brain of a modern and a prehistoric Homo sapiens sapiens. Despite this, there’s a growing body of evidence suggesting that culture of any sort has a profound impact on the physical structure of our most complex organ. The blank slate we’re born with is modified over time by our experiences. This has implications for our memory and the way we order our thoughts, our personalities, our very selves. Any change in our brain’s wiring affects the way we’re capable of thinking about things. Our hardware determines what software we’re running.

Somehow, though, our software – our thoughts – can also make meaningful changes to our hardware. We become better at things when we practise them, as the disparate parts of our brain responsible for carrying out a given task slowly get used to working with one another. Neurons that fire together, the adage goes, wire together. One cultural practice in particular seems to significantly rewire our brains, to prime us for thoughts of a different quality and texture, to expand our innate capacities beyond imagining: reading.

It’s strange, because although people are born to suck up language by hearing it spoken, there’s nothing in our evolution, according to French cognitive scientist Stanislas Dehaene, that “could have prepared us to absorb language through vision. Yet brain imaging demonstrates that the adult brain contains circuitry exquisitely attuned to reading”. The implication, he suggests, is that as the brain learns to read, it repurposes some of its pre-existing structure to more efficiently accomplish the new job we’re giving it. The tiny part of our mind that evolved to quickly identify the shape of a panther in shadow, for example, becomes enlarged and makes strong connections with the part that imbues meaning to language. Eventually, we recognise the letter ‘f’ and what it signifies. It’s a small, simple thing, but it’s also magic, and it changes everything.

In some ways it’s difficult to say what people were like before literacy, because by definition there aren’t any records from then. There are clues, though. In a given society, not everyone becomes literate at the same time, and those who are can write about those who aren’t: the philosophers of Greece, or the monks of medieval Europe. Then there are oral histories, passed down between bards until they’re finally inked or inscribed indelibly to paper or something like it. Think the Iliad, or Beowulf, or tales of Anansi the trickster. Finally, there are the societies that still exist without writing: a few scattered tribes in Africa, uncontacted peoples in Papua New Guinea or South America. Taken together, these sources begin to paint a picture of who we once were, and of how we once lived.


If I were to wake up tomorrow illiterate but otherwise unchanged, the nagging concerns of my twenty-fifth year would seem trivial. I would be a fundamentally different person. I would value and speak in clichés, stringing them together to create repetitive, redundant sentences; with no way to write anything down, these fixed formulas would be what helped me remember complex information. To quote theorist Walter Ong, “knowledge, once acquired, had to be constantly repeated or it would be lost: fixed, formulaic thought patterns were essential for wisdom and effective administration”.

If I were a scholar or storyteller, my success would largely depend on my memory. I might extend this frail faculty by employing techniques to help me remember things: I could create mnemonic lists or peg new information to existing well-established memories, like the furniture of my childhood home. These techniques are used today by card-counting gamblers, but they traded just as highly in antiquity. As ancient history scholar Jocelyn Penny Small puts it, “literacy and orality are an exchange that uses the currency of memory”.

The problem is amplified if my newfound illiteracy extends to those around me. With no way to transmit information between generations but our soft throats and softer minds, the cultural canon would dwindle to a hardened kernel, passed down through aphorisms and strictly patterned myths. Anything more complex would be stripped away by time and Chinese whispers. We would still appreciate beauty and style, but not the beauty of a well-drawn character or surprising, insightful prose. We would already know the bones of every story, but some orators could transport us into them and speak them with unrivalled force and feeling.

Perhaps by living in the always-present tense of an oral culture we would feel more, generally. Not constantly exposed to the passions of characters other than ourselves, we would be less able to regulate our own moods and exultations, or have less desire to. It’s swings and roundabouts and silver linings. Literacy didn’t make us smart – not exactly – but it endowed us with the potential for genius.


The half joke, not really funny but reliably repeated when you travel someplace remote or run out of mobile data and you’re arguing a point with a friend but there’s no internet to prove you right (or wrong), is: “Shit, people used to have to live like this”.

Do you remember that time before Wikipedia, before Google? School projects would send us scurrying to libraries, fossicking through index cards, chasing references between obscure codices, praying the required volume of Encyclopaedia Britannica hadn’t been filched. Those were the Bad Old Days when knowledge lived in silos; when facts had to be copied one at a time, by hand; when we had to read and re-read, then get mum to read our essays for spelling mistakes if we wanted a decent grade. Information didn’t want to be free, not yet: it wanted to be kept and consulted, like a clammily desperate mid-career lawyer – by schools, by governments, and most of all by our own fragile, fallible brains.

After the revolution, information comes to us begging, supplicatory, gratis. We face a problem that wouldn’t have made sense when data packets were bundled in pages rather than electrons or photons; a problem that couldn’t have even been formulated thirty years ago. We’re submersed in a crushing sea of information, and, as if to keep ourselves from drowning, we’ve simply decided to close our mouths.

Do you feel crushed? I submit you do not. My contention is you’ve developed a filter you’re not even aware of; a filter that serves you so well it’s difficult to accept you’re giving up something valuable, because that value is relative. You need a filter because more information exists today than on any previous day in human history, and much of it is garbage, and much that isn’t garbage is irrelevant, and much of the information that is relevant, and is of a high quality, you aren’t interested in knowing.

Take books. In the 1980s, an average of 80,000 books per year were being published in the United States alone. By 2010, that figure had more than quadrupled, but I would hazard a guess that the average reading speed has not. This glut of information informs how we choose to read and write. Our time becomes increasingly commoditised, and distractions must become ever more appealing to capture our attention for any length of time. Even prior to Flappy Bird, our mightiest institutions capitulated to this inevitability. In the forty years between 1944 and 1984, scientific papers underwent a metamorphosis so that experimental results became consistently foregrounded in the titles and abstracts, while actual method and procedure sections tended to be shortened and relegated to secondary status: ‘We Examined a Tardigrade With a Scanning Electron Microscope: What Happened Next Will Blow Your Mind (and its Guts)’.

The internet accelerates this trend. We don’t have enough time to do everything, so we have to be picky. We read headlines, graze tweets, and employ a type of reading variously referred to as ‘skimming’, ‘hyperreading’, and ‘bad’ – the antithesis of what we were forced to do in high school: read a single extended text over and over again to eke out meaning and structure and narrative.

At the same time, online writers know we do this and modify their prose to suit our new habits of mind. ‘Writing for the Web’ is a distinct new style characterised by short, choppy sentences, directness and a lack of floridity. It’s as if some terrible version of Hemingway is enslaved behind the curtain of every two-bit digital marketing firm.


We develop tools to fight this vaguely entropic confusion, but in the maelstrom they turn against us. For all the usefulness of Google and Wikipedia and our own new reading practices, data, once placidly organised, seems to breed. In the words of Penny Small, “each tool or technology makes it possible to deal more efficiently with the current accumulation of words, but by virtue of its success propagates yet more works that need yet more techniques to control them”.

With information now so logically organised and ubiquitous that it’s never more than a few taps or voice commands away, we don’t bother committing much to memory except the fact that the information’s out there, somewhere. We’ve forgotten what ignorance feels like, so we don’t value knowledge like we used to.


Concerning the technology of the personal computer, Umberto Eco wrote: “Once I know that I can remember whenever I like, I forget”. Less elegantly, Nicholas Carr penned an article in The Atlantic entitled ‘Is Google Making Us Stupid?’, which he later expanded into a book, The Shallows: What the Internet is Doing to Our Brains.

Disaffection with the digital age is currently in vogue among critics and academics, and fair enough. Even if it’s not making us ‘stupid’, the tendency towards brevity in writing and thought has implications for those parts of our lives we’re meant to consider with deliberative majesty: issues surrounding democracy and justice and economics. Sound bites are rightly reviled as an enemy of sensible policy, yet the formats of television and radio make them imperative, and it would be naïve to believe the internet doesn’t reduce our contemplative capacities in equivalent ways. The agnostic plasticity of the human mind is at once awe-inspiring and uniquely terrifying.

On the other hand, these fears aren’t exclusive to modernity. As literacy slowly osmosed through Greece, Plato had Socrates rail against the technology of writing, concerned that people “will not use their memories; they will trust to the external written characters and not remember of themselves”. Most would now agree that we gain more than we lose, but that’s the beauty of hindsight. Even Carr admits in The Shallows that, “the internet may have made me a less patient reader, but I think that in many ways, it has made me smarter. More connections to documents, artifacts, and people means more external influences on my thinking and thus on my writing”.

I’m probably still as smart as I was, but probably not in the same way. Maybe the subtle changes to my patterns of thought are helping me swim through an ocean of banal trash, even while they’re ruining my ability to remember birthdays without Facebook’s prompting. Maybe it’s possible to retain both sets of skills, or at least to arrest the atrophy before it spreads too deep. There are certainly promising apps and games that pledge to ‘train your brain’, and really do seem to have some effect.

Maybe one day soon the skills for both close reading and skimming will be taught in schools. It could be that, within a decade, considering a lifetime without the internet will become as abhorrent – as repellently half-human – as it is today to consider one without books.

Let’s not advocate an absolutism that proposes we can be made ‘smart’ or ‘dumb’, and that these are rigid states rather than silly labels sealing away the rich broth of human indeterminacy and flexibility. Let’s not imagine the internet as some discount butcher of the mind, or of culture, or of democracy. If print literacy endowed us with the potential for genius, that’s not something the digital literacy of the twenty-first century can take away or elide.

Building upon the foundations of three millennia of the written word, we have to learn to embrace a new diversity of opinion without settling for the voice of the lowest common denominator: to seek out and understand complex truths amid a broken ocean of info-garbage; to find fresh ways of speaking and being with one another that our grandparents couldn’t have dreamed of, and that our parents can’t trust.

People who dismiss the always-on discourse of digital culture as facile are missing the point. As Teju Cole said during a recent interview: “Being active on Twitter means that the literary part of my brain – the part that tries to make good sentences – is engaged all the time. My memory is worse than it was a few years ago, but I hope that my ability to write a good sentence has improved”.

It’s swings and roundabouts and silver linings, but we’re not going to walk away from the roulette wheel of neural plasticity without at least breaking even.


Matt Harnett is a graduate researcher at the University of Melbourne’s School of Culture and Communication. He enjoys travelling places and eating good stuff, and when he’s not making you feel bad about reading BuzzFeed he edits and writes for The Pantograph Punch.

Stories from the Ivory Tower is an online series of creative non-fiction essays that draw their inspiration from academia. These commissions are made possible through the University of Melbourne’s Cultural and Community Relations Advisory Group (CCRAG).