Millie’s vision became blurred and then she lost consciousness. Stressed and overworked, she took two weeks off, booked herself into a retreat, and slowly regained a sense of what she thought she’d lost. She’d been nothing more than an information processor for ten years.
Thirty years before the emergence of online search, social media, streaming video, RSS feeds, blog posts and email, Canadian philosopher Marshall McLuhan realised that “we shape our tools and our tools shape us.” The person before technology is no longer the same person afterwards. In ten years of using electronic media, Millie had changed.
“We are driven to fill our lives with the quest to ‘access’ information,” writes communications theorist Neil Postman in Technopoly: The Surrender of Culture to Technology. “For what purpose or with what limitation it is not for us to ask; and we are not accustomed to asking, since the problem is unprecedented. The world has never before been confronted with information glut and has hardly had time to reflect on its consequences.”
The average American consumes “information” 11.4 hours each day, according to a study conducted in 2009 by the Global Information Industry Centre. If Australia isn’t on par, it’s certainly a glimpse into our near future. The information consumed comes from myriad technologies that Western society has foisted on its people, starting with the printed word, the photograph, the typewriter, transatlantic cable, motion pictures, wireless telegraphy, and more recently computer technology and the Internet.
Since the advent of the Internet, the amount of information available for consumption has mushroomed. According to Robert McChesney’s Digital Disconnect, by 2010 the amount of data created on the Internet every two days was equivalent to all extant human cultural artifacts and information created from the dawn of time until 2003. We’re awash in information, so much that we can’t quite remember what we’ve seen, heard or read.
Roman philosopher Seneca believed that memory is the essence of self. Our memories of what we do in a lifetime – whether we climb mountains barefoot or sit in an ergonomic chair at a desk – become the scaffolding of our unique self. “As individuals express their life, so they are,” write Friedrich Engels and Karl Marx in The German Ideology.
But not all memories stick. Most sit in our short-term memory for just a few seconds, never to be recalled again. It’s our long-term memory that we rely upon for reflection and deep understanding, and it’s here that we develop an idea of self. We’re defined by the memory that we saved a child from drowning at eight years of age; that we speak French as well as English; that we gained a literary mind from studying literature at home.
Cognitive scientists point to long-term memory as the powerhouse of the brain. It is a vast reservoir with almost limitless capacity. With each expansion of our long-term memory comes an enlargement of our intelligence, writes Nicholas Carr in The Shallows. The very act of remembering, he writes, modifies the brain in a way that can make it easier to learn ideas and skills in the future.
Carr mentions a study conducted by German psychologists Georg Müller and Alfons Pilzecker. It takes an hour or so, the psychologists concluded, for memories to become fixed, or consolidated in the brain. “Short-term memories don’t become long-term memories immediately, and the process of their consolidation is delicate. Any disruption, whether a jab to the head or a simple distraction, can sweep the nascent memories from the mind.”
Australian educational psychologist John Sweller is an expert on cognitive overload. Sweller believes that when too much is going on in our working memory, the successful transfer of information from working memory to long-term memory just doesn’t happen. Eastern philosophers might liken it to the cup that’s already full, whereupon more water poured on top simply cascades over the edge. To prevent cognitive overload, no more than four pieces of information should be processed at any given time, but the optimal amount is probably less, at two, thinks Sweller. Less is more when it comes to memory retention.
The glaring problem for educational theorists is that distraction is the enemy of learning, and most technologies are what Carr refers to as “interruption machines”. Reading online comes complete with hyperlinks, banners and images; we encounter constant updates from feeds, posts, instant messaging, and notifications.
If, as Sweller suggests, we optimise learning by limiting our input to some two pieces of information, then what do we make of the technological disco parlours that Western civilisation has invented over the course of time? What are the consequences for our long-term memory? What are the consequences for our self?
When the key to memory is attentiveness and deep concentration, when we must give ourselves time to form connections with periods of rest and contemplation, what do we make of the Internet, the television, the radio, and all these other forms of technology that have been thrust upon us with no thought to the human being at the helm?
Postman argues that the tie between information and human purpose has been severed. We live in an age of technological progress, but no longer human progress. “Information appears indiscriminately, directed at no one in particular, in enormous volume and at high speeds, and disconnected from theory, meaning and purpose… Information is dangerous when it has no place to go, when there is no theory to which it applies, no pattern in which it fits, when there is no higher purpose that it serves.”
As a good citizen, Millie had accommodated herself to the various technologies at her disposal. She had watched television from home, made and received calls on her mobile phone, worked on her desktop computer, surfed the net on her laptop, texted on her smartphone; she had uploaded and downloaded photos and videos, cut and pasted, friended, followed, liked, pinned, and she had experienced “rich media.” But while Millie worked with all of her 21st century tools, what had become of her?
If the art of remembering is the art of thinking, as philosopher William James once said, then what happens when our memory becomes so sorely impaired by information overload that we can’t remember a thing?
After ten years on the information front line, who was Millie?
This article first appeared in New Philosopher magazine, a quarterly print publication for people who like to think.