Books will soon be obsolete in schools. Scholars will soon be instructed through the eye. It is possible to teach every branch of human knowledge with the motion picture. Our school system will be completely changed in ten years. – Thomas Edison – 1913
You’ve got to give Thomas Edison credit for his vision. Of course, he had no idea of the cyberspace revolution that would grip mankind from the 1980s onward, but he still wasn’t far off the truth. How many of us feel completely lost without our smartphone and a good WiFi connection? How many complaints are posted online as soon as an internet or TV provider encounters problems with its network? And what does your bedside table look like? An array of USB cables and battery chargers, all plugged into the same socket via a multi-way extension lead?
“Need it, Google it, know it,” just isn’t true
It is not my intention to underestimate the beneficial impact that the World Wide Web has had on our daily lives, and will continue to have for years to come. But I have always considered the internet, and cyberspace in general, as being potentially harmful to the human brain. This view is shared by researchers far more qualified in this field than I am: you cannot replace knowledge with Google. The knowledge that I have acquired and stored in my brain is probably the one thing that cannot be taken away from me, or even replaced. Think about it. You can easily have a hip replacement, or even a heart transplant. But a brain transplant? Even if such a transplant were possible, would you really want one? Probably not, because your brain is who you are, and the knowledge and experiences intrinsically stored there are an integral part of what you are.
But what is knowledge exactly, and how does it differ from information? I am going to pretend that I have been suffering from repeated headaches. I don’t usually suffer from headaches, and I am considering making an appointment with my GP. But first, I enter the search term “headache” on the Google homepage. The search yields more than 92 million hits in 0.59 seconds, with answers ranging from tension headaches to cancer. How can I possibly gain knowledge about what my condition could be without a priori knowledge of how to search through medical papers and information? My brain needs to be trained for such a task and, as in sport, I can only become proficient if I have trained properly. I need to know what is wrong with me; I’ve Googled it; but I am no closer to knowing anything about my condition.
Being bilingual delays the onset of Alzheimer’s by 4-5 years
The above example illustrates the importance of “brain training” for the ability to cope with an abundance of external information. The training, in fact, encompasses the whole learning process, starting on day 1 of an individual’s life. During this process, multiple new connections are made in the brain, resulting in the “strengthening” of the associated brain areas. The language centres of the brain, located in the cerebral cortex, are a particularly good example of this. It stands to reason that you cannot teach somebody a second language if they have not already mastered a first language. What is maybe not so self-evident is that a bilingual individual will learn a new language much faster than a monolingual individual. I have had personal experience of this and can vouch that it’s true: I’m bilingual (English-French) and learned Italian as a third language at school. I’m now living in the Netherlands, and the only difficulty I have with Dutch is actually finding the time to learn it. My wife is Peruvian, and I became conversant in Spanish without taking classes. I also remember that, as a child, I would spend hours listening to a local dialect that my French grandmother spoke with her sister. It is a dialect that is seldom used nowadays, restricted to the region around Nice in the South of France. One day, they were talking about something, and I rather candidly exclaimed: “Hey, Gran, I can understand what you’re saying.” This being said, although I understand Italian reasonably well, I do have difficulty speaking it for lack of practice, having hardly spoken the language since I left school. As for the dialect, although I could understand it perfectly, I was unable to speak it.
This language-learning ability underscores the fact that the brain not only needs training to increase its functionality, but also needs to “keep fit” by not leaving its different areas unused for long periods of time. It has also been shown that a fitter brain is less susceptible to degenerative changes. The brain, contrary to a computer, doesn’t suddenly crash, under normal circumstances. Instead, it undergoes a process politely known as “graceful degradation”. This occurs to us all, and is even occurring to me as I am writing this post. The secret to brain longevity is ensuring that your brain is at a high “fitness” level before it starts to degenerate, thus significantly increasing the time lapse between the onset of degeneration and the appearance of degenerative symptoms. Several studies have shown that the onset of Alzheimer’s disease in bilingual individuals is delayed by 4-5 years compared to its onset in monolingual individuals. The disappointing news, at least for me, is that learning more than two languages doesn’t seem to prolong the delay.
Recently, researchers found surprising results concerning the deposition of amyloid plaques, associated with the onset of Alzheimer’s. When analysing the brains of 8 subjects who died in their 90s, they found significant amyloid deposition in 3 of the subjects who had presented no clinical signs of the disease. Changiz Geula, a neuroscientist at the Northwestern University Feinberg School of Medicine, who led the study and presented the results at the annual meeting of the Society for Neuroscience in San Diego, was as surprised as anyone by the unexpected results.
“What’s significant about these findings is that they show there can be high densities of plaques and tangles in the brains of some elderly individuals who are cognitively normal or even superior.” C. Geula – 2016
A possible explanation for the results could be that the subjects who were “immune” to the onset of Alzheimer’s had brains that were significantly fitter and better trained than those of individuals suffering from the disease. It is conceivable that their brains were degenerating at the same rate as those of affected subjects, but started to degenerate from a much higher level of functionality. Thus, these subjects might only have started showing symptoms at a very advanced age, one they did not live to reach.
“Google is fast, Twitter is sweet, Facebook is fun, but not as fun as Pokemon, and do I have time for anything else?”
Despite its complexity, the brain resembles any muscle in your body. If you never flex your muscles, they waste away, and the same could be said of the human brain. To ensure that your brain doesn’t waste prematurely, you have to make a large number of connections by the time it reaches its peak, at around 26 years of age. It is now known that new brain connections can still be made, and old connections modified, even well into adult life. The only way to do this is by actively gaining knowledge and broadening the scope and nature of your experiences. And this is where the dangers and failings of cyberspace appear. The internet, Facebook, and Twitter are full of information but devoid of knowledge. I would be curious to know how many school students simply copy and paste bits of information gathered from the net into their school essays. The information bypasses their brains and is never transformed into knowledge, unless they make a conscious effort to analyse and learn it. It wouldn’t be so bad if adolescents actually spent time reading on the internet, but most of them don’t.
Kathleen Taylor, a professor at St. Mary’s College of California, has studied ways to teach adults effectively. She believes that if the brain already has a large number of well-connected pathways, these can be maintained, tuned, and even augmented by continuing the learning process and, more importantly, by critically assessing what you have learned and confronting your views with those of others.
“The brain is plastic and continues to change, not in getting bigger but allowing for greater complexity and deeper understanding. As adults we may not always learn quite as fast, but we are set up for this next developmental step.” K. Taylor – The New York Times, Nov 29th 2009
In his widely acclaimed book “Digital Dementia”, the German neuroscientist Manfred Spitzer argues that overuse of the internet and social media contributes greatly to the breakdown of cognitive abilities at an early age, especially those concerning short-term memory. Furthermore, other researchers have suggested that there could be a link between prolonged online action-gaming habits and the onset of psychiatric disorders. This is related to a decrease in the amount of grey matter in the hippocampus, an area of the brain involved in short- and long-term memory, as well as spatial orientation. This reduction is associated with an increased risk of schizophrenia, post-traumatic stress disorder, depression, and dementia.
What is most worrying is the sheer amount of time that young people, adolescents in particular, spend online, and the nature of their online activities. Numerous studies have shown that teenagers spend, on average, 7-9 hours per day on the internet. Most of this time is spent on social media, where the accumulation of passive information and interactions is at its highest. Added to this, multitasking (performing more than one task simultaneously) and rapid browsing habits can substantially shorten attention spans and reduce the ability to concentrate on longer, more complicated texts.
All in all, despite its sheer wonders, cyberspace has limitations and dangers. As with many technological advances, the limiting factor remains human in nature. Man’s relentless quest for ever more sophisticated technology, and his thirst for an ever greater, enhanced cyberspace, may prove responsible for overloading his brain with, paradoxically, fewer connections. This would eventually lead to the premature malfunction of the most beautiful and ingenious computer that man has ever witnessed… the human brain. Pokemon Go may be valuable for enticing teenagers to take a walk in the fresh air, but as far as their brains are concerned, it’s perhaps desirable that he be going, and finally gone, from our smartphones.