If you read the newspapers of the early twentieth century, you realize that everyone was fretting then about the “horseless carriage.” They were positive that this new technology, a carriage that propelled itself without horses, would push humans beyond their natural, God-given, biological limits. They worried it would not be safe because human attention and reflexes were not made to handle so much information flying past the windshield. That debate came to a head in 1904, when the Hollywood film director Harry Myers received the world’s first speeding ticket for rushing down the streets of Dayton, Ohio, at death-defying speed. He was going twelve miles per hour. By 1930, pundits had calmed down about the automobile being too fast for the human brain and human reflexes, but then Motorola came up with a handy new invention called the dashboard radio, and that started a new round of worry. How could anyone pay attention to the roadway with music or commentary or radio soap operas competing for their attention? So now we are again at one of those moments of rapid technological change when attention has our attention.
There’s not only texting while driving (and, I admit, that does seem stupid) but texting in general, playing video games, spending time on social networking sites instead of reading The Brothers Karamazov, and virtually anything else you can imagine. Journalist Virginia Heffernan, in a very sane op-ed piece in The New York Times called “The Attention-Span Myth,” rightly notes that the whole idea of an attention “span” is dubious. Attention doesn’t span anything. It is always on, always moving between one thing and another, diverting itself all by itself (as we know from the 5,000-year tradition of meditation in Eastern religions) even when there aren’t dashboard radios or iPhones to do it for us. But Heffernan’s article was followed, a mere three days later, by a much longer one by Pulitzer Prize-winning journalist Matt Richtel, called “Growing Up Digital, Wired for Distraction,” which seemed to put forth the idea that brain biology has an attentional “set point” and that we are busy exceeding it.
I like the tone of Richtel’s article very much. He has gone far past the punditry of the last five years, in which everything was awful, kids were going to the dogs, and the future would be destroyed by distracted, know-nothing “millennials,” and so forth. He asks honestly and genuinely if kids are “wired” or even “rewired” for “distraction.” I’ve already written a longer blog post in response and, if you’ll forgive the shameless self-promotion, I have a trade book coming out next summer from Viking Press on this exact topic (Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work and Learn). Here, though, I simply want to say that, yes, kids are definitely being “wired” by the technologies they use, because learning, any and all learning, is what shapes the brain. The more you experience certain patterns, over and over, the more efficiently those neural pathways are shaped and, just as important, the more thoroughly unused ones are pruned away.
This is the famous Hebbian principle: neurons that fire together, wire together. What that means is this: if I’m driving down the road with the radio on, I’m really paying attention to only a fraction of what is relevant to driving the car at that very moment. If a dog darts in front of my car, I automatically slam on the brakes. If my neural pathways for “bringing the car to an emergency stop” weren’t well-trodden, I would have to think carefully about the engineering dynamics of car stoppage, the neurophysiology of foot lifting, the causality of foot to brake pedal, and then the energy to make all that happen. I don’t. I simply slam on the brakes.
In times of great technological change like our own, when many of us feel challenged by new ways of responding to the world, it is natural and normal and good that we worry about what the change is doing to our children. But their video games and texting are the best possible preparation they could have for their digital future. We have to unlearn old patterns before our neurons can lead us sleekly and rapidly to an effortless interface with new technologies. If a child is practicing on his Game Boy or Pokémon game, he is learning how to interface with the world without even knowing he’s learning. There is no more efficient way to lay down those neural pathways and streamline every future interaction with the familiar. That which is automatic is what we build upon. It’s how brain biology works. To pay attention in a digital age, some of us may have to unlearn old patterns, practice new ones, and work our way toward the cognitive efficiencies that we call, in common parlance, “attention.” The kids are all right. They are arriving at the same attentional destination in the best possible way: as they play, they learn about the world that they have inherited and, some day, will help to shape.
Banner image credit: h.koppdelaney http://www.flickr.com/photos/h-k-d/3676335098/in/gallery-45394620@N06-72157624428541720/