The Death and Life of American Imagination


Jane McGonigal is an award-winning designer of computer games and a passionate believer in figuring out ways to maximize the benefits of technology for a wired generation. On her blog, she writes: “I make games that give a damn. I study how games change lives. I spend a lot of my time figuring out how the games we play today shape our real-world future. And so I’m trying to make sure that a game developer wins a Nobel Prize by the year 2032.”

Maybe she will. Last year she completed her PhD at the age of twenty-nine and was named one of the world’s top innovators under the age of thirty-five by MIT’s Technology Review. Now she teaches at UC Berkeley and the San Francisco Art Institute. McGonigal wonders how people and computers could become so connected that collectively they would act more intelligently than any individuals, groups, or computers. She’s deeply interested in French philosopher Pierre Lévy’s theories of “collective intelligence” and in the impact of internet technologies on the cultural production and consumption of knowledge. Lévy has argued that the internet should “mobilize and coordinate the intelligence, experience, skills, wisdom, and imagination of humanity” in new and unexpected ways. He has also predicted that we are passing from the “Cartesian cogito”—I think, therefore I am—to “cogitamus”—we think, therefore we are.

McGonigal herself believes there’s nothing inherently more imaginative about staring at a tree or drawing with chalk on the sidewalk than there is in confronting a virtual-world monster or creating digital graffiti. “I can give you one or two research findings that say otherwise,” says Stuart Brown. Brown, founder of the National Institute for Play, has recently focused his efforts on the relationship between play and brain development in humans. “You put a subject into a CAT scan or MRI machine—which measure real-time blood flow into the brain—and you have that person looking at a virtual image, say a hand holding a ball, and then compare that to the person looking at an actual hand holding an actual ball.” What goes on in the brain, he says, is entirely different with each process. “The second one activates the frontal cortex and many other areas of the brain in a much more integrated way.” Virtual images stimulate the brain and stimulate imagination, he allows—but “it’s probably arousal without much integration with the whole of the brain.”

Brown’s point touches on a crucial area: the differences between an adult’s nervous system and a child’s. Children’s rapidly forming brains are unalterably influenced by the nature of their experiences. These differences are at the heart of the recent ban on cold medicines for very young children proposed by safety experts at the Food and Drug Administration, an agency that has gone back on its old assumption that children’s bodies are simply smaller versions of adult ones. The same goes for the assumption that TV and computer screens affect a child’s brain in the same way as an adult’s. That is, in part, why the American Academy of Pediatrics urges keeping kids under two away from the TV.

Unlike adults, children do not choose their environments or experiences, or the cultural norms that literally determine the way their brains will develop. And so the developing imagination is at its most vulnerable in babies and toddlers, in grade-school children, and in unfolding adolescents, whose minds are malleable and open and at the mercy of whatever environment and whatever experiences we adults either provide or deny.

British historian Arnold J. Toynbee said that apathy can be overcome by enthusiasm, and that enthusiasm can only be aroused by two things: “First, an ideal, which takes the imagination by storm, and second, a definite intelligible plan for carrying that ideal into practice.” That kind of imagination is the cognitive fuel that put a man on the moon, and that could help forestall the wreckage of another Katrina. But the fate of the American imagination seems also to be governed by an old adage—one that is tricky for cognitive scientists and brain researchers to prove in context, even though it’s simple enough for any first-grader to grasp: If we don’t use it, we may lose it.

