Tag: children

  • Suffer the Children

    The holiday spirit had barely dissipated last month when close to one hundred fifty people took to the streets to protest budget cuts for early childhood education. One protester was apparently so distressed by the lack of resources that she wailed and threw herself on her knees. Others tried to help her up, but she let her body go limp like an obstinate child. She was, in fact, four years old.

    All told, about two-thirds of the marchers had yet to see the inside of a kindergarten classroom. Clad in orange and sporting “Early Start” and “Strong Finish” signs on their chests and backs, respectively, the preschoolers, along with numerous chaperones, paraded down Nicollet Mall in downtown Minneapolis, the youngest riding in carts pulled by teachers, parents, and volunteers from the YWCA Children’s Center at 12th and Nicollet.

    Despite the goal that adults professed for the protest, the children seemed more concerned with peace. Many of them wore white satin headbands with that word spelled out in glitter and (except for the aforementioned activist) chanted, “We want peace. We want peace” as they skipped and jumped.

    “It’s really more about promoting civic engagement,” admitted Ellen Cleary, a YWCA development specialist, by way of explaining the confusion. When a reporter tried to get a straight answer from various marchers, they responded with the usual indignation, suspicion, and evasiveness, as if they had spied an infiltrator in their midst. One girl impudently thrust out her sign and contorted her sweet little face into a derisive “What—are you stupid?” expression. Another coyly smiled and looked down at her frosty feet, as if to suggest that she was marching for the right to winter boots. A three-year-old boy let out a shriek, buried his face in a nearby shoulder, and refused to answer. After the march, when questioned, four-year-old Nora ran and hid under a table.

    Protected by her gray laminate canopy, she was a little more forthcoming about what she was marching for. “Peace,” she said. And what is peace? Nora giggled and ran for cover again, this time into the arms of a YWCA volunteer. “Do you want to tell?” asked the volunteer. “No!” Nora insisted, and wriggled free of one more interrogator.

    The action on Nicollet Mall, organized by the YWCA of Minneapolis in honor of Early Childhood Education Awareness Month, was one of four protests (each near one of the nonprofit’s locations) to publicize five years’ worth of budget cuts for state childcare subsidies. According to the YWCA, with fewer low-income families qualifying for subsidies and facing higher co-payments, many low-income children are now deprived of early childhood education and some childcare centers have had to close.

    Becky Roloff, CEO of the YWCA of Minneapolis, attempted to kick off the downtown event with a brief statement. With several news cameras trained on her, she fought to be heard over the roar of restless children. “We are marching to tell everybody how important it is that all of you go to school and get an education like I got an education,” Roloff explained to her young audience. “We are doing this so that we can give you a good start, so that you can do well in school, and for the rest of your lives.”

    Without a microphone, however, Roloff’s message was no match for the din of a hundred youngsters ready to take it to the streets. The cameramen asked her to do another take—but not before Sarah Warren, an eager protest organizer with a drum, took a wrong cue. She began rallying the children to shout, “Early start, strong finish!”

    Though Roloff attempted to give the media what they wanted, revved-up children have a way of getting their way. There was nothing to do but lead the kiddie caravan out of the YWCA and into the cold.

    “What do we want? Peace! When do we want it? Now!” Thus began the mixed-message march as the group set off on its three-block trek down the Mall.

    Two blocks along, one mother, clearly accustomed to more aggressive demonstrations, spotted an approaching police car; she froze on the spot, as if bracing herself for the tear gas. The nearby nippers continued, oblivious to the threat.

    “We want peace. Hands are not for hitting,” they sang. Girls twirled. Boys jumped. Energy soared. And one lonely tear welled up in a reporter’s eye, while other passersby, in classic Minnesota fashion, pretended not to notice the spectacle at all.

  • Stupidity on Two Wheels

    So, it sucks to park the car on Hennepin Avenue in the winter – scaling the piles of snow hardened into ice, trying not to fall against (or under) the filthy auto, hoping that buses and SUVs will not take the car door off (or at least slow down if they do) when you get the frozen lock unlocked … it especially sucks getting into/out of a car parked on Hennepin when you’re toting a 10-month-old, however good-natured, and all of his attendant baggage. It sucks to do this at least twice daily, which you do when you don’t have any other place to put the car.

    But that’s not the source of my outrage for the purposes of this here post. The outrage was sparked the other day, once the baby and I were safely settled in the car (frozen, poorly designed car seats in frozen cars … there’s a topic for another post) and driving this car on Hennepin toward Calhoun Square. Shortly we came upon a woman on a bicycle.

    That’s not the source of my outrage, either. I’m totally pro-bicycle. I especially have to hand it to people who ride their bikes in winter – we should all be so virtuous. But I do have to mention that people who bike on icy thoroughfares like Hennepin – sans helmets – are, in a word, nuts. Or stupid. Hennepin is already narrow, and it’s made narrower still with those aforementioned frozen snow piles on either side. And if the conditions are icy for cars, might they be even more so for bicycles?

    But I’ll hold back on the outrage there, even. As a pro-bicyclist, I believe that bicyclists own the roads, too. They can ride wherever they want, and if they want to take their lives into their own hands by not wearing a helmet and by riding on busy, icy streets, that’s their business.

    So where is the outrage already? OK: The outrage comes in because there was also a child riding on the bike, behind the woman (who was, let’s presume, the mother of the child).

    A bit more outrage comes because this mother had apparently decided to make things “safer” for the child by putting a helmet on her – but she wasn’t wearing one herself. So your kid can become a motherless quadriplegic, but hey, at least she might possibly retain some or even all of her mental faculties should a collision occur on icy, busy, narrowed-by-frozen-piles-of-snow Hennepin.

    This idea of helmets-for-kids-but-not-their-parents is akin to another source of outrage: how politicians fall all over themselves to be pro-health-insurance for children — but not their parents. So little Susie can have her annual checkup and/or cancer treatment, but Susie’s insurance-less mom? She might die because she ignores some health problem, or she might go into financial ruin dealing with said health problem, because she doesn’t have health insurance — leaving Susie physically healthy (by some standards) but motherless and/or living in abject poverty.

    But I digress, having delivered most but not all of the outrage. There are a couple more bits.

    Bit #1: Watching as this helmet-less mom-with-child runs a red light on her bike on the aforementioned icy, busy, narrowed-by-frozen-piles-of-snow Hennepin.

    Bit #2: Spotting the same mother and child, 45 minutes later, riding the other way on Hennepin — and this time, they’ve taken on another pint-sized passenger. Somebody get me a Rolaids.

  • Don't Call Me Sweetheart

    I just picked up my two children from their school-supported daycare, where a young woman put her finger in the air and motioned for me to have a moment with her. I stepped aside and proceeded to listen to her complain that my son had been calling her, and other students and teachers, “sweetheart.” She told me that she and some of the other students did not appreciate it, and that this behavior was unacceptable.

    I know my face must have registered a “you must be fucking kidding” look. Sweetheart! He’s five. If it is not endearing and humorous, it certainly cannot be very disruptive.

    I wanted to tell her that “bitch” and “my ass” are bad words, but “sweetheart” has no malice. One can deduce bad intent if it comes from a greasy man at a bar, but my son is a cute 43-pound Guatemalan boy. (By the way, when I asked him to get on the scale, he said, “OK, honey.”)

    I am so taken aback that this woman has nothing better to do than to rat on a little boy who is trying to be funny. I informed her that there are a lot more destructive behaviors to focus on than a child saying “sweetheart.” This daycare worker just would not let it go. She argued that if she or any children did not want to be called sweetheart by my son, he should follow their wishes, and that this needed to be addressed.

    I don’t know what makes me more angry — the fact that this woman is just being STUPID (a word my son is not using) or the long-term effect of not letting a five-year-old be OK with who he is. What’s next? Not using the word “love” or “friend”? No, I know: the word “honey.”

  • The Greatest Threat to the Imagination

    Jeannine Ouellette’s puzzling article seems to cite the regimentation of children’s lives and the role of technology as threats to the development of imagination. As a girl in the ’50s and ’60s, I faced far more restrictions to my imagination and free play than any kid today.

    But the greatest threat to imagination goes unmentioned: the intrusion of religion into the schools. It may not seem so bad here in Minneapolis, but there are parts of the country where the schools are NOT focused on STEM (science, technology, engineering, and math). They are afraid to teach anything that might threaten third-century notions of cosmology or biology. There is a brain drain due to restrictions on research (stem cells, etc.), and government science is censored on the subjects of reproductive health and climate change.

    Minneapolis doesn’t have to do all this to limit the development of its children, however. Its school board has merely decreed that education be withheld from anyone not rich, not white, or not a resident of the southwest quadrant of the city.

    Deb Cochran, Minneapolis
    Letter

  • The Death and Life of American Imagination

    In February 1953, a violent North Sea storm crashed through the Dutch levee system, killing 1,835 people and leaving a hundred thousand others homeless. In the aftermath, the country responded by building the Delta Works, the world’s most sophisticated system of flood defenses. According to John McQuaid, a reporter for Mother Jones on assignment in the Netherlands, the system is “engineered to a safety standard 100 times more stringent than the current goal (not yet achieved) for New Orleans’ most heavily populated areas. Even Dutch pasturelands have more protection than the Big Easy.” As one government engineer told McQuaid, conceiving and building the Delta Works “was like putting a man on the moon.”

    That was half a century ago. Why the disparity between what the Dutch could accomplish then, and what the U.S. (the country that did put a man on the moon) has conceived to protect New Orleans, one of its most historic and treasured cities, and the surrounding region? You can call it foresight, or innovation, but beyond that, what the Dutch response required—and where we appear to be failing in our response to the aftermath of Katrina—was tremendous imagination.

    Imagination is an intangible, unlimited, and free resource. It is not, at least for the purposes of this discussion, the same as fantasy, where universal laws cease to apply, where elephants might speak Latin or humans travel back in time. Nor is imagination reserved for artistic pursuits, though imagination is the core of creativity. Applying imagination to problem-solving requires the ability to come up with an idea, and to break that idea down into the steps that will bring it to fruition. It also requires an alchemical mix of will, vision, discipline, and action, not to mention stubborn perseverance in the face of frustration or opposition.

    A prime example of this use of imagination would be George Hotz, the seventeen-year-old who spent all summer cracking Apple’s iPhone; he broke the lock that tied the phone to AT&T’s wireless network and freed it for use on other carriers’ networks, even overseas ones. Hotz spent five hundred hours with four online collaborators, and was motivated by the challenge and by “fun.”

    Presently, imagination of this sort is very much in demand. One wake-up call to the erosion of imagination in American culture came in 2004, when “failure of imagination” was cited in the 9/11 commission report as the primary reason U.S. officials misjudged the threat of the 2001 terrorist attacks. Maybe government officials couldn’t imagine terrorists flying planes into the World Trade Center, but plenty of others could and did—and not just those who actually carried out the long-planned and highly complex attack. The ability to prevent terrorist attacks depends on leaders who are as imaginative as those who would carry them out.

    While imagination is one key to national security, it’s also crucial to economic security. In 2004, executives at leading technology companies like Dell, Cypress Semiconductor, and IBM spoke to Lee Todd, president of the University of Kentucky, about creating sustainable jobs for the U.S. in the years to come. All said the same thing, according to Todd: Imagination and creativity represent the future of the U.S. economy. On a broader level, the World Economic Forum chose “The Creative Imperative” as the theme for its 2006 conference in Davos, Switzerland. Writers like Daniel Pink, author of A Whole New Mind, point to the new “imagination economy” as a trend that’s just taking off. He sees it in quite simple terms: “People have to be able to do something that can’t be outsourced,” Pink told me. “Something that’s hard to automate and that delivers on the growing demand for nonmaterial things like stories and design. Typically these are things we associate with the right side of the brain, with artistic and empathetic and playful sorts of abilities.”

    Government leaders in education are joining the chorus, too. “American education’s single-minded focus on science, technology, engineering, and math (‘STEM’ subjects) is admirable but misguided,” wrote two former assistant U.S. secretaries of education in the August 12 issue of The Wall Street Journal Online. What makes America competitive in a shrinking global economy, they claimed, is “our people’s creativity, versatility, imagination, restlessness, energy, ambition, and problem-solving prowess.” As they summed it up, true success—economic, civic, cultural, domestic, military—depends on a broadly educated populace with “flowers and leaves as well as stems.”

  • Crazy

    This is a story with a hopeful ending. Lucky, even. But be forewarned, you have to get through a lot of hopeless, unlucky crap before you find it.

    Here’s how it all starts: My first-born son has autism.

    Now that isn’t hopeless or, in my opinion, unlucky. Autism isn’t sick or crazy. It’s rigid and routine, a little eccentric. Autism is multiplying columns of numbers easily while being unable to look anyone in the eyes; listening to only one band’s music, and always in the same order, for a period of six weeks; refusing to eat anything orange. It’s also being able to remember the exact date and time you ate a bison burger in Chamberlain, S.D., when you were six. But there’s a really charming side to all this, a wonderful tilted perspective on life that, if you’re the parent of a child with autism, you come quickly to enjoy.

    I was a parent like this.

    Until he was 17, my son was unique and funny and odd. He was difficult in some ways but incredibly easy in others. He washed the family’s dishes precisely, went to bed at exactly the same time each night, and sorted our mail into careful piles. He did fairly well in school—above average in math, a little below in social studies—and spent his weekends playing tournament-level chess. He was a loner, but sweet and articulate and very close to his only brother.

    Then junior year came. He met a girl, he went to a dance, he thought life was better. And for a night it was. Then the dance ended, the girl decided she was interested in someone else, and the boy became depressed.

    Was this cause for alarm? I thought not. Teenage boys routinely get depressed over girls and fickle friends and school dances. It was painful, but I assumed it would blow over. When it didn’t, after six months, I took him to a psychologist who recommended a psychiatrist who put him on a newfangled antidepressant she said would have the added benefit of controlling some of his obsessive tendencies, like stacking the dishes and sorting the mail.

    I didn’t want to control those things—to me, these weren’t symptoms, they were characteristics of my son. And I’d fought for 17 years to keep him drug-free. But the psychiatrist and the psychologist and several family members insisted: He’d become unhappy, his routines were getting in the way of his developing a social life. This pill, they said, would help him.

    Instead, he gained thirty pounds and began to lose his mind.

    It happened slowly, over a period of months. First his grades began to fall. There were some random episodes of violence—nothing major, just an out-of-control moment here or there. A tendency to stand up from the dinner table, after a full meal, and walk to Arby’s for a snack. Eerie giggles that seemed involuntary. A flat expression on his once-curious face.

    Senior year, he started an after-school job at an auto parts factory but lost it when he couldn’t keep up with even the elderly workers. He stopped speaking to his brother entirely and even hit him several times. He lost interest in music, computers, and chess.

    I talked all this over with his father, my ex-husband, who said, “Maybe he needs a man’s attention. Let me give it a try.”

    So our now eighteen-year-old, autistic, depressed, and quickly losing ground, moved across town, to live with his father in a small, quiet apartment. My ex worked odd shifts, so our son began wandering the city on foot, early in the morning and late into the night. He told his dad about how he had to fight the bad thoughts that were crowding in his head. And when he wasn’t out walking, he slept a lot—around two-thirds of his life, in fact—despite the fact that he drank twelve to fifteen cups of coffee a day.

    Together, my ex-husband and I took our son to a highly respected neuropsychology clinic housed in a suburban office building. The doctors there even looked like bankers; they wore regular clothes and carried clipboards and fancy pens embossed with the names of drug companies, rather than stethoscopes.

    After meeting our son twice, they conferred with the original psychiatrist (who, we discovered later, was employed by the same large healthcare conglomerate) and came up with an altogether new diagnosis. This wasn’t autism at all, they told us, but “psychomotor slowing”—a form of schizophrenia. Our son was just unlucky, they said sadly, the victim of two devastating neuro-behavioral disorders. Completely unrelated.

    It was critical that we begin treating him immediately; they couldn’t stress this strongly enough. We were given a prescription for a brand-new antipsychotic medication with the inspiring name Abilify, which was advertised direct-to-consumer in Newsweek and Time. The ads featured a woman gazing into an azure sky and copy promising the drug would work on the brain “like a thermostat to restore balance.”

    We were skeptical. But the experts were firm: He would continue to deteriorate if we didn’t catch this now. Did we want our son to end up institutionalized? In jail? Sick to our stomachs and desperate, we gave him the drugs. Then he got much, much worse.

    He stayed with me on weekends, and twice during the workweek he would come to my house for dinner. We would sit at the table—my husband (his stepfather), his brother and sister and I—but my once-reserved older son would only stand over us acting crazy. Humming, shifting foot to foot, screaming if anyone touched him or tried to move him to the side. Often, he would talk back to the people who were speaking to him inside his head, telling him to do things. He would not, however, say a word to us.

    He wasn’t eating meals. But he was eating—constantly. After graduating from high school, during the period when he was still holding the voices at bay, he’d started a government job through a disability work program. I’d given him a car and helped him open a checking account during this period of lucidity. Now, he began stopping at fast food restaurants on his way home from work to consume nachos, burgers, brownies, and lattes. He ate with his hands and wiped them on his clothes, which he’d quit washing. He stopped bathing altogether.

    We discontinued the Abilify, tapering it off as directed. Two days after taking the final pill, he got out of bed at 2 p.m. and stood in one place for a solid hour. My husband had taken our daughter roller-skating; our younger son was at work. It was just me, alone with this six-foot-three-inch man I’d given birth to but no longer knew. I put my hand on his back and tried to push him forward, toward his shoes. And he turned to look at me—his eyes empty and cold—then grabbed me by both arms and beat me until the neighbors heard me screaming and called 911.

    You think you know what crazy is, but you don’t. Not unless you’ve been there.

    In the movies, it might be depicted as quaint or flat-out violent. But whichever way it goes—Hannibal Lecter or the wacky old ladies of “Arsenic and Old Lace”—crazy is portrayed as consistent, interesting, narratively coherent. Not so in life.

    In reality, crazy is like war. It’s tedious for long periods of time, until it turns around and is devastating. It’s random, senseless, all-consuming, financially draining, destructive, ugly, sickening, and gross.

    It’s standing in the front yard wearing nothing but torn underwear and trying to control the thoughts of people who drive by. It’s saying yes to every question, no matter what the real answer. It’s drinking compulsively, straight from the faucet, then spewing a stream of clear-water vomit like a geyser.

  • Specimen Days

    “Boys will be there but your parents will not,” promised the summer camp brochures that came in winter’s mail like seed catalogs. There were pamphlets for marine biology camp in Florida, space camp in Alabama, and some sort of geology road trip called the Central Rocky Mountain Institute. “I hear scientific greatness calling me,” I said to my parents, handing over the stack of glossy pictures and application forms. “It’s for my education,” I insisted.

    “Education” was the sort of trigger word that could induce a highly suggestible trance state in my parents. I could have used this knowledge for evil purposes by turning them into middle-class zombie assassins. But instead I got them to write a check for the road trip science camp, and the state of Wisconsin enjoyed another year’s reprieve from the destabilizing effects of political assassination.

    I would be caravanning with two dozen sixteen-year-olds and a handful of counselors, trekking from our home in Wisconsin to the wilds of Montana, exploring geological points of interest along the way. It was intriguing: How could a point be both “of interest” and “geological”?

    The trip would be my third and final science-camp experience. “Serial Science Camper” was not an instinctive niche for me, personally. If Amnesty International had run human-rights youth camps, it might have been a better fit for my particular enthusiasms. Or if my parents had splurged on installing cable television in August, sitting on the couch watching TV might also have been a good fit. As it turned out, my fellow science campers were just like me. Perhaps not quite as cool or self-assured as the teenagers attending camps for pom pon or basketball, they were definitely on more solid social footing than RenFair types who went to band camp. But the true future geniuses of the Midwest didn’t show up, sleeping bags at the ready, when it came time to answer the call to muster. My guess is that they imagined the relentlessness of the peer contact involved and decided to take a pass.

    The counselors at science camps tended to favor detached observation and note-taking over cheerful boosterism, and I found them to be refreshing counterpoints to the capture-the-flag-loving, sunny-day-hooray! counselors I had crossed swords with during my years at YMCA camp. Y-camp counselors were known to blow whistles while forcing a person to swim farther or run faster, but science-camp counselors refrained from unnecessary noise or motion so as not to scare off a possible specimen. At worst, a science-camp counselor could only bore you, and even then you could just relax and stand there while it happened.

    The first of the camps I attended was held at Pigeon Lake Field Station in Wisconsin’s Chequamegon National Forest. It was a lot like college: We could sleep in, skip breakfast, and then drag ourselves at the last minute to a class that had seemed fascinating when we first signed up for it but in which we quickly lost interest. We dwelt in a forest—a place of gentle shadows, low roofs, screen doors, and instant best friends. We tromped over pine needles and ducked under pine boughs when it was time to learn orienteering or venture out to the nearby bog. My favorite class was the mysteriously titled “Avian Aftermath.” After we took our seats, each student was issued a pair of tweezers and an oblong, fuzzy, gray lump called an “owl pellet” to dissect. As we pulled apart the hairy mass, slender white lengths of bone appeared. These we plucked out of the pellet and delicately placed to the side. When this dissection had reduced the pellet to a large pile of fluff, a stack of tiny ribs, spines, and skulls remained: the inedible parts of the owl’s supper. Our instructor then taught us how to identify the mice and moles by their ingested skeletons.

    One night we hiked into a dark part of the woods, two by two, and placed Wintergreen Lifesavers in our mouths before turning toward a partner and extinguishing our flashlights. On the count of three, we all crunched down on the mints, and were impressed to see blue-green sparks flickering in one another’s mouths. The triboluminescence heightened with increased friction. Wanting to put on a good show, we pulled back our lips and gnashed our teeth violently.

    The next fall, I took part in Trees for Tomorrow, a name that will be familiar to the tens of thousands of students who have passed through the Eagle River, Wisconsin, campus. The program was held over a long October weekend, in a harsh, wet woodland where gray skies misted the unfortunate with a ceaseless drizzle. We trudged to local lumber mills and learned the finer points of forestry “resource management.” Recalls alumna Kristy Robb, perfectly capturing the thrill-inducing gestalt of the place, “We didn’t have enough warm clothing to be standing under a tree for what felt like hours, hearing someone drone on about the damned tree.”

    By the following summer, a combination of hormones and pop culture had dulled my ability to focus during lectures. Luckily, the Central Rocky Mountain Institute was heavy on hiking and exploration. After parents had dropped off their campers in a central Wisconsin parking lot, the counselors confiscated all the Walkmen and informed us that both napping and listening to the radio were verboten during the all-day van rides. “I gave up smoking for this?” I thought. But with nothing else to do, we entertained one another with jokes and stories until we were as close as cousins—kissing cousins, in some cases. We slept in cramped and malodorous tents, cooked our food in a makeshift mess hall, and endured limited access to running water. We always carried canteens, knives, rope, and bandannas. In Montana, we marched up the mountain every morning and tripped down the mountain every night, ankle deep in wildflowers and singing like von Trapps.

    And the geological points of interest were a wondrous backdrop for our bonhomie. No glacier-wrought handiwork went unappreciated. Every volcanic nip and tuck was celebrated. We explored kettle moraines, camped in the Badlands, and wandered through Yellowstone’s geothermal funhouse. A few times we went digging for fossils, and we were allowed to keep our findings, which was a rare treat. Our instructors’ reverence for conservation occasionally conflicted with our natural teenage urge to vandalize. When some of us girls picked flowers to adorn our greasy, unwashed hair, we were tensely reprimanded, “Collection of botanical specimens is not allowed in national parks.” When we were returned to our parents, sunburnt and ravenous, we had a new appreciation for … well, for being away from our parents, I suppose.

    But like the mammoths whose bones we marveled at, the age of the co-ed science camp was coming to a close. I attribute it in large part to Time’s 1982 Machine of the Year: the computer. As I grew out of summer-camp age and into adulthood, the concept of “science” became tethered to computers, and computers were tethered to the electric grid. Computer-camp programs sprang up in the vacant lab spaces that universities could spare during the summer, and their success confirmed that students would accept a science camp divorced from actual camping. As the trend grew, math, engineering, and robotics camps blossomed, but ecology field camps withered. At the same time, the academic community recognized that boys were pulling ahead of girls in science achievement, so single-sex day camps were created to free girls from the distraction and self-consciousness of the co-ed classroom. Had I been born twenty years later, I would still have gone to science camp—but I would have left both my pocket knife and my lip gloss at home. It wouldn’t have been an adventure, and it wouldn’t have been nearly as much fun.

    Of the camps I attended, only Trees for Tomorrow, which gets support from “the forest products and electric utility industries,” is still active (and, judging from the raingear-clad participants on its website, the fieldwork environment hasn’t changed much in the last fifteen years). The sylvan paradise of the Pigeon Lake Field Station is now available, on a rental basis, for conferences. The Central Rocky Mountain Institute, never much more than some dedicated teachers, a map, and a couple of vans, is just a memory. No scientist myself, all I have retained from those summers is a partial mole skeleton and a knack for juvenile puns using the term “kettle hole.”

  • “The Sanctity of Marriage”

    “If I see that tie one more time, I’ll shoot myself.” My husband Jon was browsing through photos from our recent wedding, lamenting his last-minute decision to rent a tux rather than buy a suit. “Look at that,” he groaned. “Did I somehow not notice it was made of pressed plastic?” I laughed, but lightly, or he’d think I was laughing at him instead of with him. It’s a fine line, but finer still was the one we crossed by deciding to marry at all. When we stood near the stony shore of Lake Superior, the bees of late summer humming in the organza billows of my dress, promising to love, protect, and forgive each other forever, we’d already been living together for three years.

    In that time, we’d gradually transitioned from sharing a bed and a bathroom to merging our identities in other areas—bank accounts, credit cards, phone service, and, in what was a major late-fee liability for both of us, video rental. Which kid was whose (three from his previous marriage and three from mine) also demanded frequent clarification in the early days, when bloodlines ran deep and fast, and threatened to drag us all down to the slimy bottom and bury us there like rocks. While the kids by turns rearranged and refused to rearrange their bedrooms, their schedules, and their loyalties, Jon and I twisted ourselves like pretzels in our fervor to prove to our children that living together might not be such an unthinkable fate. As it happened, our contortions failed to convince our wary children. Only with time did our newly patched-together family begin to take, eventually leaving us free to contemplate the possibility of marriage.

    When finally we produced an actual wedding invitation, many people were confused. Jon’s eighty-six-year-old mom, who had stopped asking about our plans after the so-called “engagement” dragged into its third year, responded with happy shock: “For heaven’s sake,” she said. “I kind of thought you’d gone off and done it already.” Most others had figured the same thing—that we’d snuck down to city hall and signed some papers without fanfare, or that we (specifically, I) had decided to conscientiously object to the patriarchy, eschew marriage on principle, and cohabit forever.

    For years, then, before we snaked through the queues in the Hennepin County Government Center (to encounter what was, given our ages, a surprisingly snappish premarital counseling lecture from the blue-haired lady handling marriage licenses that day), we enjoyed and bemoaned every normal facet of married life, plus a few abnormal ones. Our eventual wedding day was less a beginning of something new than a ritual that affirmed the stable relationship we’d been establishing for years. The threshold over which we stepped was strictly metaphorical. Except, of course, that we were legalizing our union.

    Now that my dress is back from the cleaners, and the sealed marriage certificate has arrived back from the county, I wonder. Does this piece of what appears to be recycled printer paper, solemnly signed by us and three friends (including one who performed the ceremony, because we belong to no church), change anything beyond our ability to add each other to insurance policies or unplug life support someday? Is marriage as sacred as it’s cracked up to be? In fact, is it sacred at all, if you said “forever” once but took it back and divorced after ten or fifteen years?

    Not if you ask those who blame no-fault divorce for the demise of the family. They say that when one spouse holds the power to walk away at will, marriage is downgraded from a lifelong commitment to one that lasts as long as either spouse “feels like it.” And it’s true that while reading wedding books for guidance in developing our own ceremony, Jon and I couldn’t help but notice how some of the newfangled vows—“as long as our love shall last,” or “while our marriage serves the greatest good”—seemed a little less ambitious than the old saw, “till death do us part.” Ultimately we couldn’t stand the notion of watering down a promise defined by its lifelong nature. We boldly vowed “forever” even though by doing so we underscored how far short of it we had both fallen once already.

    We worried about creating a ceremony that on the one hand wouldn’t insult our own (or our children’s) sense of historical truth and authenticity, and on the other wouldn’t dilute or qualify our vows to the point of irrelevance. We were participating firsthand in a massive cultural discourse on the meaning of modern marriage, and we were neither first nor alone in our concerns.

    Worrying about the meaning of marriage is a preoccupation dating back thousands of years. Mutability in the rules and mores of marriage is also age-old. As an institution, marriage has always existed in a state of flux. But the cultural colloquy—what it means, why people do it, and who should be allowed the privilege—has probably never reached quite the pitch it has now. Policy debates, from the controversy about gay marriage to “marriage promotion” programs aimed at low-income families, have pushed marriage onto a battleground. And as impassioned warriors clash over who should be allowed access to the “sacred institution” of marriage, others watch with detachment and ask quietly whether the whole concept of marriage has fallen into a state less dramatic than collapse, but ultimately more deadly—obsolescence. Today’s most brutal fights erupt in the matter of same-sex marriage. But battles about who should be allowed to marry have always been vicious. The last major public outcry on marriage-partner selection only just died down.

    Newlyweds Richard and Mildred Loving were sound asleep in the bedroom of their Caroline County, Virginia, home in 1958 when police officers armed with blinding flashlights woke them up and arrested them. The problem? Richard was white and Mildred was black. The Lovings were charged with violating the ban on marriage for interracial couples. Bans on interracial marriage were still common in 1958—just a single generation ago. The Lovings pleaded guilty to a felony and faced up to five years in prison. Instead they got a one-year jail sentence, suspended on the condition that they leave the state and not return together for twenty-five years. The Lovings took up residence in Washington, D.C., and appealed their case. Nearly a decade after their arrest, the United States Supreme Court ruled that “racial hygiene” laws in Virginia and fifteen other states unconstitutionally sought to interfere with a person’s right to marry the partner of her or his choice.

    Many states claimed that laws against interracial marriage protected “the natural order of things.” But the Supreme Court declared that the “freedom to marry” belongs to all Americans as one of our vital personal rights, essential to the orderly pursuit of happiness by a free people. “The Fourteenth Amendment,” wrote the court in the Loving decision, “requires that the freedom of choice to marry not be restricted by invidious racial discriminations. Under our Constitution, the freedom to marry, or not marry, a person of another race resides with the individual and cannot be infringed by the State.”

    In the first half of the twentieth century, forty U.S. states forbade the marriage of a white person to a person of color. Many states enacted bans after 1912, when Representative Seaborn Roddenbery of Georgia introduced a constitutional amendment to ban interracial marriages. In his appeal to Congress, Roddenbery stated, “Intermarriage between whites and blacks is repulsive and averse to every sentiment of pure American spirit. It is abhorrent and repugnant. It is subversive to social peace. It is destructive of moral supremacy, and ultimately this slavery to black beasts will bring this nation to a fatal conflict.”

    By the 1940s, only two of the forty states with anti-miscegenation laws had repealed them. According to the religious doctrine underlying these prohibitions, marriages between whites and people of color were immoral and against God’s natural order. The trial judge in the Loving case justified his ruling—and his state’s ban on interracial marriages—with the sort of God-speak often invoked today against same-sex marriages: “Almighty God created the races white, black, yellow, Malay, and red, and he placed them on separate continents. And but for the interference with his arrangement there would be no cause for such marriages. The fact that he separated the races shows that he did not intend for the races to mix.” Others claimed that allowing “interracial marriages” would corrupt the sanctity of marriage and dilute and weaken the institution overall. This all sounds eerily familiar.

    Meanwhile, the question of gay marriage has also existed since antiquity. In testimony during the Canadian court case that led to that country’s recognition of same-sex marriages in 2003, one historian pointed out that, although gay marriages did exist in ancient Rome, they were exceptional and not well regarded. What he didn’t mention was that when the Romans—who had no problem with homosexuality—argued against gay marriage, it was on the basis that no “real man” would ever willingly subordinate himself in the way required of a Roman wife.

    The same-sex marriage debate in the U.S. began edging its way into the political fray in 1991, when three gay couples from Hawaii sued that state for the right to legally marry. On May 5, 1993, the Hawaii Supreme Court issued a landmark ruling supporting the idea that it is discriminatory to deny gay men and lesbians the right to marry partners of their choice. Conservative response was swift in the form of the Federal Defense of Marriage Act, which passed overwhelmingly in both houses of Congress and was signed into law in 1996 by the lovable philanderer himself, President Bill Clinton. The act defines marriage as a legal union between one man and one woman, and says that states need not recognize same-sex marriages from other states.

    Defenders of traditional marriage say the Defense of Marriage Act is not enough. President Bush has backed efforts to amend the Constitution in defense against gay marriage, explaining, “There is no assurance that the Defense of Marriage Act will not, itself, be struck down by activist courts. In that event, every state would be forced to recognize any relationship that judges in Boston or officials in San Francisco choose to call a marriage.” The federal marriage amendment died in Congress last year, but last November, a newly named Federal Marriage Protection Amendment passed five to four in a subcommittee of the Judiciary Committee. If it survives the debating and voting of the full committee, it will proceed to the Senate for more of the same.

    “But, if and when a federal marriage amendment is ratified, marriage advocates may be surprised to discover that passing marriage protection laws may not be enough to save an institution in free-fall,” said Daniel Allott, a policy analyst for American Values, an organization dedicated to “uniting American people around the vision of our Founding Fathers.” Allott’s views appeared in a Houston Chronicle op-ed article on November 10, one day after the Federal Marriage Protection Amendment made it out of subcommittee. Two days earlier, Texas had become the nineteenth state to pass a constitutional amendment “preserving” marriage as between one man and one woman. The headline of Allott’s story asked, “Traditional Marriage Under Fire: Who’s Really to Blame?”

    That, according to Allott, would be me. He observed that despite a steady decline in marriage rates (nearly fifty percent over the past three decades, and twenty percent since 1995), “people have not given up living together.” Unmarried cohabitation has increased 1,200 percent since 1960, and “people are living as committed sexual partners in shared households without getting married.” These people, said Allott, are responsible for undermining support for traditional marriage. These people have damaged marriage enough, he said, to make room for debate about same-sex unions in the first place.

    “Clearly, the key players in the battle over marriage are not politicians, judges, or homosexual activists,” he wrote, “but rather the millions of heterosexual couples who have thumbed their noses at marriage and abandoned the institution. While same-sex nuptials would certainly trigger further marital demise, they are also a response to, and strong indication of, just how critically weakened the institution has become.”

    If, as some people say, the institution of marriage is practically dead, does it matter who is or is not allowed to partake in it? Massachusetts caused such a ruckus by granting legal recognition to same-sex marriages in 2004 that by the end of 2005, nineteen states had passed constitutional amendments against same-sex marriage. Vermont, on the other hand, has long sidestepped the issue by granting gay couples in civil unions all the legal rights of marriage except for the word “marriage.” Which brings me back to that sheet of recycled paper, and the question of whether or how much it changes anything. Is it the status of marriage—the legal and social benefits it confers—or the ritual of marriage that makes a difference, if in fact a difference exists? According to a website called ReligiousTolerance.org, there are more than one thousand rights, obligations, and privileges that the federal government automatically grants to all married couples. This surprises me. I haven’t, since my recent wedding, felt quite as showered by privileges as that statistic promises. I think Jon and I felt sanctified at our wedding—holy, special, privileged, protected—but we didn’t consciously consider whether marriage as a social institution was strong or weak. I doubt whether many betrothed couples, straight or gay, scrutinize their decision to marry in this light.

    None of these have been rhetorical or abstract questions for me. Jon and I were ambivalent about marriage, and comfortable with an alternative arrangement. We felt perfectly well accepted as a couple, married or not. Not surprising, according to British demographer Kathleen Kiernan. She theorizes that Europe and North America both are moving through a four-stage process that culminates with cohabitation being essentially equal in status to marriage. Sweden has reached stage four, with more babies born each year to cohabiting couples than to married ones, and with cohabiting parents no longer feeling compelled to marry even after the birth of a second child. The U.S. is thought to be in the beginning of stage three, where cohabitation is a socially acceptable alternative to marriage, but where most couples bearing children together eventually marry.

    So Allott and his entourage are right: People are shacking up like never before. In the U.S., they’re also living alone in greater numbers than ever, which is further testament to the changing patterns in how we live, and should probably warrant more concern than whether or not the people who are pairing up are gay, straight, married, or not. After all, consider Maslow’s hierarchy of needs. Love and companionship are third in line for urgency, just after the most basic elements needed for physiological survival and safety. Human beings may be driven, biologically, to procreate, but a drive isn’t the same as a need, and what’s needed for survival of the species doesn’t always mirror what’s needed for survival of the individual. In fact, procreation doesn’t even make it onto Maslow’s hierarchy of needs. Emotionally based relationships, on the other hand, are essential to human health and well-being. Love relationships take many forms, but marital love is arguably the most intimate and of the highest order.

    Jon and I could have taken legal steps to designate one another as next of kin, or we could have drawn up some other legal agreement to protect ourselves against the ravages of a future break-up. Yet those practical considerations weren’t really our main priorities when we talked about getting married. What we wanted was to participate in the tradition itself and to confirm our commitment in a universally recognized way. The wedding rite, in both civil and religious contexts, is, at its core, a celebration and a pact that hinges on a spoken promise in the presence of witnesses. For us, marriage represented a ritual and a state of being. We wanted to file taxes jointly, to be allowed to speak to each other’s account representatives on the telephone, and to be included as second drivers on rental car agreements without paying extra. We wanted to use the words “married” and “husband” and “wife” without the awkwardness and unease of feeling dishonest. Most of all, we wanted to make a promise to each other and, I suppose, to God, and to know that it was witnessed by others. And we wanted to be held fully accountable to this promise legally, socially, and spiritually.

    Sociologist Frank Furstenberg, speaking not of today’s extravagant wedding industry, but of the institution of marriage itself, has said, “It’s as if marriage has become a luxury item, available only to those with the means to bring it off. Living together or single-parenthood has become the budget way to start a family.” Plenty of people are going the “budget” route. A majority of couples now live together before marrying, and an increasing number of them have no plans to wed in the future. As for parenthood, more women than ever before consider single parenthood a viable route to motherhood in the absence of a suitable marriage partner, and one-third of all adoptions in the U.S. in 2001 were by single women. Statistics like these suggest that under certain circumstances, various alternatives to marriage carry less risk overall than does marriage itself.

    Meanwhile, the married household has lost serious ground as the normative model. In the 1950s, married couples made up eighty percent of all households, compared to fifty-one percent at the turn of the millennium. Many see marriage not as the rite of passage to adulthood that it once was, but as a stage of life that one should enter only after the hurdles to achieving stability—relationally and financially—have been overcome.

    I wonder how it affects people and their relationships to be denied the recognition of legal marriage. Yes, cohabitation has gained widespread social acceptance in the U.S. and elsewhere, but it does not fully parallel the benefits available through marriage. As historian Stephanie Coontz describes in her new book, Marriage, A History, “Arrangements other than marriage are still treated as makeshift or temporary, no matter how long they last. There is no consensus on what rules apply to these relationships. We don’t even know what to call them. The relationship between a cohabiting couple, whether heterosexual or same sex, is unacknowledged by law and may be ignored by friends and relatives of each partner. Marriage, in contrast, gives people a positive vocabulary and public image that set a high standard for the couple’s behavior and for the respect that outsiders ought to give their relationship.” True, many gay activists argue precisely the opposite point: They want no part of these retrograde social institutions, and view them as a form of selling out their movement.

    Catherine Newman, in her essay, “I Do. Not.,” from the anthology The Bitch in the House, cites the Defense of Marriage Act as one of the handful of reasons she herself has chosen to take a political stand against marriage. Instead, she chooses to cohabit with her longtime partner and father of her child: “Because I’d feel like a real A-hole if I put on a beaded cream bodice and vowed myself away in front of all our gay friends—smiling and polite in their dark silk shirts or gossiping wickedly about our choice of canapés—who cannot themselves marry.”

    I understand Newman’s position and commend it. But when I was twenty, I could not have taken the same stand. Eschewing or undermining marriage—my own or the institution—was the last thing I wanted to do.

    I came of age with the sorts of hearts-and-flowers ideas that send people’s eyes rolling back in their heads. I believed in destiny and soul mates and commitment and suffering for the greater good—and to a large extent, I still do, just with a lot more caution and humility. I certainly valued marriage as a sacred institution, and when I got married, it was going to be happy, healthy, and forever.

    But how does a social institution really affect a person’s daily life? How does it influence the decisions and internal struggles, the emotional reality, of one young woman on the cusp of her life as a wife and a mother? My attitudes, like most people’s, were rooted in personal history, which in my case involved my mother’s two divorces. I was too young to remember my dad leaving, so over time I integrated my sister’s mythologized memory: our dad’s legs and his shoes standing beside the marred yellow banister of our open staircase, his stiff suitcase, a pat on the head. Then he was gone. My future children would never possess such a scarring snapshot. For them, everything would be perfect. My childhood didn’t make me bitter, it made me something riskier: idealistic.

    Idealism led me to the altar at age twenty-one. Then, as soon as I descended the church steps, it began picking and tugging at my marriage. I could vaguely see this happening all along, as my real life very gradually unraveled beside the standard of perfection I measured it against. Some marriages withstand the stresses to which ours succumbed—youth, children born fast and many, and financial instability. God knows I wished to join their ranks. It wasn’t for lack of effort or love that my marriage failed—it was for lack of other necessary things, like knowing who I actually was. Barring that, a little forgiveness might have helped. I couldn’t forgive his mistakes, not because they hurt me (though they did) but because they so threatened my image of ideal marriage. Even less could I forgive my own, because back then such a compromise seemed akin to the death of idealism itself. Meanwhile, our mutual unmet needs stockpiled. On the eve of our twelfth anniversary, I lit the match and my ex-husband poured the gasoline. Then we both stood back to gape as the resulting inferno scorched and melted the contents of our shared life until the whole fiery thing collapsed on us and our children. Who can describe that kind of pain? Not me. I was frankly surprised to survive it.

    But I did. Now, I’m a “key player” in the battle over marriage. Along with everyone else I know, married or not, divorced or not. We are all participating in an unprecedented, massive cultural redefinition of marriage, simply by living in this time and place. Ironically, the expectations people have about marriage have never been higher. Thus the institution is both more fragile and more fulfilling than ever before.

    When I first got married in 1989, I did so smack in the middle of a thirty-year period in which marriage was undergoing more change than it had in the previous three thousand years. In Marriage, A History, Stephanie Coontz retraces the evolution of marriage from the beginnings of recorded history through today. According to Coontz, the divorce revolution of the sixties and seventies combined with a host of other factors (the decline of the traditional male-breadwinner marriage; new sexual mores; increased tolerance for out-of-wedlock births; and rising aspirations for self-fulfillment, to name a few) in the eighties and nineties “to create ‘the perfect storm’ in family life and marriage formation. And nothing in its path escaped unscathed.”

    These are not the conclusions Coontz—a respected and widely published family researcher—expected to draw when she began her scholarly research. As hinted at by the title of her first book, The Way We Never Were: American Families and the Nostalgia Trap, Coontz actually set to work on Marriage with the intention of debunking the idea that the institution was undergoing some sort of unprecedented crisis. “After all, for thousands of years people have been proclaiming a crisis in marriage and pointing backward to better days. The ancient Greeks complained bitterly about the declining morals of wives. The Romans bemoaned their high divorce rates, which they contrasted with an earlier era of family stability. The European settlers in America began lamenting the decline of the family and the disobedience of women and children almost as soon as they stepped off the boats . . . . Furthermore, many of the things that people think are unprecedented in family life today are not actually new. Almost every marital and sexual arrangement we have seen in recent years, however startling it may appear, has been tried somewhere before. There have been societies and times when nonmarital sex and out-of-wedlock births were more common and widely accepted than they are today. Stepfamilies were much more common in the past, the result of high death rates and frequent remarriages. Even divorce rates have been higher in some regions and periods than they are in Europe and North America today. And same-sex marriage, though rare, has been sanctioned in some cultures under certain conditions.”

    Despite all this, Coontz’s research still took her by surprise. As she consulted with colleagues around the world, she gradually determined that the current rearrangements in both married and single life are in fact without historical precedent. But the seed for all this tumult wasn’t the oft-blamed sexual revolution, says Coontz. The trouble got started much, much earlier, in the late eighteenth century, in the form of an idea so radical it immediately began destabilizing marriage on a cultural and individual level: That people should be free to choose a marriage partner based, first and foremost, on love.

    Before love entered into it, marriage had been seen by societies around the globe as primarily a vital economic and political institution. Some cultures considered love a potential side effect to marriage, and others frowned on its presence in marriage altogether. But either way, it was deemed highly unacceptable for marriage “to be left entirely to the free choice of the two individuals involved, especially if they were going to base their decision on something as unreasoning and transitory as love.” If people went around marrying for love, they were going to demand to leave their marriages when love failed. The same notion that could make marriage such an extraordinary relationship could also render it optional and fragile.

    For thousands of years, the aim of marriage had been to establish beneficial kinship bonds and to pool or transfer resources for maximum economic and political advantage. Then suddenly, Europeans and Americans started expecting and even demanding emotional and sexual fulfillment from their marriages. Crises were bound to erupt.

    But this attitudinal shift alone, however cataclysmic, could not have brought us to where we are today. Coontz points to four key factors that made the difference: First, changes in the 1920s blurred boundaries between male and female spheres, and introduced the notion that sexual satisfaction was important for women as well as men. Second, urbanization increased anonymity and made it tougher to control individual behavior and punish nonconformity. Third, advances in birth control and the abolition of “illegitimacy” as a legal designation weakened the sway that pregnancy and childbearing held over marital choices. Finally, the legal autonomy and economic self-sufficiency achieved by women in the seventies and eighties opened up many alternatives to traditional marriage for both sexes.

    In a breathtakingly short time, society’s ability to push people into marriage or keep them there disintegrated. Writes Coontz: “People no longer needed to marry in order to construct successful lives or long-lasting sexual relationships. With that, thousands of years of tradition came to an end.”

    I’m really happy to be married to Jon. Over the years, we’ve built a relationship that strengthens us both, and our new marriage does feel like a sort of shroud of protection. I’m not sure if we could have sustained a marriage had we not spent so much time preparing for it. With our nuptials only a few months old, it’s a little soon to be making proclamations about our marriage’s longevity. I definitely don’t think we would have sustained a marriage to each other in our twenties, just as we weren’t able to sustain our marriages to our first partners. For lots of complicated reasons, we are both people who needed the buffering of time and experience to gain the self-knowledge and skills that marriage requires.

    I think, knowing all that I now do, that I would have felt heartbroken to be denied all this by those who might decree, as Daniel Allott does, that marriage is undermined by people who divorce and cohabit. I’m not saying that Allott is entirely wrong. In fact, he’s not. Marriage as a required construct of modern social life is undermined by those who divorce and cohabit. But marriage as a free and conscious choice is not. Unlike Allott, I no longer assume that marriage is required in modern social life. Love’s inclusion in the equation has complicated matters and weakened marriage as an institution, but it has also elevated the potential of marriage to be something it never was before—a path to fulfillment and spiritual growth.

    At our wedding, Jon’s nineteen-year-old daughter sang with two of our friends. It was a gorgeous Iron and Wine tune, sweet and melancholy, with lyrics full of love and awe: One of us will die inside these arms/Eyes wide open/Naked as we came. I cried as Britta sang, because it hasn’t always been easy and yet there she was, there we were, Jon and me.

  • Boy Trouble

    My five-year-old son, Peter, is standing in the middle of the practice rink at Parade Ice Garden in Minneapolis. The other children, coached for this moment by their parents, can push off their skate edges in a wobbly glide. Peter hasn’t made the connection that skating is an entirely different motion from walking. He marches across the ice, arms akimbo, his blades tick, tick, ticking where they should carve and slide.

    My stomach turns queasy. I feel as though my husband, Walter, and I have sent Peter off to kindergarten without teaching him how to spell his name. A tiny boy flails into Peter and knocks him down.

    “He checked him,” I yell, grabbing Walter’s arm.

    Walter winces, too, but because his role in our marriage is to play calm to my storm, he takes a more benign point of view. “I think the kid just didn’t know how to stop,” he says.

    “But Peter can’t skate,” I say. “This is torture.”

    “It’s torture for you,” Walter answers. “But he’s hanging in there. At least he’s not crying.”

    A hockey player from a family of hockey players, Walter is confident that, given time, Peter will catch on. I share neither his confidence nor his enthusiasm. As I watch Peter struggle to keep his balance, I think back on a dinner party we went to when I was pregnant. I was in a crisis because the baby growing inside me wasn’t the girl I’d always envisioned. I think that the shock—or was it denial?—that I was having a boy was what made me announce that no son of mine would ever play hockey.

    The hosts, parents of two girls and two boys, rolled their eyes. “Why not?” the wife asked. “It’s a meathead sport,” I answered. “And I don’t want to raise that kind of boy.” That my husband is anything but a meathead was beside the point, I explained. Then I launched into a diatribe about how the macho locker-room culture was what made Walter decide to quit after tenth grade. Hockey, as I saw it, was aggressive and overly competitive. It developed the kind of brutish instincts that I didn’t think should be encouraged in boys.

    The wife had known me since junior high and reminded me that we used to play soft-puck hockey in our figure skates. “Don’t you remember how much fun that was?” she asked. “Well,” I answered, “if I have a daughter and she wants to play, I think I’d be okay with that.” As I saw it, harnessing aggression could be empowering for girls. But for boys it was the beginning of a trajectory that inevitably ended in violence. My friend retreated to the kitchen to get the dessert and her husband reached for the wine.

    Pregnant with her first baby in 1955, Adrienne Rich, as she would later write, “set my heart on a son.” Her reasons ironically stemmed from a desire for self-identification. “I wanted to give birth, at twenty-five, to my unborn self,” she explained in Of Woman Born, “the self that our father-centered family had suppressed in me, someone independent, actively willing, original—those possibilities I had felt in myself in flashes as a young student and writer, and from which, during pregnancy, I was to close myself off. If I wanted to give birth to myself as a male, it was because males seemed to inherit those qualities by right of gender.”

    Thirty years after those words were published, few Americans need to have boys to harvest the crops or take over the family law firm. Those women who have benefited most from feminism’s advances into the mainstream—namely the educated, career-oriented American women who populate my slice of the world—don’t need sons to live out our unrealized dreams. Why would we? Daughters, who are a closer approximation to us, can do it instead.

    I understood when I got pregnant that it was possible Walter and I could conceive a boy. But I didn’t believe it. Part of my blind spot, I’m sure, stemmed from experience. I was the oldest of four sisters and went to an all-girls school until I was in fifth grade. None of my childhood friends were boys. Girlhood was the only world I knew.

    My education encouraged this viewpoint. In 1986, while attending Barnard College, I was handpicked by my favorite English professor to join a campus feminist literary magazine called Eve’s Rib. When a Columbia student, whom I’ll call Josh, asked to join our cozy, all-woman collective, we debated the consequences of opening up the membership. Josh told us that he had been deeply influenced by the feminist theory he had learned in his courses and that he wanted men to also benefit from the movement’s insistence that both sexes are wronged by the patriarchy. We wondered aloud if his earnestness was really a ploy to sleep with one of us, but agreed that we shouldn’t discriminate against him because he happened to be a guy.

    Josh’s first job was to design a cover using an inkblot illustration that we felt was both abstract and sophisticated enough for our endeavor. His finished product, however, was not what we had hoped for. Josh had, we decided, phallocentrically set all the type thrusting into the inkblot’s blank spaces. “This is what happens when you let the patriarchy in,” one of my colleagues announced. Josh had to go.

    If I tell this story with the same bemusement I use to recount my children’s learning experiences, I want to make it clear that I’m not recanting my feminist education. Feminism, as it was taught in the 1980s, wasn’t a share-your-feelings detour away from rigorous thinking. Rather, it was a disciplined, intellectually engaging philosophy that showed young women like me how to critique the world around us. It taught me not only how to challenge male privilege—no inconsequential realization when you consider that twenty years later men still earn more than women and hold the overwhelming majority of government positions—but to question my own lofty position as a rich white kid.

    What feminism didn’t teach me was that mother love messes with ideology. Today when I think about Josh, I wonder not only about what all of us in the collective missed by not trying to work out a solution with him, but also whether Josh talked to his mother about his expulsion from our garden. Did she see it as a kind of historical corrective and valuable learning experience? Or did she imagine her almost-grown son as a vulnerable boy standing in the middle of a skating rink?

    When I got pregnant, most of my friends had daughters. I had watched them raise their girls, my mental notebook filling with ideas for when my time came. I was enchanted by the attention they paid to encouraging their girls’ uniqueness. One wrote books about body image and classroom self-esteem for junior high school girls, and she and her husband put her research into practice at home with their two daughters. Their living room was stacked with library books portraying spunky female characters like Pippi Longstocking and Reckless Ruby, an aspiring firefighter who cringes when her mother calls her “precious.” My friend bought (or sewed) only clothes the girls could stain or rip when jumping off the highest rung of the monkey bars or calibrating the liquid-to-dirt ratio of a mud hole. When the older daughter got her period, the entire family celebrated with an all-red dinner: spaghetti with tomato sauce and heart-shaped meatballs, sliced bell peppers, and cran-raspberry juice.

    This was how I imagined I’d raise my daughters, too. I assumed mothering a girl would come naturally to me because she would possess more of the qualities that I valued: She would be less physical, more creative, more connected to her parents when she became an adult.

    At our nineteen-week ultrasound, the doctor pointed a pen at a skinny triangle sticking out from between the baby’s legs. “So you tell me,” he said. “What do you think we have here?”

    I stared at the screen. At first I couldn’t make out anything, but I forced my eyes to focus on what looked to me like a crescent moon with a blob in the middle. The doctor explained it was the baby’s butt and the bottom sides of the thighs. The moon wiggled. I squinted to get a closer focus. And then I saw it. That triangle, that blob. It was the baby’s labia.

    I couldn’t believe it was so obvious.

    “A girl?” I whispered.

    Walter raised his eyebrows at the doctor. Then he put his hand on my shoulder. “Honey,” he said. “That’s a penis.”

    That I was disappointed by the news was deeply troubling. Our baby was, after all, healthy. But what was equally disturbing—and perhaps more surprising—was the tepid response I got from a lot of other women. “Well, at least you’ll raise him to be a good man,” they said. “We need more women like you to have boys.”

    Before I got pregnant, I would have agreed with them. Before I got pregnant, I also said this to friends who were expecting boys. The promise of feminism as I learned it (but obviously wasn’t ready to put into play when it came to Josh) lay in the liberation of both sexes from rigid gender roles. Women gain access to the upper reaches of institutional power and change them to not be so hierarchical, and men are finally allowed to enjoy an emotional world our culture previously denied them. A feminist education tells boys they can be vulnerable, that they are worth more than the thickness of their wallets. A feminist education instructs boys and men that the search for an authentic self is a birthright. It challenges the callous assumption that men’s lives can be disposed of for national defense.

    None of these convictions mattered in the face of my ignorance and fear of boys. I wondered how it could be that I—a woman with massive ambivalence about her child’s gender—could be the right kind of woman to have a son. To me, boys were not individuals, but rather a gang of sports-crazy thugs. If they were not part of a pack, they were antisocial future engineers who shunned sunlight in favor of an afternoon fiddling with an Erector set or working a computer keyboard. I made exceptions, of course: my husband, my brother-in-law, all the men I was lucky enough to call friends. But these were men. With the exception of my adored nephew, who was less than one year old, I didn’t know any boys well.

    If a man with a similarly negative view of girlhood were to have a daughter, I know what my friends and I would have called it: a tragedy. Why, then, was it good that I might (might) influence my son to be less “like a man”? Was it because I had noble opinions about what could, in an ideal world, make a man? Or because I wanted to raise a man who would have enlightened ideas about women? If the job of a parent is to encourage a child to grow into their own unique self, it was clear that I was in trouble.

    Worried that I would be a complete failure as a mother, I called my only close friend who had a son and tearfully admitted that I was terrified of having a boy. “Loving a child has nothing to do with gender,” she assured me. “This little boy will steal your heart.”

    My friend was right, of course. Now that I am the mother of two boys—our younger, Henrik, is two years old—I’m so steeped in the marvelous complexities of boyhood that it’s almost impossible for me to relate to that pregnant woman sobbing into the phone. Like all parents, I believe that my boys—as well as the daughter my husband and I are adopting—are the children I was meant to have. But more than that, I believe that my sons were the children I needed to have.

    In Sexual Personae, Camille Paglia writes, “When I cross the George Washington Bridge or any of America’s great bridges, I think—men have done this. Construction is a sublime male poetry.” Ten years ago, I would have tried to dismantle her praise by pointing out that at the time the George Washington Bridge was built, people didn’t believe women could physically handle the rigors of manual labor—not to mention engineering. Now, I am moved by the fact of the accomplishment and Paglia’s determination to claim it as such. I still hope that today’s bridge builders and designers are both men and women and that my sons can grow up to choose to be stay-at-home dads, if that’s what they want (or, more important in today’s realities, can afford).

    Because of my sons, I look at people as individuals rather than representations. Boys are not monsters. Nor are girls naturally kinder and gentler. That’s not to say that even this more nuanced idealism goes untested. In my short parenting career, I have steered Peter away from a Southern belle Halloween costume and talked him out of a pair of pink-feathered mules. Sure, I bought him a purse and the Barbie he begged for, but if I’m honest with myself I’ll admit that it was because I knew he wouldn’t take them out of the house. Or at least not beyond the backyard, which is where I last spotted the doll, naked and face down in a clump of damp leaves. The fact that I will gladly let my daughter parade around as Spider-Man or cart a convoy of trucks to the playground strikes me as one of the greater inconsistencies in my philosophy. Partly, this censoring comes from an understanding that the world we live in is still suspicious of and frightened by boys who walk outside prescribed guy boundaries, and I don’t want my sons to be emotionally taunted or physically harmed. But I worry that by going along with these rules, I’m reinforcing the notion that they can’t express everything they are in public. By indulging my protective instinct, I’m forgoing efforts to make America, or at least my corner of it, a better place for boys of any stripe.

    At the same time, I’ve told Peter that he isn’t allowed to wear camouflage and that Santa doesn’t make plastic rifles. When Peter got to check out his first books from the school library, his choices were The Navy: At War and The Air Force: At War. While I wished that he had chosen the sled dog picture book that the girl in our car pool clasped in her hands, I knew this was more about his newfound ability to figure out what he thinks without his parents’ stage management. So we went home and read both books cover to cover. Two weeks later he told me: “I don’t like killing and I am for peace. But I still think looking at guns and weapons is interesting.” The next book he checked out was about skeletons and skin.

    Today my home life is a blur of crayons, clay, and blocks, not to mention a fleet of construction vehicles and toilet paper rolls turned into weapons—often pointed in my direction. I’ve given up trying to stop the shooting, mostly because I’m too overwhelmed to spend my entire day playing peace cop—and I figure that if they are happily engaged I’ll get a chance to read the paper—but also because I trust that they won’t grow into men who believe that killing people is a sport. I hope to have accomplished at least this much as a mother. If I didn’t have sons, all the paper cannon-building would horrify me. But because I understand the tender hearts of both these boys in particular, I feel that it’s healthier for them to work out their aggression than to suppress it. (Who knows, maybe the mothers of playground bullies think the same way.)

    That doesn’t mean that I’ve reconciled myself to the militaristic expectations our culture has for its boys. In 1976, in the shadow of the Vietnam War, Adrienne Rich wrote that under patriarchy, all mothers of sons worked for the army. Today, even with American forces stretched to their limits in Iraq, there is no draft. But the deeper truth of Rich’s statement still stings. Whenever I go to Target, I’m struck by how somber and boring the boys’ cargo pants and striped T-shirts are compared to the glittery oranges and pinks and purples across the aisle. And don’t get me started on the rows and rows of fighter jets (which my sons adore) and Legos that turn into endless variations on the grim reaper. Even though I find these norms dispiriting, I dress my sons in the standard American boy uniform and cut their hair short. And when the top item on Peter’s Christmas list was an aircraft carrier, I decided not to be too hard-core about my conscientious objections.

    The other day I overheard Peter humming a tune from Mary Poppins. “We’re clearly soldiers in petticoats/And dauntless crusaders for women’s votes/Though we adore men individually/We agree that as a group they’re rather stupid.” I could tell by the way he was mumbling the words that it was the tune that mattered to him and that he hadn’t processed the song’s meaning. But listening to him did make me wonder how I’ll talk to him about the personal revolution his birth set in motion for me. While I hope that what my boys feel most powerfully is my unyielding love for them, I suspect that one day we will all have to reckon with the complicated emotions I brought into our family-making.

    When Walter and I decided to have a third child, we agreed to adopt. Part of the reason was because we wanted to choose our baby’s gender, to parent a girl. Though I deeply love my boys, I still wish to put all those girl power philosophies into play. The desire is like a phantom limb. I wonder, though, about the crown of expectations I’ve made for my daughter. I’ll have to remember that she’s not a mini me. I’ll have to be supportive and bite my tongue, even if she wants to wear pink dresses and bake pies for the rest of her life.

    Two weeks after that wretched first hockey practice, Peter learned that by pushing off the edge of his blade, he could travel farther than if he just walked across the ice. To glide and keep his balance was a thrill for him. Not to mention a revelation for me. A hockey player, I suddenly realized, must not only master the difficulties of skating, but combine that skill with quick reflexes and hand-eye coordination. In the glow of Peter’s newfound enthusiasm, nothing about hockey—except perhaps those infamous 5:30 a.m. ice times—repulsed me. When Peter scored a goal during a scrimmage, I was so excited that I called my mother from the rink.

    Last week during practice, I wandered over to the vending machines near the main rink where members of a junior high team were being put through their drills. It wasn’t until I got closer that I was able to make out the ponytails falling down their padded backs. Before Peter was born, this would have been the only game at Parade Ice Garden that I cared about. Today it’s the icing on the cake. For as much as I would love to stay and cheer these girls on, I leave. My son is learning how to stickhandle. And I want to be there to see it.