Author: Hans Eisenbeis

  • Scooper & Scooped: Local Edition

    We don’t normally pay that much attention to the local daily
    news. Not in a professional way—it’s too much work for too little
    reward, and we’re constantly annoyed at how the paper has become more
    about pictures and graphics than about actual news stories.

    But
    certain broad cultural trends had us interested in seeing the newsroom
    flick “All The President’s Men” the other day, and it was fun to see
    Dustin Hoffman and Robert Redford on the phone so much. Newsrooms, from
what we hear, are intensely competitive places. If you work at the
Washington Post, you first read your own paper to see who among your
colleagues has been favored by the makeup editors, and you keep a daily
    calculation of a wide variety of grudges and jealousies. Next, you read
    the New York Times, for a broader, more ecumenical kind of
    self-loathing and professional jealousy. And if you get scooped by the
    New York Times, you go to the bathroom and splash cold water on your
    face and you curse loudly, and you wonder if you’re in the wrong
    business.

    We couldn’t help noticing in all the national hype
    about the alleged Chai Vang murders—Fox News! LA Times!—that BOTH of
our hometown papers really got scooped in an embarrassing way. Yesterday,
a New York-based reporter at the Times published a story that had a number of
    local Hmong sources saying Chai Vang was, in fact, a shaman
    in his
    St. Paul community—a widely respected religious leader among his people
    who on more than one occasion has performed intense religious rites to
    exorcise evil demons from those who require such services.

    It
    became painfully clear that no one at either Twin Cities newspaper had
    actually picked up the phone and talked to anyone in Vang’s extensive
    circle of friends, relatives, and acquaintances. Judging from Stephen
Kinzer’s story in the Times, it was the worst-kept secret in the St.
    Paul Hmong community. So how did the Star Tribune and the Pioneer Press
    manage to not overhear this bizarre and interesting news?

    What’s
    even more interesting to a layperson like ourselves is that neither
    paper has, at this point, acknowledged that contribution to our
    understanding of who this controversial figure is. (Today’s Star
    Tribune has the groundbreaking scoop
    that Vang had a warrant out for his arrest on previous trespassing
    charges. Yawn. And Todd Nelson, at the Pioneer Press, does talk to
    friends and relatives, and writes a nice profile of Vang—but
this is basically what you’d call a rear-of-the-parade followup story
to the Times, one that does not acknowledge whose shit it was that the
Pi-Press was shoveling a day late.)

    Also, it is not uncommon
    for the Star Tribune or the Pioneer Press to reprint stories from the
    New York Times—but they’re not doing that with this story. Why?
    Probably because it would make both papers look pretty stupid to have a
    local story reported better from some desk in Manhattan.

    Like we
    say, we’re just casual observers. We’re not in the news business per
    se, so we don’t wish to cast aspersions. We will, though, toss the inky
    wretches a freebie here: If you read to the end of the Times piece, you
    might notice that a person named Noah Vang was credited with local
    reporting from St. Paul. Is this the same Noah Vang who was indicted on
    murder charges last year, in a Hmong after-bar knife incident?—The
    Editor in Cheese

  • Eternal Recurrence: Tom Wolfe edition

    In our ongoing coverage of Tom Wolfe’s new book, we mentioned
    yesterday that we enjoyed Jacob Weisberg’s review in last Sunday’s New
    York Times Book Review. What distinguishes good criticism from great
    criticism? We’re glad you asked. A couple of things, actually.

    First,
    we prefer critics to resist the urge to pronounce a simple verdict.
    There are great pressures in the “marketplace” of modern media to give
    everything a thumbs up or a thumbs down. That has more or less
    guaranteed that most critics are all thumbs. They approach every review
    with the idea that they have to make an argument either for or against
it; they begin to marshal their evidence and write their punchlines.
    The problem with this approach is that it doesn’t often give the reader
    or the subject a fair shake. There are not very many flawless
    masterpieces being produced these days—in fact, ever. (That’s kind of
    inherent in the definition of “masterpiece.”) There IS a lot of crap,
    but you can usually find something redeeming about most of it. The
    point is, there is a kind of intellectual dishonesty about reducing
    everything to an unqualified yes or no.

    Second, there are way
    too many critical reviews and they are all way too short. This is
    related to the first point—marketplace pressures to cover as many
    artifacts and events as possible, and to do it decisively, if not very
    thoughtfully. Thus our “blurb” culture. Can you find an example of a
    magazine or newspaper that DOESN’T have, as a part of its regular
    offerings, dozens of instantly forgettable reviews of CDs, books, and
    movies? (We can think of one. If you think of the same one, or another
that fits the bill, we’ll send you a Rake T-shirt. Send your answer here. First responder wins.) It is not necessary for a good critical review to be long, but it helps.

    Third.
    This is the most difficult quality to explain and to achieve, but it is
    what makes a really good piece of criticism something we tear out of a
    magazine and carry around in our breast pocket: the ring of truth. The
beauty of a really good review by someone like Anthony Lane—or Peter
Schjeldahl, or Jacob Weisberg, or Chuck Klosterman—is that you know,
    without reading the book, or seeing the film, or listening to the CD,
    that the critic hit the nail on the head.

    Now, we think
Weisberg hit several home runs in his piece. He comments that Wolfe’s
descriptions of the modern campus are “excruciatingly” detailed, but
    Wolfe—being a journalist rather than a true novelist—writes like a
reporter. There are no meaningful descriptions of people’s motives,
    only their actions and their appearances. (This is an editor’s constant
    struggle, by the way, and it is what distinguishes a newspaper reporter
    from a magazine writer. Reporters are very uncomfortable with subtle
    description and analysis. If they can’t find a source to say it, and
    another to confirm it, then they can’t write it. Writers have the
    opposite problem—finding an authority greater than themselves.)
    Weisberg also gets it just about exactly right when he says that
    Wolfe’s peculiar magic is his ability to create page-turners; it’s
    almost impossible to put Wolfe down, even when he’s at his worst.
Finally, the clincher: Who ever re-reads a Wolfe novel? No one. Running
    our own mental check, we find that the only Wolfe book we’ve ever
    reread was “The Electric Kool-Aid Acid Test,” and that, of course, is
    not a novel; it is a work of non-fiction.

    Compare “I Am
    Charlotte Simmons” to, say, Jonathan Franzen’s “The Corrections,” or
    Jonathan Lethem’s “Fortress of Solitude.” Both of those books have
short sections that describe life on the modern college campus—but
    they are both better books, because they trade in interior, essential
    truths rather than surface appearances and incidents. We’ve been
    planning to reread both of those wonderful books from the moment we
    finished them the first time.—The Editor in Cheese

  • Panderlust

In yesterday’s Sunday Times, Frank Rich makes a point we were trying to make ourselves a few weeks ago. In the aftermath of the election, USA Today had published a story that suggested Big, Bad, Liberal Media was scratching its collective head, wondering where it had gone so terribly wrong in understanding the country–and, more to the point, underestimating the electoral muscle of the anti-intellectual, conservative, white male, NASCAR masses. In fact, even Frank Rich’s boss, Bill Keller, the executive editor of The Times, was described in that article as being somewhat flummoxed–so flummoxed, in fact, that the best idea he could come up with was to reopen the Times’ shuttered Kansas City bureau.

Yesterday, Rich looked at the problem as it applies to network TV news, what with the recent retirements of Rather and Brokaw, and the ascendancy of Brian Williams. He suggests that network news is desperate to win the hearts of red America, so desperate that they are making a point of decamping to Toledo and Dubuque and Denver. NBC News is going to great lengths to establish the bona fides of Williams–hey, he’s a part-owner of a go-cart track! He drinks Budweiser! He showers AFTER work. (Well, no, maybe not that. But hey, he’s got a mitten loofa too, just like O’Reilly! Wait, that’s kinda faggy and liberal, innit?) Why would they do that? Is it because they seriously believe there is news happening out there that they are ignoring because of their bi-coastal myopia? What Rich said better than we could ever hope to say was this: They are chasing an audience, not a news story. And that is a real sign of declension, and a cause for worry.

Salient, fact-checking moment: Why chase after Fox News viewers who are rabidly partisan and reality-challenged, and in any case, are far outnumbered by network viewers? The problem is perceptions and myths. As Louis Menand makes very clear in his wonderful story in last week’s New Yorker, the already unassailable “take-away” from election 2004 was the “values fallout.” There was no values fallout. Menand points out that this was strictly a misreading of exit poll numbers, with no clear consensus on why people voted in any particular way. (This is probably, like everything else, the fault of Democrats. Republicans couldn’t care less why they won–the less said about that the better, as far as they’re concerned.)

    The key to this little conundrum is the very real frustration that great media organizations like the Times and the New Yorker and almost any other thoughtful organ of print journalism are feeling. You can print the facts, the truth, the most compelling sorts of historiography–but you can’t make that horse drink the water.

We had the same sinking feeling after reading Rich’s essay that we had reading all those terrific pre-election presidential endorsements–that there isn’t one person in the country who’d read it and have his mind changed. In these fractious times, even the Times is preaching to a choir. One can certainly forgive them for trying to either expand the choir a bit, or take their show on the road. (Incidentally, there’s an interesting article today covering the same territory with NPR, but with a racial facet; Tavis Smiley wonders how to get more blacks to listen to public radio. How is this different from trying to get more conservatives to read the New York Times? Discuss…)

To have a small but vocal crowd of know-nothings grow into a hateful GOP monopoly of government that has, in no small way, been underwritten by a deliberate campaign of falsifying reality and pre-emptive accusations of “liberal bias”–this has diminished the power of the entire industry of journalism. Facts are not partisan, but many people don’t seem to believe that anymore. We guess you just feel the pinch more at the top, where you’re accustomed to the respect afforded the “paper of record.” When it develops that the news is not the news, but an exercise in servicing an audience, you get–well, modern TV news.

  • Scooper & Scooped: Poached Edition

    We were surprised to open up Monday’s Minneapolis Star-Tribune to see Jon Tevlin’s article on religion in the workplace. Surprised, because it was very similar to a feature story that was on the cover of The New York Times Magazine about a month ago. We’d noticed Russell Shorto’s feature, not only because it was a compelling cover story, but because its main subject was a small bank in outstate Minnesota. Also because the photographs, taken by white-hot Minneapolis photographer Alec Soth, were wonderful.

We’ve already commented recently on the phenomenon of follow-on news stories: The New York Times or the Washington Post does the heavy lifting on a story, gets all the glory for the scoop, and when the parade has passed, all the local papers shuffle along shoveling up the remainders, maybe a little ashamed that someone in Manhattan managed to break a local story under the noses of a whole newsroom full of local reporters.

Tevlin does acknowledge the source of his interest in Riverview Bank, after a fashion. Near the end of his piece, he notes that Riverview Bank, on its website, claims to have converted Times "freelancer" Shorto during an "interview for a newspaper article." (Shorto denies this.) When we emailed Tevlin about his follow-on article, he told us there were lots of other interesting loose ends to tie up in the Riverview Bank story, and he was onto them the day after the Times article appeared. The St. Paul Pioneer Press, in the person of business reporter Dave Beal, was also on the story. They published their own follow-on story on November 11.

    There is nothing wrong with this practice per se. While we don’t want to inflame professional jealousies, it would be nice if writers acknowledged where they get their story ideas, particularly if it’s from other writers. It is merely vanity that prevents someone from writing "as first reported in the New York Times." But this sort of story poaching goes on all the time; local daily newspapers are especially bad about doing it to nationals, weeklies, and monthlies. They have done it to us here at The Rake. (We’ve already given up hope of ever working elsewhere in this town. Funny how if you write about media in New York, you’re guaranteed a job practically for the rest of your life. If you write about media in the Twin Cities, you’d better keep Monster.com bookmarked.) For our own part, we admit to being allergic to a story if it has appeared anywhere else our esteemed readers may have been exposed to it. This falls under the principle of giving your readers a little credit. And, as we love to point out, a newspaper article and a magazine story are two very different animals. Tevlin’s story was different from Shorto’s, though it was clearly provoked by it.

Still, we were surprised that the Star-Tribune photographs were so similar to Alec Soth’s. One Strib image depicted the exact same scene as the shot on the New York Times Magazine’s cover: an office wall with a handsome painting that shows one modern businessman introducing another businessman to the robed and haloed Jesus Christ, as if to say, "I’d like you to meet my boss, the Son of God."

    The striking similarity in the photographs seemed a breach. Were we being naive? We can see how you might make the argument that, just as Riverview Bank is sitting out there in the public domain for anyone to write about, their office interiors and personnel are not themselves copyrighted. And given that Tevlin’s lead specifically refers to this painting, it falls under the definition of pure documentary photography, right?

We don’t know. It doesn’t seem possible that Stormi Greener, an excellent photographer in her own right, was unaware of Soth’s photos when she shot hers for the Star-Tribune. To our eye, it seems obvious that someone asked her to take precisely the same pictures Soth had taken for the Times magazine—photos that are undoubtedly under license and embargo, and therefore not available to the Star Tribune or anyone else. You look and see what you think: Here is Soth’s photo for the Times, and here is Greener’s.

We got ahold of Alec Soth in Paris, and he was a little surprised. "Wow, that is quite similar," he said. But he was willing to believe that it was a coincidence—and that probably an editor at the Star-Tribune should fall on the sword for this. (We know from experience: It is ALWAYS an editor’s fault!) Jon Tevlin told us he thought you could send dozens of photographers to Riverview Bank and they’d have taken the exact same photo. The Jesus-in-the-executive-suite artwork is a "no-brainer," he said. Times magazine editor Gerald Marzorati politely declined to comment, and Greener has not answered a call and an email.

    This photographic facet of the follow-on story undoubtedly falls into a grey area, and maybe it illustrates the difference between fine art photography and photojournalism. Soth’s photo is striking in part because it is so artful, whereas Greener’s has a solid if unremarkable gravity as photojournalism—and it’s almost the same picture!

    But it’s the art within the art. When we first saw the cover of the Times Magazine, we were convinced that a Times art director had pulled off an amazing illustration. Indeed, the point of both the Soth and the Greener photos was actually to reproduce the astonishing piece of framed, evangelical art, in situ. Perhaps the real injured party here is Nathan Greene. He is the formerly anonymous born-again capitalist who was responsible for painting "The Senior Partner." He’ll undoubtedly get his reward—and maybe his copyright—in the next world.

  • Scooper & Scooped: Red America Edition

    Yesterday, the people who organize the Gay Pride parade in the Twin Cities filed a complaint against the Star Tribune. This one could sting: They are complaining to the Minnesota Commission on Civil Rights because the Strib apparently refused to publish an advertisement for the parade that showed two men kissing.

Many interested readers who pay attention to the subtleties have been piqued by the Strib in recent years—in fact, ever since Keith Moyer took over the paper. There have been some real brow-raising moments, particularly on the publishing side of the paper. Last summer, for example, there was a widespread rumor that high-ranking ad executives were avid followers of Luis Palau. Thus his ballyhooed “Twin Cities Festival” not only got sweetheart status in the sales department, but the edit department also bowed to the will of the Lord and published numerous odd features that could only be called fawning.

Then this fall, the paper refused to publish an advertisement that had no images at all—in fact, it was a piece of poster art depicting a bunch of numbers. It was a mathematical compendium of the lives and limbs lost so far in Iraq. (An advertisement we subsequently published in The Rake, incidentally. Of all the conspiracy theories, we like the one that suggests the Strib has something against simple math. It certainly seems to be catching in the newspaper industry.)

So what the hell is going on with the Newspaper of the Twin Cities? We doubt whether there’s truly an emergent Christian fundamentalist impulse taking over down on Portland Avenue. Like most of these things, the real story is found neither on the front page nor the ad pages nor even in the op-ed pages, but in the McClatchy spreadsheets.

    It is just barely possible that the Strib is hoping to outflank the Pioneer Press’s alleged play to the right (going for all those wacky Woodbury readers with backyard bomb-shelters, you know). What is more likely is that the business is simply responding to a certain neap tide of community sentiment. While the city’s liberal core has been just as loud and outraged as ever, the Christian right has—as they say—been emboldened by what we in the Big Bad Media have made of them in the past 90 days.

    We don’t hear about it so much here on the far-left side of downtown—but then we’ve always tried to respect community standards in a kind of surgical way. (A smarter culture war!) But over there on the right side of Minneapolis, we imagine the Strib has seen a real spike in envelopes bearing a return address from the Holy Name Society. The Strib is undoubtedly the bellwether for this type of critical mass. On the opposite end of the publishing spectrum, we understand there are similar pressures. We hear through the grapevine that the Minneapolis office of the Onion is no longer accepting display ads for sexual services—in other words, pictures of boys kissing boys—and it sure as hell ain’t because they suddenly got Jesus.

    One word, people: Circulation.

  • It’s My Country, I’ll Cry if I Want To

    For many years, country music was one of the shibboleths of the alternative nation: If you were born after 1950, lived in a city, and considered yourself smart and hip, you’d say you liked “all kinds of music”—pause—“except country.” There were almost as many “Country and Western” jokes as knock-knock jokes. And the folk revolution of the late sixties was remarkably irrelevant to mainstream country. (Vietnam, the great divider of that generation, pitted cowboys squarely against hippies.)

    Like the new Imax film Our Country, the whole genre has too often been a self-parody. That makes it hard to take seriously, and it’s a shame. If you look no further than the far right of the FM dial, there are lots of reasons to hate country. The great decline really started in the seventies and early eighties, culminating in gone-to-seed dudes like Mac Davis, Conway Twitty, George Jones, and Glen Campbell. At the time, it was the men who were the derelicts of country music, not the women. (God bless you, Dolly, Tammy, Loretta, and Emmylou.) The same reasons to loathe country music persist today in the saccharine pop of straw-stuffed FM stars like Shania Twain, Toby Keith, and Garth Brooks. (I’ve found a simple formula to distinguish the good from the bad: If it sounds like a commercial for Ford or Budweiser, it probably will be one before long. This is bad.)

    We might have dismissed mainstream country the same way we’ve dismissed classical music. In the nineties, though, something funny started to happen. Young urbanites, especially those who’d been steeped in punk rock, were forever on the lookout for novelty. Moving forward often requires looking backward; some musicians began to study older forms of folk music. Eventually, they got so far as to punkify blues (Jon Spencer, not to mention his less-deserving copycats, the White Stripes) and jazz (Medeski, Martin, & Wood and protégés like the Bad Plus and Happy Apple).

    And somewhere along the line, an earnest new generation of musicians got sucked into one of America’s great and durable traditions: electrified folk, otherwise known as true blue country music. A band like the Jayhawks helped launch alt-country, with garrisons in Wilco, Joe Henry, Steve Earle, Lucinda Williams, and many others. For the urbane and curious, these artists opened the doors to historical country music, and, frankly, made it cool again. I think it’s fair to say that the alternative-country gang, no matter what their pretensions, helped to reclaim American country music as it was played up until about 1970—that is, before a whole lot of coke and sequins got snorted off the coffee tables of huge record companies in Nashville. Needless to say, the Jayhawks’ Gary Louris probably has more in common with Hank Williams than Keith Urban does.

    Then again, so what? That rare insight may scratch a certain kind of elitist itch, but it doesn’t much explain country music today—not the brand most Americans would recognize, anyway. Which brings us to people like George Strait and Tim McGraw. Modern fans of Hank Williams and Jimmie Rodgers and the Carter Family likely avoid the lo-cal molasses to be heard on country radio today. Still, as off-putting as it might be to city slickers, the fact is that contemporary country radio covers a massive swath of the nation. I’ve come to think of it as a harmless little diversion—like petting zoos and the New York Yankees and Ted Turner and other easy-to-digest artifacts of life in the USA.

    Sure, a song like “Suds in the Bucket” or “Live Like You Were Dying” is simply high-shine pop music sung with a bizarre (and carefully calibrated) redneck twang. But for whatever it’s worth, modern country music is admirable, at least on a mechanical level. It is some of the best-written and -constructed music today, and it makes most rock and pop seem like it was written by a sixteen-year-old. (As indeed, it often is.) This does not necessarily make country good, nor rock ’n’ roll bad.

    Enthused amateurism is one of the great achievements of punk rock that was stolen from folk music, and country doesn’t want it back. In every other genre of pop music, especially rock, jazz, and hip-hop, the expectation is that a band writes and performs its own material. But even at this late date, country music continues to operate with a Tin Pan Alley/Brill Building model. A glance at the current top twenty country songs shows that more than half were written by someone other than the recording artist. Nashville is lousy with agents connecting songwriters with song performers.

The result is a specialization of labor—call it an assembly line—which makes for an end product that has its selling points. The songwriting itself is often polished and clever, and the instrumentation and production are the best that can be had from studio session professionals. What makes most of this material sound so much like pop music is that it is seamlessly orchestrated. In other words, it’s built to the same factory specs as bubblegum pop. It should sound the same. Most of it is created the same way as “product” from Linkin Park and Destiny’s Child.

Even with all the pop obfuscation, there are still certain conventions that signify a song as country: a fiddle, a banjo, a Dobro, or that twangy accent. With modern production and polish, though, it’s the last of these that is often the only distinguishing characteristic between country and, say, adult contemporary. I’ve long been obsessed with the “redneck” accent that rural populations affect from Mankato to Missoula, Atlanta to Calgary, Austin to Washington, D.C. Some think of it as Southern or even Western, but it is a state of mind, not a state of place—the linguistic equivalent of the pickup truck. I’ve heard it from the mouths of cabdrivers and steelworkers in midtown Manhattan. I believe I’ve even heard it from the mouth of our Connecticut-born, Yale- and Harvard-educated, superrich president.

    Beyond country’s musical conventions, there are the hackneyed characters, themes, and storylines that still flourish like crabgrass. Country music today reflects a certain set of values that we’ve come to associate with rural life and Red America. These read like a Republican stump speech: self-sufficiency, fidelity, hard work, a firm sense of right and wrong, family values, respect for God and country. In country music, the bad guys are always irredeemable rascals who can’t give up the bottle or the wandering eye or the rambling road. In country music, the heroes are the World’s Greatest Husband and the Most Loyal Wife in the Universe—and, in times of war, American soldiers and the Almighty, who must look an awful lot like Uncle Sam.

    It’s telling that country is such a huge radio phenomenon. Country radio reaches seventy million listeners nationwide, almost half the entire adult population. Together with right-wing talk, it rules that medium. It doesn’t have as much impact elsewhere, however. Rock and pop, for example, outsell country by a long shot in the CD store and the iTunes queue. On the concert circuit, heavy metal puts to shame the box-office loot from country. But nationally, no other genre even comes close on the radio. In the Twin Cities—remember, a non-rural metropolitan area of around two million pairs of ears—country station K102 is second only to classic rock KQRS.

    Country radio is especially interesting to consider from a demographic point of view. Advertisers have known forever that modern country has a huge appeal to women, particularly suburban soccer moms. I credit all those sentimental, tearjerking odes to simplicity, fealty, and family, as well as the bitter laments about cheating, lying, rambling men. Country deals in these stereotypes comfortably and openly and can always be reduced to the essential tension between the happy home and the open road, between putting down roots versus moving on West.

Nor is it surprising that, if all other music genres are infested with Democrats and lefties (try to imagine a conservative answer to the “Vote for Change” lineup—Ted Nugent?), country music is the bailiwick of conservatives. I despise the equivalence of “Republican” with “patriotic,” but I’m intrigued by the simple pun offered by “country” music—country as in not the city, but also country as in nation. It is a triangulation that doesn’t always make sense, particularly with the rise in the seventies of outlaw country, on the one hand, and rope-smoking hippie folk like John Denver on the other. Then again, commercial country during that period was not particularly nationalistic. Certainly not like it is today.

    If there was ever any doubt about the general political leanings of country as a whole, it was swept away in the outrage that has dogged the Dixie Chicks, ever since singer Natalie Maines made it known that she thinks George W. Bush is lower than a snake’s belly in a wagonwheel rut. It is one thing for some Euro-fag like Bono to shamelessly diss a sitting Republican president, but quite another for one of country music’s biggest stars to go all lefty. Disloyalty and dissent don’t sit well with country musicians or Republicans these days.
    If you listen very closely, you can hear Woody Guthrie spinning in his grave.

Our Country is a strange, superficial overview of the history of country music currently playing at the Minnesota Zoo’s Imax theater. If you saw it, you had a good time, but you didn’t learn much. And you wondered why it was necessary to play what amounts to a thirty-five-minute music video (lightly salted with an instantly forgotten thumbnail history) on the world’s largest movie screen. You wondered why such a shallow treatment of such a massive subject needed to be told with six-story-high images, and you were left with the main impression that it was in order to show you Lee Ann Womack’s breasts, the size of two Harvestore silos.

The main attraction was the music, of course—some of it good, some of it atrocious, most of it pretty conventional, all of it contemporary. More than a hundred current stars make cameos (Dwight Yoakam, Lyle Lovett, Alan Jackson, Crystal Gayle, Loretta Lynn), playing standards or dressing up like dead heroes such as Patsy Cline or Hank Williams. The overall effect of this long, uneven exercise in “Where’s Waldo?” is one of penance-paying. For all the depredation they have visited on the genre, particularly from the stage of the Grand Ole Opry, the unidentified stars of the movie take their turns playing real country, or a semblance of the same. Country music, like so many other things these days, relies on a reputation for being simple and real, but it’s just show business after all, and just about as fake as a three-dollar bill.

    Maybe the strangest aspect of the film is how it compresses the origins of country music into a single, breathtaking, narrative-free panoramic shot of what is supposed to be Ireland, but looks suspiciously like New Zealand. Apparently, fiddles and pennywhistles made it to the New World by way of a single desperate Irishman who had his Da’s fiddle pressed on him as he shipped for Ellis Island. Thirty years later, Jimmie Rodgers invented country music somewhere in America, and you eventually get Willie Nelson, voilà!

Needless to say, there isn’t much of a storyline to this history. It’s simply a diversion between scenes in the real show—for example, an astonishingly decadent, “We Are the World”-style jam to Pete Seeger’s “Turn, Turn, Turn,” starring everyone from Dolly Parton to Roger McGuinn. A lovely song, but one that’s hardly relevant to the origins of country music, and not considered part of the contemporary canon. There are so many stories to tell along the way—how country begat rockabilly, which begat rock ’n’ roll, for example, or how gospel and swing were folded into various forms of country. As I say, it’s a huge story, and probably one that can’t be told in any amount of footage of any caliber.

    Like modern country music itself, Our Country is pretty inoffensive. It could have been a lot worse. Even the film’s attempt to link country music with God-fearing patriotism is so slight and random—“This generation had its own Pearl Harbor” (September 11), cue “Living in the Promiseland”—as to seem absurd. I guess I can continue to ponder the paradox of what necessary connection there is between conservative politics and country music, and I won’t let it bother me that Lee Ann Womack’s barn-size breasts heave in my face as I do so. That, I think, would be un-American.

  • Fear Factory

    American solipsism is a funny thing. Each of us tends to believe two contradictory destinies await us: On the one hand, incredible luck and wealth will eventually be ours. On the other hand, violent tragedy is one terrorist strike away. This is the American Dream gone to seed and become a psychosis.

    We each believe we are the star of our own made-for-TV movie, which has helped us arrive at a twisted understanding of probability. We know, because we saw it on TV, that someone has to win the lottery, and someone has to be the next victim of violence. Naturally, it may as well be me. This despite all evidence to the contrary.

    If we looked earnestly at our own lives, minus the commercial breaks, we might see that life in general is pretty mellow, its pleasures and pains mostly subtle. There is not a great chance that a hijacked plane will land on your house, or that a van will arrive with a gigantic cashier’s check. In fact, there is no meaningful chance at all. There is a whole world out there that operates independently of our own routines, pleasures, and impediments. We sit down for the ten o’clock news, and the TV collapses time and space; we can’t help fearing the worst and expecting the best for ourselves.

    Sadly, you will not win the lottery. But on the bright side, neither will you be attacked by a terrorist. Terrorism is not about reality, it is about perception. In other words, it is about media manipulation. Anyone serious about terrorism must recognize that media coverage is not the solution to terror; it is terror’s best tool. What would happen if our newspapers relegated all news of terrorism to the back pages? What if the only people who took notice of terror alerts and webcam beheadings were secret government agencies in a position to do something about them? Minnesota Secretary of State Mary Kiffmeyer recently insisted on posting terrorism warnings at Minnesota polling places. She, like any purveyor of terror, wanted you to embrace your fear, no matter how irrational it might be. We haven’t worked out the math, but we’re pretty sure that odds are significantly better that you will be struck by lightning than by a terrorist on November 2. (Statisticians say the annual odds of a lightning strike on an American are 300,000:1.)

    If you insist on being motivated by fear, then it may be more realistic to worry about how you would pay the deductible for an emergency appendectomy. Or how you might find the cash for your children’s orthodontics. Or you might agonize about how your full-time job at minimum wage still puts you well under the federal definition of poverty. (Did we mention that you may no longer qualify for overtime, thanks to new federal regulations?)

    But if these boogeymen are still not sufficiently scary, and you crave the fear you can get only from the Fox News Channel, you might consider that there are twenty-five thousand homicides committed with guns each year in your homeland. There are thirty thousand suicides each year in your homeland. If your main worry is terrorism, you might consider that we are effectively terrorizing ourselves—and not a kaffiyeh in sight.

    Still, we understand that nothing terrorizes like the idea of an enraged Islamic fundamentalist on American soil. So in the interest of fomenting that highly specialized brand of fear, we’ll point out that the War on Terrorism has actually increased the incidence of Islamic terrorism worldwide, not decreased it. This should not be surprising for one simple reason: When anybody takes the trouble to ask, Islamic outliers say their essential complaint is the presence of American “infidels” in Muslim lands. Now consider that your government’s approach to eliminating Islamic terrorists has been to do precisely what angers them most, and what best animates their recruitment efforts: forcibly occupying the world’s most ancient Islamic enclaves.

    But seriously. Can these Al Qaeda nut-jobs reach you? No. If you’re going to leave the house with a tinfoil hat on your head and a handgun in your lunchbox, at least be afraid of the right things for the right reasons. And know that if the lightning doesn’t get you, your own gun probably will. At least your bereaved children will have your lottery winnings.—Hans Eisenbeis

  • Foot in Mouth

    We were surprised when organizers of the Twin Cities Marathon decided last spring that they would henceforth allow only Americans to win our race. This, of course, instantly raised the dander of non-runners, liberals, and non-running liberals throughout the area. Within days of the announcement, the Minneapolis City Council was looking into ways to “punish” Twin Cities Marathon, Inc. Among people who run only once every four years, it looked like discrimination. And so it was.

    To be fair, though, the problem depended entirely on your point of view. City boosters who think of our marathon as a great preening moment—the same people who repeat the unsourced compliment that ours is “the most beautiful urban marathon in America”—understandably believe that our footrace should reflect our values: racial diversity, inclusivity, doughnut holes at all water stops, and so on. But to serious American marathoners, it is a salve to the national ego to win in our own backyard now and again. The Kenyans and Russians can stay at home and dominate their own marathons, thank you very much.

    This is not strictly a local attitude. There are more than three hundred marathons each year in the U.S., and aliens have worn out their welcome at nearly every one of them. For native joggers, winning is not everything—losing is. An American has not won any of our three biggest marathons in two decades. Alberto Salazar won the New York City Marathon in 1982. Greg Meyer won Chicago in 1982 and Boston in 1983. And that was our last hurrah, at least among the marathons that matter.

    See, there’s the rub: We desperately want the Twin Cities Marathon to matter. We want a unique selling point. Boston, New York, and Chicago are the granddaddies of all marathons, and they hardly need to distinguish themselves. Each admits fields in excess of thirty thousand runners. By contrast, we cap ours at ten thousand, we tell ourselves we are beautiful, and we try to keep foreigners out.

    Still, in our own way, we’re just following the cues of the Big Leagues. New York, Chicago, and Boston have all trotted out proposals intended to give a second wind to American pride. Last year, New York introduced the Salazar Award to the top-finishing Americans. Chicago’s plan to double the purse for American winners had the interesting effect of causing a top Kenyan runner to apply for and gain citizenship and continue winning. (If he’s going to be a millionaire, he may as well be an American.)

    It is admirable that Twin Cities Marathon organizers softened their position. Competitive runners can certainly use every reputational boost they can get, since they are hardly the fitness world’s cock of the walk. Following the objections of the Minneapolis City Council, TCM directors decided to offer a general purse for all runners, regardless of which godless country they might come from, and a separate purse for top American finishers.

    The Twin Cities Marathon does have a unique selling point, as it turns out. We have been designated the site for the USA Track & Field National Championship for the next two years, and we have hosted the National Masters Championship for the last fourteen. There was logic behind efforts to institutionalize a policy of exclusion.

    But one should be careful not to break what isn’t fixed. The relatively modest size and location of the Twin Cities Marathon has itself guaranteed a cup of hope to patriots and xenophobes. The truth is, not many foreigners come here, so our odds are good. In fact, a natural-born American won in 2002. (Eddy Hellebuyck, last year’s winner, has American citizenship, but is Belgian by birth.) He was Dan Browne, a Californian, who handily beat dozens of foreigners. But we’ll be keeping an eye on him and his kind. If too many Californians win our race, we can always limit the field to non-Californian Americans.

  • Lofty Ideals

    We were a little disturbed by the explosion. All around the city, there was a sudden, violent eruption of elegant apartments, lofts, row-houses, and condominiums. And it wasn’t just along the Mississippi or in the Warehouse District. It was where a gas station had stood at Fiftieth Street in Linden Hills. It was where something unremarkable had failed at Lake and Bryant. It was in a liquor-store parking lot at Nicollet and Franklin. It was even cropping up on lackluster strips in first-ring suburbs like St. Louis Park and Richfield. What were the developers smoking? When did the Twin Cities become overrun by turtlenecked young executives with seven-figure checking accounts and an aversion to mowing the lawn?

    They say the real-estate industry is recession-proof, but this felt like a powerful case of denial. The economy had soured, employment figures took a dive, higher interest rates thundered on the horizon, coffins trickled out of Iraq, and the country threatened to come apart along the red-blue seam. Meanwhile, Twin Cities contractors built ten thousand new “units.”

    Sometimes our best impulses and our worst converge, and the result is happy. This colonization of the cool may seem wasteful and excessive and unneeded and vainglorious. It may even be morally suspect; surely this is not the low-income, affordable housing we’ve been promised for years now? But we should count our blessings and try not to be so disagreeable.

    We find that the same sour people (in other words, we ourselves) who are complaining about “urban sprawl” and the loss of “green space” are the ones who feel uneasy about the urban building boom. But once we stretch out our knee-jerk reflexes, we realize this is precisely what is needed. If we are serious about putting a lid on the tract mansions of Farmington, and about bringing beautiful people back to the city, then we will have to find some sugar to take with this medicine. These developments are creating what city planners call “density.” That is, more people living in a smaller amount of space. It is what distinguishes a city from a town or a village or a suburb. It is not necessarily a bad thing.

    It is probably true that native Minnesotans are constitutionally turned off by population density. We are essentially a rural people—many of whom have freshly left the farm for the city (but not too much city, if you please). It could be that we are basically misanthropes who prefer to be alone. After all, some have interpreted Minnesota Nice as an icy-smiled predisposition to hate the stranger and the strange.

    Still, let’s not forget that we do have a grand tradition of communitarian spirit. Officially, we care about each other, and we do our best to respond to neighbors in need, and there are times when personal gain does take a rear seat to civic pride—whether it’s light rail or the St. Paul Saints or a web-link to Canadian pharmacies.

    In the last census, the Twin Cities ranked twenty-second among large American cities in population density. Minneapolis contains about seven thousand people per square mile. St. Paul, our narcoleptic capital city, has a bit more elbow room, with around five and a half thousand people per square mile. But with high-density housing “units” springing up like mushrooms all over the Twin Cities, we can be sure that there are more of us fitting into the same space. Perhaps we’ll learn how to get along more earnestly and take care of each other better—and the land given over to Farmington McMansions can be plowed into green meadows once again. Living among irritatingly rich, chic, lawnless people is a small price to pay for the greater good.

  • Don’t Be So Literal

    We probably hang around with the wrong kind of people. For most of June, there were complaints about the American flag. How long did it have to be at half-mast out of respect for a deceased president? This was mostly sour grapes from certain people whose memory of President Ronald Reagan was less rosy than what they were seeing on TV, hearing on radio, and reading in magazines. Oddly enough, we heard it from a number of small-business owners, who began to ask each other, “How much longer?”

    These grumpy folks had good intentions. After all, they wanted to do the right thing, within the limits of the law, while not compromising their own personal views. It is worth pointing out what that rule actually is: flags must be flown at half-mast for thirty days after the death of a former president. Yes, even Richard Nixon got the treatment. (Though we wouldn’t be surprised if many flag-fliers declined to pay this tribute back in May 1994—even after Nixon’s apologists, including President Bill Clinton, had convinced us that his main crime was getting caught. You know, he normalized relations with China!)

    When we don’t know what the rule is, we worry that we might be offering a hollow tribute. We don’t want to go through the motions only because of peer pressure from others who have an unusually elevated and amplified view of the deceased. Also, we’d rather not be confused for the brown-noser who jumps twice as high as he is asked to.

    No worries there. We were simply doing our duty, whatever our affiliations. Flags all the way back up now, thanks.

    In this case, observing the rule to the final hour of the final day was right as rain. There were few people as literal-minded as President Reagan or his most ardent fans. The man who saw the world in black and white—or perhaps in red and white—had a base of support in the Christian right. If there is a single unifying shibboleth of this self-assured and self-righteous bunch, it is their view that the Holy Bible says what it means and means what it says, and “interpretation” is a fancy word for “making stuff up.”

    It is agreeable to believe that the world divides neatly into good and bad, right and wrong, America and everyone else—and to swear on a stack of Bibles that this is so. President Reagan’s unshakable faith in our nation’s divine right was as refreshing in 1980 as it is today, even though it was a dangerous delusion. Was and is. The present version is deadly, in case you haven’t been watching the news lately.

    The problem with a literal interpretation of the Bible, the world, or the rules pertaining to the flag is that reality simply doesn’t work that way. Many thank Reagan for the fall of the Soviet Union. To be sure, it was an almost inconceivable and abrupt end to a paradigm—the Cold War—which had hung over our heads like a black cloud for half a century. But what killed Soviet communism was Soviet communism itself: the literal-minded view that one communitarian system applied to all people at all times in all places. The beauty of capitalism is that it recognizes that people aren’t wired that way. In fact, it has no idea how people are wired, and merely tries to maximize the opportunities for variety. A political and economic system that cultivates individualism—messy, figurative, diverse—is so much more real, it’s a wonder we ever worried about the Reds.

    But perhaps the lesson was never learned. The U.S. itself has slid into an egregiously simple view of itself and the world. Our politicians believe it is semantically impossible for our foreign policy to be wrong, and too many of our people believe in the statistical impossibility that we are all destined to be in the top one percent of the “have-mores.”

    It’s OK to be literal about the half-mast ruling. But we are gratified that most Americans neither know nor observe the dozens of other regulations circumscribing the display of Old Glory. As a nation, we eventually reject systems and dynasties, and embrace details and individuals. We are a nation built for the smorgasbord rather than the prix fixe, and this is why we ultimately will prevail over enemies both foreign and domestic. At least that’s the idea.