12/29/07

Young people are deserting newspapers in droves. What better way to win them back than this?

AOL is finally shooting the Netscape browser in the head. Former Netscape developer Asa Dotzler says good riddance.

12/19/07

Friendblogging twofer: Reyhan writes about Jake's movie! And I'm the one who gave her Jake's email! I'm a degree of separation! Everybody who enjoys laughter: go see Walk Hard!

12/15/07

Like a telephone but for strangers

Maybe three years ago, during a random clicking-around session, I found myself reading a blog that I'm not going to name. It was written by a young schoolteacher who'd moved to a town where she didn't know many people. She wrote about the difficulties of teaching, and how much she missed her friends, and the ways she filled her time. She was a good writer and more specifically a good blog-writer, funny and harsh and immediately trustworthy, and she wrote as though she were talking to a small group of close friends, which for the most part she probably was. But she was also talking to me.

I checked this blog regularly, and I learned more about this person than one usually learns about a non-famous stranger. Once, writing about (I think) babysitting her niece, she wrote that, at times, it made her glad that her womb was a rocky place where seed can find no purchase. I thought about her sometimes. I was rooting for her.

It's a strange relationship that we can now have, over the internet, with people who don't know we exist. Choreographed self-revelation is a very particular skill. It has nothing to do with what I'm trying to do on this blog, really, although I guess there was an element of it in my restaurant reviews. Emily Gould can do it while writing a media-gossip blog, and glenn mcdonald could do it while reviewing records, but this person I'm talking about did it uncut.

Then she wrote something about how the kids she taught had discovered her blog, and she purged everything related to sex/drinking/drugs, which was probably between 40 and 60 percent of the content. A little while later, the site went dead. A while after that, the URL turned into linkspam.

It troubled me that I wasn't going to find out what was happening with her. It was like having an old friend, someone you don't see much, go into the witness protection program: they're still out there, but now you don't know where. I don't think I ever knew her last name.

I was thinking about her this afternoon for some reason, and I got curious, so I Googled the name of the website, and I found the blogs of a bunch of people who seemed to know her. (She has a very bloggy social circle.) And I poked around on their blogs for a while, and I discovered this, which is clearly hers. It's less than two weeks old.

It's about her students, rather than her. It's a bit too kids-say-the-darndest-things for my taste, not that anyone involved has any reason to care about my taste. It doesn't tell me much about how she's doing, except that she's still alive and still teaching and keeping her sense of humor in the face of everything. But that's better than nothing.

Update: See the comments.

Matt Bai makes a really good point about trying to predict how a Republican governor will act in the White House:

We in the media have historically embraced the story of the Republican governor who, it turns out, isn’t as much of a crazy conservative as you might think. Hey, look, the Tin Man has a heart! George W. Bush was a “compassionate conservative” who worked amiably across the aisle with Democrats. Mitt Romney passed a landmark, bipartisan bill to provide healthcare. Even Ronald Reagan enacted a huge tax increase while governor of California. And so on.

There’s just one problem with this formulation, which is worth remembering if Huckabee pulls off a remarkable win on primary night in Iowa: it is a serious misreading of conservative doctrine and a lousy predictor of what’s to come.

Here’s why: ... Under the doctrine of federalism, the government in Washington is supposed to remain meek and disengaged in domestic affairs, leaving policy and funding decisions primarily to the states.... That’s the whole point of the conservative exercise—to make state government set priorities and scale back waste and unnecessary commitments. You’d be hard pressed to find a Republican governor ... who hasn’t had to raise some tax or fight for a worthy social program. That’s just what governors do—especially since almost all of them are bound by law to balance their budgets.

When these governors get to Washington, though, that’s a different story—and there’s nothing inconsistent about it. Then their job, as they see it, isn’t really to govern anymore, but to whack at the hopelessly gnarled federal bureaucracy and push the burden for domestic programs back to the states, where it belongs. So while Bush may have been a pleasant enough conciliator and dealmaker in Texas, he never for a minute confused the demands of that job with the one he had taken on at the White House. And neither, one can presume, would Mike Huckabee. He may have been a reasonably centrist governor, but he’d be a starkly conservative president.

12/14/07

Microsoft's PlaysForSure brand -- the logo that identifies whether a particular non-iTunes online music service will work with a particular non-iPod mp3 player -- has now been renamed Certified for Windows Vista. Even though it has nothing to do with Windows Vista. I guess they wanted to capitalize on all that successful Vista branding. Oh, wait.

As Ars Technica puts it:

Microsoft's PlaysForSure has always been a model of how to run a DRM ecosystem: launch a new scheme with logo, convince device makers to sign up, launch your own online music store that uses said ecosystem, drop your music store, launch your own device which uses incompatible DRM, launch new music store with same incompatible DRM, then change branding of ecosystem logo. On second thought, perhaps there's room for improvement here.
Gotta love the private sector -- it's so efficient and market-driven.

Remember the (very minor) point I made about Nancy Franklin last night? Well, that very same day, Joshua Clover made essentially the exact same point about Franklin's review of Gossip Girl:

Franklin does seem intimately familiar with the zip code in which the show purports to reside.... At the same time, she seems surpassingly oblivious to the culture-consuming world beyond 10028, and what's happening out there. She mentions the show's "primary purpose of marketing pop songs, which are heard throughout." Actually, we're pretty sure that an untested show on the CW isn't marketing "What Goes Around" by a non-New Yorker named Justin Timberlake, and the like. Mr. Timberlake, who comes from a land down under 72nd Street (it's called "Tennessee") may in fact, along with his staff and servants and holding company, be receiving a certain fee for his participation. We are just guessing.
This pattern (two examples is so a pattern) suggests that, to Franklin, whatever's on TV is by definition the white-hot center of the media universe, and everything else (some website, some pop song) is desperate to bask in its reflected glory.

And in case you care way, way more than you probably do: my original point was challenged (by Jossip Jirl herself) in the comments.

Via DF, a 1996 interview with David Foster Wallace on Infinite Jest:

Part of the stuff that was rattling around in my head when I was doing this is that it seems to me that one of the scary things about sort of the nihilism of contemporary culture is that we're really setting ourselves up for fascism. Because as we empty more and more kind of values, motivating principles, spiritual principles, almost, out of the culture, we're creating a hunger that eventually is going to drive us to the sort of state where we may accept fascism just because -- you know, the nice thing about fascists is they'll tell you what to think, they'll tell you what to do, they'll tell you what's important. And we as a culture aren't doing that for ourselves yet.

12/13/07

New Yorker TV critic Nancy Franklin is often pretty good (she totally gets Friday Night Lights, for instance), but her column this week reveals an amusing old-person misconception. Near the end of the piece, Franklin writes that "the CNN/YouTube debates were a great promotional device for YouTube."

According to Internet World Stats there are 1.25 billion people using the Internet. According to Alexa, 17.25 percent of them look at YouTube sometimes. That means there's a total of 216 million people watching YouTube. Meanwhile, the Republican debate was watched by 4.4 million people. If anything, the debates were a great promotional device for the American democratic process; YouTube certainly doesn't need the help.
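
The back-of-envelope math, for anyone who wants to check it (these are the figures cited above, and Alexa's reach number is a rough estimate to begin with):

```python
internet_users = 1.25e9    # Internet World Stats estimate
youtube_reach = 0.1725     # Alexa: share of internet users who visit YouTube
youtube_viewers = internet_users * youtube_reach

print(youtube_viewers / 1e6)    # 215.625 -- call it 216 million
print(youtube_viewers / 4.4e6)  # ~49x the debate's 4.4 million viewers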

12/12/07

You have probably been wondering, "What is fringe Republican presidential candidate Ron Paul's favorite superhero?"

The answer, predictably enough, is Berlin Batman -- the alternate-universe Batman of Weimar Germany, who prevented the work of protolibertarian economist Ludwig von Mises from falling into the hands of the Nazis in Batman Chronicles 11.

Why I love New York: So some folks get on the Q train, and some other people say "Merry Christmas!" and the first group of people, who, it turns out, are Jewish, reply "Happy Hanukkah!" So the second group of people attack them and beat them up.

But! Another passenger -- a "good Samaritan," if you will -- intervenes and helps the Jews fight off the Christians. And he turns out to be a Muslim! And the Jews and the Muslim fight off the Christians together! And then the police come and arrest the Christians, and the Jews invite the Muslim over to celebrate Hanukkah the next night, and everyone is happy, except for the Christians, who are facing assault charges. The end.

12/11/07

It turns out that, if necessary, you can say "Batman is not just a river in Turkey." If it ever comes up.

12/10/07

Remember the first time you saw The Office (the British version)? Remember how Ricky Gervais's portrayal of David Brent made you so physically uncomfortable that you squirmed around in your seat and covered your eyes?

Well, check this dude out.

Weirdly long and poetic article from the Washington Post on how Apple Stores are getting more crowded:

The question so recently was: What is the Apple Store doing to us, as a people?

Now the question is: What are we doing to it?

Can you smother a store to death?

12/6/07

Radar has a good summary of the recent Joe Klein FISA bill controversy, for those who don't have time to read 50,000 words by Glenn Greenwald.

12/4/07

Friendblogging redux: Jake, with John C. Reilly, on Fresh Air, talking about Walk Hard: The Dewey Cox Story, which by the way is fucking hilarious.

12/3/07

Amazing NYT front-pager yesterday: For 20 years, the World Bank and the western countries that give aid to Africa have been demanding that recipient countries not provide subsidies to farmers. Instead, the western donors have advocated a "free-market" paradigm in which poor countries grow cash crops instead of food, then buy food from rich countries (which massively subsidize their own farmers). Without subsidies, African farmers can't afford to buy fertilizer, which means they can't afford to grow food.

Two years ago, after a poor harvest led to a devastating famine, Malawi's president, as the Times puts it, "decided to follow what the West practiced, not what it preached," and reinstated fertilizer subsidies. Now Malawi is selling surplus corn to Zimbabwe, and UNICEF is sending the powdered milk it has stockpiled in Malawi to Uganda instead.

It's an incredible indictment of western aid policy and the failures of free-market dogma. Those who read to the end will be rewarded with a scene in which a village chief performs "a silly pantomime."

11/23/07

The song remains the same

David Brooks has addressed himself to the subject of rock music, and as you'd expect it's a total Rothgasm.

Brooks's problem is that rock is no longer a monolithic entity centered on megastars like the Rolling Stones, Led Zeppelin, or Bruce Springsteen, because "there are now dozens of niche musical genres where there used to be this thing called rock." In making his case, Brooks pulls off a pretty incredible rhetorical trick. Watch as he explains the fragmentation of the pop-music audience:

Music industry executives can use market research to divide consumers into narrower and narrower slices... And there’s the rise of the mass educated class. People who have built up cultural capital and pride themselves on their superior discernment are naturally going to cultivate ever more obscure musical tastes. I’m not sure they enjoy music more than the throngs who sat around listening to Led Zeppelin, but they can certainly feel more individualistic and special.
In other words, the fact that people are listening to a variety of different musicians and genres indicates that they are both (a) sheep who have been brainwashed by "music industry executives," and (b) posers eager to show off their specialness. Whereas back when everyone was grooving on Led Zep together, they were all free-choosing individuals, immune to marketing and peer pressure. Yes, that makes sense.

Brooks trots out Steven Van Zandt to bolster his credibility, but Van Zandt doesn't seem very interested in Brooks's fragmentation narrative. He makes a different argument: the familiar "today's music sucks in comparison to the music that was popular during the years when I was fourteen through twenty-two, which happened to be the greatest music ever made" argument.

Van Zandt "argues that if the Rolling Stones came along now, they wouldn't be able to get mass airtime because there is no broadcast vehicle for all-purpose rock." This is a bit like saying that if World War II were fought today it would be over in five minutes because the Germans would be on the same side as the British and the Americans. If the Rolling Stones came along now, they wouldn't be able to get mass airtime because everyone would think they were ripping off the Rolling Stones.

Hilariously, Van Zandt has
drawn up a high school music curriculum that tells American history through music. It would introduce students to Muddy Waters, the Mississippi Sheiks, Bob Dylan and the Allman Brothers. He’s trying to use music to motivate and engage students, but most of all, he is trying to establish a canon, a common tradition that reminds students that they are inheritors of a long conversation.
Good idea, Miami Steve -- let's sit the kids down in the classroom together, the ones who listen to Justin Timberlake and the ones who listen to Radiohead, the ones who like Lil Wayne and the ones who like the Get-Up Kids, the one who's into Ornette Coleman and the one who's into Deicide and the one who just borrowed her parents' Dylan tapes, and let's explain to them that they should all start listening to the Allman Brothers, because they are the inheritors of a long conversation that culminates with bearded white men making their guitars go hoodly-hoodly-hooooo! I bet that'll work real well.

11/22/07

P. Z. Myers on this week's stem-cell breakthrough:

Americans did not make this discovery; Japanese researchers did. It required understanding of gene expression in embryonic stem cells, an understanding that was hampered in our country. It's going to require much more confirmation and comparison between the induced pluripotent stem cells and embryonic stem cells as part of the process of making this technique useful....

We are not going to be able to grow new organs and tissues for human beings from a few skin cells using this particular technique. It's going to take more work on embryonic stem cells to figure out how to take any cell from your body, and cleanly and elegantly switch it to a stem cell state that can be molded into any organ you need. What this work says is that yes, we'll be able to do that, it isn't going to be that difficult, and that we ought to be supporting more stem cell research right now so we can work out the details.

Or we can just sit back and let the Japanese and Europeans and Koreans do it for us, which is OK, I suppose. Just keep in mind that ceding the research to others means giving them a head start on the development of all the subsequent breakthroughs, too, and that what we're doing is willingly consigning U.S. research in one of the most promising biomedical research fields ever to an also-ran, secondary status.

11/19/07

Can you spot the item that doesn't belong in this series?

And with Hillary's presidential bid, Condi as secretary of state, and an updated ass-kicking Bionic Woman on the air waves, one cannot say we are experiencing a "silencing of women's voices."

11/15/07

Headlines that might at first seem to come from the Onion, but in fact come from the Daily Telegraph: "Surfer dude stuns physicists with theory of everything."

11/13/07

Aww, thanks! Howard Dean thinks Jews can go to Heaven. (He's wrong, of course, but we appreciate the sentiment.)

11/11/07

Maureen Dowd has been punting a lot lately. Today she turns her column over to Saturday Night Live head writer Seth Meyers to do gags about the TV writers' strike. But there's something off about this bit:

As a comedy writer, I am more than willing to admit that I need a world with producers, but do they need us? The answer is yes, for two reasons. First, without writers whom will the studios blame for their failures? Second, seriously, whom?
Does anyone else detect the heavy hand of the Times copy desk here? Or did Meyers really land the joke on whom rather than who?

11/10/07

You heard it here first: RoBros, 11/1/07; NYT, 11/11/07.

11/7/07

Predictable: Type "onion" into Google, and the first result is The Onion.
Less predictable: Type "the" into Google, and the first result is also The Onion.

[Via AlterNet]

11/6/07

Least comforting assertion of the day, from David Brooks's NYT column:

The Bush administration is not about to bomb Iran (trust me).

11/5/07

If you skipped Anthony Lewis's NYT Book Review essay on two recent Bush books, good call -- Lewis is clearly one of those old guys who can no longer identify what counts as common knowledge. (Random cliché sampling: "The one clear winner from the invasion and the consequent civil strife has been neighboring Iran ... Bush seems to lack the intellectual curiosity that makes for an interesting mind ... there is another, less attractive part of the Bush persona: the mean-minded frat boy ... what I think will be seen, along with the Iraq war, as the most important legacy of Bush’s presidency: his effort to enlarge the unilateral power of the president.")

Lewis wraps up this bloviation with a conclusion that's off-base on two counts. He writes:

There is a profound oddity in the position of the presidentialists like Yoo, Cheney and Addington. Legal conservatives like to say that the Constitution should be read according to its original intent. But if there is anything clear about the intentions of the framers, it is that they did not intend to create an executive with more prerogative power than George III had.
I'm not sure this "oddity" really exists. First of all, the statement "legal conservatives like to say that the Constitution should be read according to its original intent" is almost exactly wrong. The poster boy for originalism, Antonin Scalia, in his "Theory of Constitutional Interpretation" speech, said this:
You will never hear me refer to original intent, because as I say I am first of all a textualist, and secondly an originalist. If you are a textualist, you don't care about the intent, and I don't care if the framers of the Constitution had some secret meaning in mind when they adopted its words.
But even if Lewis had accurately represented the views of conservative originalists, that wouldn't mean that executive-branch supremacists like Dick Cheney and David Addington were hypocrites. As far as I know, neither is specifically associated with constitutional originalism. (John Yoo is a slightly more complex case.) Some conservatives believe in being faithful to the Constitution. Some believe in letting the president do whatever the hell he wants at all times. This is not a "profound oddity"; it's an easily observable fact.

Am I wrong about this? People who actually know something about the law, please let me know.

How can it be wrong when it feels so right?

Now available for the Wii: the legendary sequel to Super Mario Bros., never before released outside of Japan. A Nintendo spokesman once suggested that Shigeru Miyamoto might have been depressed when he created it, a claim also made about Shakespeare at the time of King Lear. Chris Suellentrop in Slate:

Again and again, the game uses your familiarity with Super Mario Bros. to subvert the playing experience.... In most games, you trust that the designer is guiding you, through the usual signposts and landmarks, in the direction that you ought to go. In the Real Super Mario Bros. 2, you have no such faith. Here, Miyamoto is not God but the devil. Maybe he really was depressed while making it—I kept wanting to ask him, Why have you forsaken me?

That sadistic torment, however, is central to the game's appeal.... The Real Super Mario Bros. 2 isn't just hard—it's "difficult," like a book or a movie that initially rebuffs you but becomes rewarding as you unlock its secrets.
Or this, from a review on a gaming site:
You must stay alert, concentrated, and you absolutely have to be open to the forced evolution of your style of play. The game designers are out to screw with your head and if you keep the right attitude about you, you’ll find yourself entering a hilariously intimate unspoken conversation with them.... What the game does expertly is lull us into a platformer complacency where we’ll speed along at top clip expecting the game to provide openings and landings for our jumps. Just when you’re at your most comfortable and you’re straddling that controller and spanking its side like you own the world, it’ll slam your face into a brilliantly placed yet avoidable enemy. It shows you the aporias in your game playing philosophy that you didn’t even know existed.

11/4/07

"'Keyboard shortcuts are faster' is a myth" is a myth: In 1992, Tog wrote:

The test I did I did several years ago, frankly, I entered into for the express purpose of letting cursor keys win, just to prove they could in some cases be faster than the mouse. Using Microsoft Word on a Macintosh, I typed in a paragraph of text, then replaced every instance of an "e" with a vertical bar (|). The test subject's task was to replace every | with an "e." Just to make it even harder, the test subjects, when using the mouse, were forbidden to just drop the cursor to the right of the | and then use the delete key to get rid of it. Instead, they had to actually drag the mouse pointer across the one-pixel width of the character to select it, then press the "e" key to replace it.

The average time for the cursor keys was 99.43 seconds, for the mouse, 50.22 seconds. I also asked the test subjects which method was faster, and to a person they reported that the cursor keys were much, much faster.
I have just duplicated Tog's experiment, also using Microsoft Word on a Macintosh. I used a 94-word sample and timed myself with Minuteur. Using the cursor keys took 93 seconds; using the mouse took 239 seconds.

Tog's research is at least 20 years old. It may have been relevant when keyboard shortcuts and computer users were both less advanced than they are now, but those days are gone. And yet the estimable John Gruber linked to Tog's column last week, as though it were something for contemporary users and developers to keep in mind. Someone cites it in a comments thread here. Squelch this revanchist nonsense before it goes any further! Keyboard shortcuts work!

11/3/07

This probably shouldn't be as surprising as it is: A Slate intern named David Sessions is an evangelical Christian. He's written a piece criticizing David Kirkpatrick's Times Magazine cover story on the crackup of the Christian right. The "Christian left," according to Sessions, is an overhyped fringe movement that "gets more attention in the press than it does in the mainstream evangelical community," and the fact that younger Christians have other concerns beyond abortion and homosexuality doesn't mean they're poised to abandon the Republican Party. Sessions's piece is kind of depressing, obviously, but I suspect it's closer to reality than Kirkpatrick's rosy take.

It's strange how strange it seems that Slate has an intern who's an evangelical Christian.

Just read (via DF) this 1989 article by Apple human interface guru Bruce "Tog" Tognazzini. In a nut:

We’ve done a cool $50 million of R & D on the Apple Human Interface. We discovered, among other things, two pertinent facts:

* Test subjects consistently report that keyboarding is faster than mousing.
* The stopwatch consistently proves mousing is faster than keyboarding.
This had a big impact on me. I've been a keyboard-shortcuts guy ever since my first job, where my boss would stand over my shoulder and correct me when she saw me reach for the mouse. Now the first thing I do in a new app is train myself to use the key commands, and I've created custom shortcuts in all the apps I use frequently (e.g. in Word, Command-Option-W for Word Count), and I use Quicksilver to launch apps, open files, search Google, send email, get lunch, basically everything. All of this keyboarding makes me feel very efficient. And now here's Tog himself bringing my world crashing down around me.

But when you think about it, it can't be as simple as Tog suggests. The blanket statement "Mousing is faster than keyboarding" is, presumably, true in certain circumstances. But it can't be true always and everywhere.

I spend a lot of time writing in Word. (I know, I know, but I'm used to it.) I try to write 2,000 words a day, and although I don't always manage it I usually get close enough. Based on a random sample of my prose, that's about 10,865 characters. I enter almost every one of these 10,865 characters into a Word document using the keyboard. According to Tog I should be able to save time by finding the character in the Symbol dialogue box and clicking on it.

Maybe that's a facile example. Tog might say, "Of course, I didn't mean typing words. That's what a keyboard is for. I meant performing other actions."

So here's an example that's more on point: saving. While I'm writing my 2,000 words, I am a saving freak. I save my document reflexively. Whenever I'm not typing, I'm saving. I'm sure I take saving to a useless and neurotic extreme, but it's a harmless neurosis -- the computer can handle all that saving, and it removes a source of worry, and I never have those I just lost two hours' work things that happen to other people.

I do all this saving using the venerable Command-S. I did it just now, after typing that last sentence, autonomically: hands in the resting position, left thumb about an inch to the left (I'm left-handed), left ring finger down. Boom, saved. Not once do I think about the Command key or the S key, just as I don't think about the Shift key or the T key when I begin to type Tog.

I could, instead, use the mouse to go to the File menu's Save command, or to the Save button in the toolbar. I find it hard to believe that would be quicker, but perhaps I'm falling victim to Tog's first point and failing to accurately register the time it takes to hit Command-S. So let's abstractify a little. I can't say for sure how fast I am at hitting Command-S, but I'm definitely faster than I was when I started using a computer. I'm faster than the average computer user, just because I do it so often. Either the speed of mouse-saving is like the speed of light, and there's no way you can ever catch up with it, or at some point I'm going to be faster with the keyboard than with the mouse.

Abstractify one layer further: If a keyboard shortcut is used frequently enough, and the buttons used are convenient and memorable enough, and the mouse alternative is sufficiently complex (identify the Save button from all the buttons on the toolbar, find the cursor, land the cursor on the Save button, click, return hands to the keyboard), then the keyboard shortcut is quicker and less distracting. If I only saved once a day, and the shortcut was Control-Option-Y instead of Command-S, and Microsoft had made the Save toolbar button twice as big, and there were no other buttons next to it on the toolbar, then using the mouse would be quicker.
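
To put toy numbers on that reasoning -- every figure below is an invented illustration, not a measurement:

```python
# Crude model: total cost = uses per day x seconds per use.
KEYBOARD_SAVE = 0.3  # assumed: a practiced Command-S, hands never leave the keys
MOUSE_SAVE = 1.5     # assumed: find cursor, travel to toolbar, pick button, click, return

for uses_per_day in (1, 10, 100, 1000):
    keyboard = uses_per_day * KEYBOARD_SAVE
    mouse = uses_per_day * MOUSE_SAVE
    print(uses_per_day, "saves/day:", keyboard, "s by keyboard,", mouse, "s by mouse")
```

The model is trivially linear, which is the point: whichever method has the lower per-use cost wins, and frequency only multiplies the gap. Practice keeps pushing the keyboard's per-use cost down; the mouse's find-travel-click cost has a floor.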

And what about more data-dense applications? When I'm editing audio in Pro Tools and I need to move my cursor to a particular spot, there are 44,100 possible cursor locations per second of audio and maybe five minutes of audio represented on my screen. I can try to find that spot with the mouse, using repeated clicks of the zoom button, recentering, squinting at the waveforms, then unzooming back to the original view. Or I can hit the Tab key and, using Pro Tools's Tab to Transient feature, allow the software to find the exact spot I need. Is the mouse quicker then?
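
I don't know how Pro Tools actually implements Tab to Transient, but the core idea -- jump the cursor to the next abrupt change in the signal -- can be sketched in a few lines. This naive version compares adjacent samples; real onset detectors work on energy over short windows:

```python
def next_transient(samples, start, threshold=0.2):
    """Return the index of the next abrupt amplitude jump after `start`.

    Naive sketch: compares adjacent samples. Real detectors look at
    energy over short windows, but the cursor-placement idea is the same.
    """
    for i in range(start + 1, len(samples)):
        if abs(samples[i] - samples[i - 1]) > threshold:
            return i
    return None  # no transient found

# Five minutes of 44.1 kHz audio is 5 * 60 * 44100 = 13,230,000 possible
# cursor positions -- hence the appeal of letting the software find the spot.
```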

Tog wrote his piece in 1989, the year the first version of Pro Tools (then known as Sound Tools, which is a better name) was released. He can't be blamed for not knowing about high-resolution audio or video editing. Still, one wonders about the $50 million worth of testing he did. Did he test on anyone who'd spent ten years hitting Command-S as often as I do?

In fact, the answer to the question Which is faster, keyboard or mouse? is not Tog's one-size-fits-all answer (the mouse, and testing proves it!), nor the answer of my old boss (the keyboard, and get your hands off that mouse!). It's For what user, attempting to accomplish what task, under what circumstances?

Update: This 2005 paper (PDF) from the International Journal of Human-Computer Interaction comes down squarely on the side of keyboard shortcuts.

11/1/07

Apparently no one has ever asked the Internet this, so let me be the first: are more black people skateboarding these days? Or is it just in my neighborhood?

I have posted before about Jezebel's "Write Like a Man" feature, but there's a newish episode up and it's kind of a doozy: an anonymous men's-magazine writer compares celebrity profiles in women's magazines with those in men's magazines, and argues that the mensmag ones are better because they're more meta about the total fraudulence and inadequacy of the celebrity-profile form. From the nutgraf, which comes almost exactly halfway through:

The modern mensmag celeb profile is actually a surprisingly prayerful, if superficial, blend of braggadocio and dogged practice.... The work of writing about celebrity is not real work. It's a break from the real work. It is The Writer's Time To Jizz -- a way to keep that writerly muscle loose and limber and tuned up for the next Big Plunge ... for that 14,000-word hillock of ASME-judge porn that all of us contract heroes have got sitting on our laptops. (Many of which, if we're being truthful, are nowhere near as playful or, in a weird way, honest as our best celeb pieces.) ... Each celeb profile becomes a little underdog story, an uplifting tale of a ragtag writer saddled with a task that Nobody Thought He Could Ever Pull Off: Can he spin a few hours' worth of smalltalk and smiles into a revolution?
There's 2,300 words of this stuff, with detailed examples from both sides of the fence. The next time someone claims that the Gawker Empire does nothing but cookie-cutter snark, point them here.

10/30/07

Man, it's fucking amateur hour.

10/26/07

The lines between real news and fake news continue to blur: At a FEMA press conference about the California wildfires, the reporters' role was taken by ... FEMA staffers.

Your questions about those two creepy-looking people on the box of Mastermind, answered at last.

10/25/07

Your recommended reading for today: Sam Anderson's funny review of How to Talk about Books You Haven't Read.

I got my back against the record machine: "So what happens when you’re Van Halen, the last song in your set list is the million-seller 'Jump' with its synthesizer-keyboard opening … and the recording you’re using to play back the synth is accidentally run at 48K instead of 44.1K?"

10/19/07

Does Seann William Scott think it's weird that he's been typecast as the guy whose mom someone wants to fuck?

Update: Nate Fisher, the Connecticut teacher who was fired for giving a student a copy of Eightball, won't be prosecuted.

"In a retouching feat worthy of the great Stalinist purges, Dan DeCarlo has been expunged from the institutional memory of Archie Comics!"

A couple Springsteen links from that Carl Wilson piece linked below (which you should go read now, if you haven't): a nice essay on the resurgence of Bossmania from the Toronto Star, and, uh, footage of Bruce onstage last Sunday. With Win and Régine of Arcade Fire. Doing "Keep the Car Running." Holy fuck.

10/18/07

Beyond the pale

Sasha Frere-Jones was an interesting and useful choice to be the New Yorker's music critic. It would have been easy to imagine the magazine running a column that served as an extension of the Nonesuch/Starbucks/KCRW "music for grownups" movement, reviewing new releases by Norah Jones and Elvis Costello and Wilco as though that were all there was to know about popular music in the 21st century. Instead, SFJ explicates country and crunk and mashups and Mariah Carey for the curious general reader. It is not inevitable that the New Yorker would include writing about these kinds of music; SFJ only makes it seem that way.

But he has this one hobbyhorse, and it is called indie rock.

The idea that indie rock abjures those aspects of rock 'n' roll that derive most directly from black musical forms is neither new nor exceptionable, and the story of indie rock's move away from blackness could be told in a non-pejorative way. In its classic form, indie rock is played on straight downbeats rather than with syncopation (compare Kim Deal's basslines to Keith Richards's guitar parts). Indie singers and guitarists typically don't flatten the third, fifth, and seventh notes of the scale in imitation of the blues. ("In the Velvets we had a rule," Lou Reed once said. "Anyone who played a blue note would be fined.") And archetypal indie bands don't groove -- they don't generate rhythmic tension from the interplay of disparate elements, in the manner pioneered by James Brown (although there are important exceptions here, which we'll get to).

This tendency doesn't begin with what we think of as modern indie, and it certainly didn't start in the '90s. Musically it's the heritage of punk. Perhaps SFJ doesn't want to blame punk; he dodges the question by focussing on the Clash, who blended "pure" punk elements with rootsier sounds like reggae and soul. But the Clash were the exception. Most punk bands, from the Sex Pistols and the Ramones on down, never made records like Sandinista; they never opened their aggressive straight-ahead rhythms to anything swingier. Punk's relationship to black-derived rock 'n' roll is best captured in the Pistols' recording of "Johnny B. Goode": the band tears through the twelve-bar changes while Johnny Rotten complains, "I hate songs like this!"

"Songs like this" -- blues structures, shuffling beats -- had been the template for rock 'n' roll since its inception, and they no longer signified in the way they once had. "Johnny B. Goode" sounded fresh and exciting in 1958 (in part because of they ways it crossed racial boundaries: a countryish narrative set to blues changes, sung by a black man with such precise diction that many listeners believed he was white, accompanying himself on guitar in a style derived from Delta bluesmen), but in 1977 it sounded like your dad's music, a story you've heard a million times from a war that took place before you were born. Punk made rock sound exciting again, and it did so by stripping away the derivative mannerisms, the reflexive note-bending and self-satisfied riffage that had accumulated over thirty years of white men playing black-derived music. Indie rock grew from that fresh start.

Some of the music that grew out of punk restored certain black elements. Talking Heads laid anxious, nerdy vocals over jerky James Brown grooves to make music that was both danceable and ironic; then they found their way through the irony to a kind of transcendence by routing around African American musical forms to straight-up African ones. Gang of Four did something similar with expressly political ends. (This strategy largely disappeared in the 1990s and then returned a few years ago.) Other bands departed from the punk sound without reverting to blues scales or dance beats. They made music that was verdant and mysterious like R.E.M., or dreamy and textured like My Bloody Valentine, or melodically rich like the Shins, or goofy and elusive like Pavement, or idiosyncratically expressive like Neutral Milk Hotel or Yo La Tengo or Radiohead, rather than physical and rhythmic. SFJ names two good reasons why they chose to do this: white musicians became self-conscious about their borrowings, and black musicians gained access to mass media. There's one reason he leaves out: for indie bands, making music that way felt more authentic and expressive, less like regurgitating the received wisdom. If Sasha Frere-Jones finds their music polite and precious and lacking in vigor, that's his right. But I wish he didn't let a narrow and rather arbitrary personal aesthetic (James Brown-style syncopation = good; Beach Boys-style harmonies = bad) get in the way of a useful historical argument.

Update: SFJ touches on some of this stuff in two blog posts. From the second:

Indie bands had good reason to look for uncolonized territory—that’s how art moves, how it lives. A less rosy interpretation is that if indie rock is rooted, at some level, in punk, then this re-sorting was preordained. Johnny Ramone effectively subtracted the blues from rock and roll, and that ideology may have attached itself to the entire project. Maybe the Clash and the Minutemen are exceptions in a long process of establishing a popular music that is structurally determined to escape the blues and its offspring.
Update 2: I have company, according to Slate's Carl Wilson, in a piece densely packed with good points:
Many commentators have pointed out his article's basic problems of consistency and accuracy: ... the conscious and iconoclastic excision of blues-rock from "underground" rock goes back to the '70s and '80s origins of American punk and especially hardcore, from which indie complicatedly evolved.

10/17/07

So much to read! Vanessa Grigoriadis's NYMag feature on Gawker is extremely chewy and satisfying. Margaret Talbot on The Wire is like a big birthday present for me and probably you. And of course it is necessary to say something about Sasha Frere-Jones's takedown of indie rock, which I will do in a little while.

10/13/07

An unusual moment of confusion from the NYT's estimable Jon Pareles: "Even at 160 kilobits per second (Kbps), In Rainbows is a sonic notch above the standard 128 Kbps iTunes download, and on a portable MP3 player through good earphones, it has plenty of detail."

But when it comes to measuring the fidelity of compressed digital audio, bit rate is not the only relevant criterion. Other things being equal, files compressed at 160 Kbps definitely sound superior to those compressed at 128 Kbps. But other things are not equal. The In Rainbows tracks are in MP3 format, whereas iTunes tracks are compressed using the superior AAC codec. To my ears (and I'm not alone), AACs at 128 Kbps are sonically equivalent to MP3s at around 192 Kbps. I am surprised that Pareles doesn't understand this, given how smart he is.

10/9/07

Profiles in courage: So the Democrats are caving to Bush on his illegal wiretaps. But don't call them craven, spineless cowards! For one thing, that's redundant. For another, it's inaccurate. The Dems are actually drawing a firm line: they'll extend the NSA's eavesdropping authority for several years, but they won't legalize it permanently. No way! And you can just quit asking, you Republican bullies!

A senior Democratic aide said House leaders are working hard to make sure the administration does not succeed in pushing through a bill that would make permanent all the powers it secured in August for the N.S.A. “That’s what we’re trying to avoid,” the aide said. “We have that concern too.”
So what the Democrats are saying is this: OK, we're petrified that someone's going to run an ad calling us weak on terror -- so petrified that we'll do anything the president wants us to. But just wait a couple years, and then see if we've grown a pair of testicles!

This NYRB piece on Gordon Brown put me in a really good mood.

The positive view of Brown was set abnormally early. He had been in Number Ten for about thirty-six hours when a car bomb was discovered in London's West End, followed by a failed attack on Glasgow airport. There was no sign of panic. Brown did not rush before the cameras insisting that he was taking personal charge or proclaiming a struggle for civilization, as his predecessor might have done. Instead he had his home secretary, Jacqui Smith, report to the public, making good on his promise to replace the presidentialism of Blair with a return to cabinet government.

When he did comment, following the Glasgow attack, he did so plainly and soberly as if discussing a serious crime rather than an act of war. This fitted Brown's disavowal of the phrase "war on terror," which he believes grants too much status, even dignity, to the murderers of al-Qaeda. The new approach, which instantly took the heat out of the moment, spreading calm rather than panic, won universal plaudits, including from Britain's Muslim communities....

Nowhere was the shift more apparent than in his relationship with the Bush administration. Brown used his first visit to the US in July to signal, by means subtle and overt, that a change had come.... Gone were the chinos, first names, and chummy informality of the Bush–Blair summits. At Brown's request, prime minister and president wore suits and addressed each other formally. Brown wanted to convey that the relationship from now on would be strictly business. Brown's inability to make smalltalk underlined that he did not want to be Bush's buddy and that the "special relationship" would be between Britain and the US rather than between Number Ten and the White House. As one of Brown's allies remarked later: "It was fascinating to watch Gordon turn his pathologies into assets."

10/4/07

Copy-editing the Iraq War, second in a series: Here's what occurred to me reading Gawker today: It's time to stop talking about 3,000 US troops dead and start talking about nearly 4,000.

10/3/07

Sam Harris calls for an end to the term atheism:

Attaching a label to something carries real liabilities, especially if the thing you are naming isn’t really a thing at all. And atheism, I would argue, is not a thing. It is not a philosophy, just as “non-racism” is not one. Atheism is not a worldview—and yet most people imagine it to be one and attack it as such.... Why should we stand obediently in the space provided, in the space carved out by the conceptual scheme of theistic religion? It’s as though, before the debate even begins, our opponents draw the chalk-outline of a dead man on the sidewalk, and we just walk up and lie down in it.

The Pear Cable company is offering a new line of 12-foot audio cables for $7,250. Dave Clark, editor of audiophile site Positive Feedback Online, describes them as "way better than anything I have heard ... very danceable cables." Professional skeptic James Randi offers Clark one million dollars if he can identify the cables in a blind test. Clark hasn't responded.

10/2/07

Return of crazy Wikipedia stuff: A sitcom called Heil Honey I'm Home!, in which Adolf Hitler and Eva Braun have to cope with their new Jewish neighbors -- who could possibly have guessed that this would be canceled after one episode? "The plot of episode 1 involved Adolf telling Eva of the impending arrival of Neville Chamberlain, and begging her not to tell the Goldensteins."

9/29/07

Hendrik Hertzberg has a nice theory on the collapse of the GOP's Steal California Initiative.

9/27/07

Am I wrong, or is GM's new agreement with the UAW a potentially lethal blow to the cause of universal health care? The Detroit automakers, staggering under the weight of employee health insurance costs, were expected to be big players in the lobbying for a universal coverage bill in a Democratic administration. If they've paid off 80 years of health costs upfront, as GM's new contract has the company doing (Ford and Chrysler are expected to enter similar arrangements), they no longer have a stake in the government picking up those costs.

Maybe there's some provision in the agreement about what happens if government-subsidized health care expands. If anyone actually knows anything about this, please educate me in the comments.

Micheline Maynard's detailed story on the contracts doesn't mention the health care policy angle. It does, however, contain this bizarrely wrongheaded paragraph:

Likewise, U.A.W. members, assured of health care benefits that were the envy of the labor movement, had little incentive to take better care of their health, since their generous coverage would pay for most any ailment.
This is what's known as the moral hazard theory. It holds that, when people are insulated from a particular risk, they become less concerned about that risk: if I can buy flood insurance, I'm more likely to build a house on a flood plain. Moral hazard concerns were in the news recently when the Federal Reserve was considering whether to bail out investors hit by the subprime mortgage collapse. Every time the central bank protects investors from losses on risky bets, it's encouraging more risky bets in the future.

Maynard is applying the moral hazard argument to well-insured union autoworkers: their coverage is so good that they have "little incentive to take better care of their health." But this is ludicrous on its face. Health insurance can absorb the financial costs of ill health, but those are far from the only costs. If you're in constant pain, it's no great consolation that someone else pays for your treatments. If you're bedridden you want to get up and go outside, even if your insurance company pays for in-home care. If you learn you're going to die young you still mind, even if you know your family will receive a generous pension. Health care is unlike other economic goods, and treating it like them is one reason this country's health care system is so fucked up. It's surprising to see the Times, in a news story, get this confused.

9/24/07

Politico-linguistic intervention of the day: Listening to Terry Gross's meaty interview with WaPo's Thomas Ricks last week (web, iTunes), it struck me that serious, knowledgeable people have begun using the term ethnic cleansing to refer to what's going on in Iraqi cities and neighborhoods. Besides Ricks, the author of one of the most important books on the war so far, and Gross, who's usually pretty careful with her words, it's been all over the NYT in recent weeks, appearing in news stories and opinion pieces alike. David Brooks writes,

Second, the worst of the ethnic cleansing may be over. For years, Shiites and Sunnis have been purging each other from towns and neighborhoods.
But the problem, for once, is not Brooks's limited intelligence. Paul Krugman has this:
Oh, and by the way: Baghdad is undergoing ethnic cleansing, with Shiite militias driving Sunnis out of much of the city.
An early example is this Time piece headlined "Ethnic Cleansing in a Baghdad Neighborhood?"

The term ethnic cleansing originated, during the Balkan conflict, as a euphemism for genocide, often used by the English-speaking media with deliberate irony. It's a little strange to watch it turn into a legitimate, unironic term for mass displacement, especially since the word cleansing carries an ineradicable whiff of Nazi ideology.

But if we're going to use the term, can we at least use it accurately? Sunni and Shi'a are not ethnicities; they're religious denominations. For the most part, the people involved in homogenizing their neighborhoods, both Sunni and Shiite, are Arabs (although some are Turkmen, on both sides). The process is more properly called sectarian cleansing, or, if you want to get really technical, denominational cleansing, or, if you want to lose the Nazi stuff, sectarian homogenization. The best example of ethnic cleansing during the Iraq War is probably the expulsion of Arabs from Kirkuk, which seems to have calmed down a bit.

American ignorance about Iraq has done enough damage over the past five years; we shouldn't allow sloppy word choices to further cloud our understanding.

9/23/07

A Connecticut high school teacher named Nate Fisher lost his job after giving a 14-year-old female student an issue of Daniel Clowes's Eightball. (It's the one that was republished as Ice Haven, if that means anything to you.) Isn't that just the kind of thing the other Nate Fisher might have done in his twenties? NYMag and Publishers Weekly have commentary on their websites. The school's local TV station weighs in with the kind of insight that characterizes local TV news everywhere.

9/21/07

Slate joins the struggle to impose order on NYMag's Approval Matrix.

Kinsley, bless him, talks sense about the infamous MoveOn ad.

9/12/07

From The Economist:

Once upon a time, the only ideologically acceptable explanations of mental differences between men and women were cultural. Any biologist who dared to suggest in public that perhaps evolution might work differently on the sexes, and that this might perhaps result in some underlying neurological inequalities, was likely to get tarred and feathered.

Today, by contrast, biology tends to be an explanation of first resort in matters sexual. So it is salutary to come across an experiment which shows that a newly discovered difference which fits easily, at first sight, into the biological-determinism camp, actually does not belong there at all.

9/11/07

Weird: Joni Mitchell has a poem in the New Yorker. It's, um, not her best work:

We have poisoned everything
And oblivious to it all
The cell-phone zombies babble
Through the shopping malls

MoveOn's ad referring to General David Petraeus as "General Betray Us" is stupid for two reasons: (a) because hanging your argument on phonetic coincidence is puerile and reeks of powerlessness, and (b) because it gave the Weekly Standard an opportunity to run a story with the headline "MoveOn.org Calls Petraeus a Traitor; Do Democrats in Congress agree?"

HuffPo's Jeffrey Feldman argues that "the word 'betray' used by MoveOn in the ad implies many meanings, but does not directly imply 'traitor' -- unless that definition is introduced."

Unfortunately, it's a bit more complicated than that. Merriam-Webster gives two definitions for traitor:

1 : one who betrays another's trust or is false to an obligation or duty
2 : one who commits treason
What's happened here, obviously, is that the Standard is deliberately confusing the two meanings. It's fair to say that MoveOn was accusing Petraeus of being a traitor in sense one. But TWS would like everyone to believe they accused him of being a traitor in the much more inflammatory sense two.

9/8/07

Who to trust? The NYT's Matt Zoller Seitz: "The new Halloween has sympathy for the Devil, but not enough." LA Weekly's Nathan Lee: "If anything, [Halloween director Rob] Zombie indulges too much sympathy for the devil."

9/7/07

Dumb quotes

This is an apostrophe:
’
It is used to represent letters (and occasionally digits) that have been omitted from contractions such as don’t and Li’l Abner and rock ’n’ roll and the ’80s.

This is a pair of single quotation marks:
‘ ’
They are used to indicate speech-within-speech, as in “I’m a reasonable man,” said David Brooks, “but when someone calls me ‘that fucking moron’ I get a bit upset.”

No one found this confusing until Microsoft Word introduced the “smart quotes” feature. When used with double quotation marks, smart quotes is a useful tool: you type the double-quote character and the software determines whether you want an open-quote mark or a close-quote mark, depending on whether the character comes at the beginning of a word or at the end.

The mistake was to implement smart quotes for single quotation marks as well. Say I want to write cookies and cream, but I want to write it in a cooler, jazzier way, by replacing the first and last letters of the word and with apostrophes. I type

C O O K I E S <space> <apostrophe> N <apostrophe> <space> C R E A M.
Microsoft Word sees the first apostrophe, observes that it comes after a space and before a letter, and decides that it's an opening single quotation mark. The program displays
cookies ‘n’ cream
as though I wanted to cast doubt on the letter n. Calling such a feature “smart” is a bit of a misnomer, I think.

The problem is compounded by the human tendency to trust computers too much. People see the quotation marks in cookies ‘n’ cream and think, Well, the computer put them that way, that must be right.

Taken to an extreme, people will even allow Microsoft Word's stupidity to fuck up the logo for a major Hollywood motion picture.

9/6/07

Another James Wood piece, this one from the Boston Globe. Wood's rep seems to have congealed around the fact that he sometimes criticizes books by famous and admired writers. As a corrective to this unfortunately reductive idea, see his reviews of McEwan's Saturday and Hollinghurst's The Line of Beauty.

9/5/07

We're back! And stealing things straight from Gawker! Check the photo on this NYT story, and then look at the caption, and then scroll down to the very bottom and read the correction.

8/30/07

So yeah, pretty cool about Zack's new job.

8/17/07

Friendblogging: The trailer for Jake's new movie, Walk Hard: The Dewey Cox Story, starring John C. Reilly.

8/16/07

Joshua Green's Atlantic cover story on "the Rove presidency" came out just before Rove called it quits, but it's probably going to be the ur-text on Rove-in-the-White-House -- a deep, intelligent, well-reported look at how the guy who Republicans and Democrats alike had anointed the greatest political genius of his time managed to screw the pooch so royally. The piece is behind the Atlantic's subscriber wall, but you can still read it -- pages one, two, three, four, five, and six -- thanks to the magic of Google caching. [Update: Some of the cached pages are gone, unfortunately.]

One surprise is the parallel between Rove and the Hillary Clinton who botched universal health care in 1993. Both tried to push their own political ambitions through Congress without properly deferring to the congressional leadership, and both thus lost the ability to pass ambitious legislation even when both houses were held by their own party. Something about winning the presidency apparently makes people think they get to call all the shots.

8/15/07

The Observer on James Wood's move to the New Yorker. Interesting fact: Wood has had a standing offer to work for David Remnick for many years.

Midweek readings

We wanted to do comedy that was about something, have the character articulate something about the baby-boomer generation that is now getting old and disconnected with the world. Nobody has properly articulated that.
Steve Coogan on Saxondale

The font is one of the oldest tricks in the book. You typeset text in a regular font, I think this was Rotis, and then you blow it up really big on a Xerox machine and then you shrink it down really small. The trick is to see just how much you can distress it and keep it readable. It's gotten harder to do because Xerox machines are so much better.
Chip Kidd on designing book jackets
for Amis, McCarthy, and Updike

Hansen and NBC News maintain that law enforcement and Dateline simply conduct “parallel investigations” that never influence each other. But by this afternoon, in front of Bill Conradt’s house, whatever wall may have once divided Dateline and the police has essentially collapsed.
Esquire on NBC's "To Catch a Predator"

8/11/07

Dick Cheney, 1994

8/10/07

Interview with Reading Comics author Douglas Wolk:

There were a bunch of chapters where I found myself going, “Dude, you’re talking about the story. Use your eyes. Don’t just read the words. Use your eyes, Douglas.” It’s something that because I’m such a word person that it’s hard for me to do, but I realize also that this is how comics work on my brain. This is how comics work on everybody’s brain. And it’s hard to talk about visual things in words in the same way that it’s hard to talk about music in words.

Maybe once a week I read something about Iraq and I think, This could be a storyline from an episode of The Wire. It's partly that the war in Iraq is perhaps the only national project as misguided in conception and inept in execution as the war on drugs. It's partly the repeated images of incompatible institutions butting up against one another, and of individuals within those institutions struggling and failing to affect the situation. And then, of course, there's all the stuff from The Wire that reminds me of Iraq. (Just one example: the conflict between Stringer and Avon that's the dramatic spine of season three is a conflict between a modern capitalist culture and a premodern "respect" culture, and the way it plays out as tragedy is a perfect mirror of our tragedy in the Middle East.)

Now we learn that David Simon and Ed Burns, the geniuses behind The Wire, are making a miniseries about the early months of the Iraq War.

It's hard when someone makes one of the great works of art, because then you want their next thing to be another one. And now that they've found the perfect subject ... I can only be disappointed by this, really.

It's based on this book, which I haven't read. The bad news is that it's only seven episodes. My big hope is that it'll become open-ended, and that, just as The Wire grew from a show about cops and gangs to take in the longshore union, City Hall, the school district, and soon the media, the new show, which starts from the perspective of a Marine battalion, will incorporate ordinary Iraqis, militia fighters, Al Qaeda in Mesopotamia, the Coalition Provisional Authority, the Iraqi Parliament, the Kurdish peshmerga, reporters with both the western press and Al Jazeera ... like I say, I can only be disappointed.

8/9/07

My #1 intellectual hero James Wood is leaving the New Republic for the New Yorker. I have wondered when this was going to happen. Political correspondent Ryan Lizza made the same move a month ago, which is maybe what's behind this comment from Leon Wieseltier: “The New Republic plays many significant roles in American culture, and one of them is to find and to develop writers with whom the New Yorker can eventually staff itself.” Meow!

8/7/07

Dahlia Lithwick makes a good point about the craven Democrats who just signed off on Bush's wiretapping program: they spent six months blasting Alberto Gonzales for every kind of ineptitude, then decided to let him violate the civil liberties of everyone in the country.

There is virtually no way to reconcile Sen. Mark Pryor's, D-Ark., claim that Gonzales has "lied to the Senate" and needs to go with his vote to expand the reach of our warrantless eavesdropping program. And how can one possibly square Sen. Dianne Feinstein's, D-Calif., claim that the AG "just doesn't tell the truth" with her vote to give him yet more unchecked authority?

A while back I requested a new front-end for the ugly and cumbersome Azureus, which I described as "the only Bittorrent client that meets my (very reasonable) needs: OSX-compatible, allows partial downloading, handles things like tracker announcements and protocol encryption properly." Happily, that's no longer true. With today's release of version 0.80, Transmission is now capable of partial downloading. It's handsome and usable and not written in Java, and thus leaps into position as the OSX BT client of choice. Just thought you'd like to know.

8/5/07

It turns out that Fake Steve Jobs is a senior editor at Forbes named Daniel Lyons. Surprisingly, this was broken by the NYT rather than some obsessive tech blogger. Maybe we do need old media after all.

Update: FSJ himself says the same thing. Also, on the NYT's tech blog, Brad Stone (who broke the story) asks "Are you happy that the mystery has been solved? Or did we just ruin the fun for everyone?" In the comments, 21 out of 23 commenters pick the latter. "Ruined it completely. Sux big time!" writes MS. Obviously, this is a biased sample set, but these folks are, not to put too fine a point on it, total morons. Dennis O'Connor takes the prize for perverse logic with:

Regardless of your infantile need to expose FSJ, we will continue to enjoy his comments if he chooses to continue. He should quit and let the scorn of thousands be heaped around your ears for ruining a good thing.

But he has some stiff competition from Matthew J, who says:

with all of the real news that needs to be slethed by a talented reporter such as yourself, isn’t it more than a little sophmoric to cover this at your paper AND, at the same time, ruin a perfectly good bit of sport?

(Um, how is this ruining a bit of sport rather than participating in it and winning?)

I sympathize with these morons on one point: it was kind of neat when FSJ was anonymous, because you could pretend he was a real person, like the Earth-Two version of Steve Jobs or something. And now we know he's a fictional construct, created by a guy who happens to have a vendetta against the open-source movement. That's kind of a shame, because the pleasure of FSJ is the plausibility of its insights into Steve Jobs's head. I had thought, Yeah, I bet Steve Jobs really does think that the Free Software people are losers. And I still think he probably does, but the fun of speculation is dampened by the fact that this is obviously the author's POV too.

Still, that same observation reveals something interesting: it makes sense that a guy who engages in a long-term ventriloquism project like this one, who spends more than a year thinking "What might Steve Jobs have to say today?", will wind up writing about the topics that interest him, even if he does so through the point of view of his subject. Like if I decided to write a blog in the voice of Fake Steve Martin or Fake Stevie Wonder or Fake Stephen Hawking, I'd end up writing about that fake person's perspective on comic books and Apple. Something like this happens in most fiction, I suspect, although I have so far kept references to comics and Apple to a minimum in my own novel-in-progress.

Plus more: Daniel Lyons's personal blog, the one in his own voice that mostly covers open-source shenanigans, is a funny and interesting window into a world about which I know very little. Most of it is straight reporting/opinionizing, but here's a satirical entry that could have appeared word for word on FSJ.

8/2/07

Surging behind

Remember the surge? The idea behind the surge was that if we pump some more troops into Iraq to jack up security for a few precious months, maybe this will provide enough breathing space for the Iraqi government to hammer out a deal. It was always a Hail Mary, a plan forged out of desperation -- the kind of thing that builds up tension at the climax of a movie but that typically fails miserably in real life.

So let's check in and see how it's working. Question one: have the troop increases led to measurable improvements in security for Iraqis? There is some dispute about this, even within Michael O'Hanlon's brain. So let's give the surge the benefit of the doubt and say it is indeed making Baghdad and Iraq as a whole more secure.

Now we turn to question two: Is the bump in security paying off? I.e., is the Iraqi government responding to the improved circumstances by moving toward a settlement on the tough issues (oil-revenue sharing, de-de-Baathification, federalization)?

Well, no. In fact, the Iraqi government has responded to the surge by falling apart.

This would seem to eliminate the surge's entire rationale, no? Even if there is a rapprochement with the Sunni Accordance Front (anything's possible), it's obvious that political reconciliation in Iraq is moving backward rather than forward.

So how does Defense Secretary Bob Gates explain this mess? “We probably all underestimated the depth of the mistrust and how difficult it would be for these guys to come together on legislation.”

Oh Jesus H. Fuck, it's greeted-as-liberators all over again.

Still, now that Gates has acknowledged the error, presumably we're going to call off the surge and start figuring out how to wind this thing down with as few additional corpses as possible, right? After all, the evidence is in: temporary "breathing room" can't bring Iraq's warring factions to the table.

Nope.

Mr. Gates offered a slightly different formulation on Thursday, arguing that political progress would come when Iraqi Army and police units proved able to take over primary responsibility for maintaining security in areas now largely controlled by American troops.

“I think the key is, not only establishing the security, but being able to hold on to those areas and for Iraqi Army and police to be able to provide the continuity of security over time,” he said. “It’s under that umbrella I think progress will be made at the national level.”

Bear with me, because I'm about to make a sports analogy, and we all know that's not my strong suit. But this is what's called moving the goalposts.

With the speed and alacrity characteristic of the U.S. military, commanders will be reviewing the surge strategy in September, at which point, Gates said (in the words of the NYT), "the administration would have to balance the relative lack of political progress with the somewhat encouraging security trends."

Since that last sports analogy seemed to go OK, I'm going to try something more ambitious: this is like saying that, in assessing a football strategy, we'll have to balance the fact that we didn't score any points and gave up three touchdowns with the fact that we did some really strong blocking. The security trends are only relevant inasmuch as they enable political progress. If the surge could be sustained indefinitely, you could argue that the security improvements benefit Iraqi civilians, the people who have suffered for all our blunders, so better security is a good thing on its own merits. But we can't sustain current troop levels too far into next year, no matter how much we lower recruiting standards/extend the tours of exhausted soldiers/starve commanders in Afghanistan of manpower.

As of now, the surge makes no sense as a military strategy. Until yesterday it had a logic, however optimistic. But now that logic is exhausted, and yet the surge continues.

The only possible conclusion is that this is happening because the surge is not a military strategy at all -- it's a narrative one. It's a way to keep a tired show on the air one more season, like an adorable kid cousin or a Very Special Wedding Episode. The surge is the moment when the Iraq War jumped the shark. Can we please, please cancel it?

Why it's not in Bush's interest to fire Alberto Gonzales.

Haven't posted anything heartbreaking about Iraq recently, so as the Iraqi government -- and with it the rationale for Bush's troop surge -- collapses, I turn as usual to the New Yorker's George Packer, who writes:

After ten days or so, Omer went with a friend to look for his father at the morgue and found a scene of absolute hell. Bodies were stacked two or three high in the hallways, with no refrigeration, the older corpses beginning to decompose and generate maggots. Holding hands, Omer and his friend examined body after body until they found one that had been shot in the torso and might have been his father; they couldn’t be sure. Morgue officials led them to a room where a few dozen Iraqis, many of them women, were staring at six computer monitors. The screens showed a picture of one corpse’s face for a few seconds, then flashed the next face. Now and then, someone in the room would begin to wail. This was the closest thing to “closure” and dignity in death that the victims’ families could expect. Suddenly, the face of Omer’s father appeared on all six screens.

Butler, desnarked

Pulitzer-winning oversharer Robert Olen Butler has sent a lengthy response to Gawker about his loony e-mail re: his wife and Ted Turner. Gawker posted Butler's e-mail sliced into little chunks, with funny/snarky commentary in between. For those who want the soap opera without the commentary, I've put the pieces back together below.

Subject: Can you please give voice to this at your site?

I am sure there are a number of your followers who actually might want to understand this intense letter which was written in an extreme emotional circumstance. They encountered the email with no knowledge of two of the three principal players in the drama. They have only a sound-bite-and-media-spun understanding of the third. I can well see how a first reaction to the email by someone for whom it was not intended might be that it is only a bizarre and inappropriate document worthy of scorn.

But to begin to see the email in a fair way, you must understand this premise: I loved Elizabeth deeply for 13 years. I did not stop loving her when she told me what was happening between her and Ted. I love her still in an altered but sincere way. She loved me. She loves me still, but no longer as her husband. I'm sure many, if not all, of your readers have gone through their own dramas of love and loss. Love is not easily relinquished and it can shift its shape.

My drama of love and loss was particularly intense and had some strikingly unique characteristics. And it presented only a small range of choices, none of them good. In terms of the inevitable news of all this, my primary concern, of course, was with the community she and I lived in. If I had said nothing, the naked facts of the events would have meant that Elizabeth would be savaged by the rumor mill.

Even with the facts of her terrible childhood before them, some of the commenters on this and other forums are saying terrible and cruelly untrue things about her character. With no mitigating interpretation at all offered about what happened in our lives and in our marriage, you can well imagine how much worse the reaction would have been. It's just human nature. Nor would very simple, broad-outline public pronouncements have made any difference. If I had simply said something to the effect of "they're marrying for love and she and I will remain friends and I wish them well," it would not have been believed and the very same false assessment of her would have occurred. The explanation vacuum--even a partial one--especially given Ted Turner's involvement--would have been filled in a way that would have been unfairly critical of Elizabeth. Remember, I'm talking about the circle of our friends and acquaintances and colleagues here. Those were the people I had to focus on, not the wide general public. I never dreamed you all would get this intimately involved.

Either of those two choices--silence or vagueness--would have been the easy way out for me. I had nothing to gain from the letter I wrote unless it was a covert act of rage, an act of passive aggression. It was not. Your readers may not believe that. But my wife and I have warmly and lovingly spoken on the phone virtually every day since the breakup. We are going through this crisis of publicity together in a loving way. She is the one person in the world--the only one other than myself--who can judge if I am raging and aggressive over her. When I said in the email that she knew about, endorsed, and even encouraged the email, that was literally true. I showed the entire email to her before I sent it. She could have said not to do it. She could have significantly altered it. She did not. She made a few suggestions, which I implemented.

And the email was never a mass email. I chose five trusted grad students who know us both the best. I chose half a dozen faculty members who know us both the best. And they were asked, when the rumors reached them, to tell the appropriately nuanced story. Or to tell the fuller story on their own initiative--because everyone would soon know anyway. Yes, I sanctioned the use of the email I sent them in order to explain the circumstances to the people in our community who were hearing about this. Why should I avoid vagueness myself and then force them to be vague? Without that sanction to use the email, the explanation vacuum would have continued to form and be filled with lies. And this process worked exactly as I had hoped. That email went out six weeks ago. And faculty members and students alike have told me that all of the talk around campus and around town has been sympathetic and generous about both of us.

Now as to the intimate nature of the email, this is crucial to understand: there is not a single fact of Elizabeth's or Ted's or my personal lives that the intended audience could not easily have already known. Elizabeth has spoken and written openly, publicly, about everything in her childhood. Ted's persona and the details of the pattern of his love life are widely known (just read Jane Fonda's memoir). I do connect some dots to try to explain why Elizabeth has been drawn to him. But it was not meant to be a judgment against either of them. Ted's own difficult childhood is also public knowledge. We all of us often--some psychologists would say pretty much always--form adult relationships as an acting out of the basic love patterns of childhood relationships. There is nothing unseemly or wrong about this. It is the human condition.

And I tell you absolutely that Elizabeth did not do this for money and Ted did not do it lightly as conquest. They love each other deeply. And given what they've both been through in their lives, I expect them to be very good for each other. I love Elizabeth and her remarkable writing talent. I admire the wide-ranging good works Ted does to preserve the earth and prevent nuclear war. These are admirable people doing important work in the culture and in the world. I sincerely hope they have the rich happiness they deserve.

In spite of my previous chiding of you and your readers, I wish that happiness for all of you, as well. It's dangerous to live too deeply in a world of glib judgmentalism. And man, there is some truly legitimate short-burst writing talent among you all. But I hope at least some of you come to realize that vituperation, no matter how funny or elegantly expressed, is not an art form. Because some of you may well be capable of turning your talent with language--and your ferocious sense of right and wrong--to a more enduring purpose: to exploring, with courage and frankness and humor and compassion and moral insight, the truths of the human heart.