
11/3/08

Spore wrapup: Seed article on the conflict among the game's developers: the "cute team" versus the "science team." On the Spore messageboard, disappointed fans react with a thread called "We Found Who to Tar and Feather!" Biologist P. Z. Myers bemoans the dumbed-down science.

9/20/08

Random bashing of traditional targets as respite from DFW grief part 1: Mary Jo Foley reports that Steve Ballmer's going to stay at Microsoft longer than he once claimed: "According to scuttelbutt [sic] from Microsoft’s annual employee meeting, which was held in Seattle on September 18, Ballmer told attendees that he is going to stay on at Microsoft until Microsoft’s search share exceeds Google’s." Why not just say until hell freezes over?

5/24/08

Cool LAT piece on Inara George's new record with Van Dyke Parks. Beyond the article's inherent interest, it also reveals how bad the LAT website's contextual referral software is. Inara says, with regard to her many side projects, "I think you just make sure you say yes to the things that you really want to do and no to the things that feel they're extra fat," so the website offers a link to "How to build six-pack abs: For that ripped, lean look, you need strength training -- and very little body fat in targeted areas." And producer Mike Andrews charmingly compares Inara's music to "beachfront real estate on an island that no one's ever visited," which prompts the suggestion, "America's best beaches: Dr. Beach ranks the country's top 10 beaches." Someone build a metaphor-recognition algorithm in there, quick!
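
My guess, for what it's worth, is that the guts of that kind of "contextual referral" engine are not much smarter than the toy sketch below: match canned links against literal keywords in the sentence, metaphors be damned. (The code is purely illustrative; the link table and function name are mine, not anything the LAT actually runs.)

    # Toy keyword-triggered "related links" matcher -- a guess at the general
    # approach, not the LAT's actual system.
    RELATED_LINKS = {
        "fat": "How to build six-pack abs",
        "beach": "America's best beaches: Dr. Beach ranks the top 10",
    }

    def suggest_links(sentence):
        # Purely literal matching, so a metaphor ("extra fat", "beachfront
        # real estate") trips the same wire a diet or travel story would.
        words = sentence.lower().split()
        return [link for trigger, link in RELATED_LINKS.items()
                if any(trigger in word for word in words)]

    print(suggest_links("say yes to the things you want and no to the extra fat"))
    # -> ['How to build six-pack abs']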

4/30/08

Former Microsoftie Joel Spolsky on "Windows Live Mesh" (that's the real name of the service; it's just so stupid that I had to apply scare quotes as a prophylactic):

I shouldn't really care. What Microsoft's shareholders want to waste their money building, instead of earning nice dividends from two or three fabulous monopolies, is no business of mine. I'm not a shareholder. It sort of bothers me, intellectually, that there are these people running around acting like they're building the next great thing who keep serving us the same exact TV dinner that I didn't want on Sunday night, and I didn't want it when you tried to serve it again Monday night, and you crunched it up and mixed in some cheese and I didn't eat that Tuesday night, and here it is Wednesday and you've rebuilt the whole goddamn TV dinner industry from the ground up and you're giving me 1955 salisbury steak that I just DON'T WANT.

4/1/08

If you made it through my "Disagreeing with Paul Graham," you might like to know that there's a lively comments thread about it on Graham's social news site, Hacker News. Graham responds here; I respond to his response here. Also:

This reads like Mister Spock's review of a punk rock concert.

"I don't understand why the audience was asked if it was 'ready to rock', as they clearly did not have instruments. Also, it was illogical to ask them to 'fight the power', since power is an abstract physical quantity that cannot meaningfully be 'fought'."

3/29/08

Disagreeing with Paul Graham

Paul Graham has provided a primer on types of disagreement, from name-calling through ad hominem to refutation. This is helpful, because I disagree with something he wrote recently, and I want to do it properly.

(Interjection for non-initiates: Paul Graham is a computer programmer who sold a startup to Yahoo in 1998. Now he's an investor with Y Combinator, which provides seed funding for startups. I recommend his essays on programming to curious laypeople; here are three good ones.)

I'm certainly not qualified to disagree with Graham on the subjects of computers, programming languages, or startup companies. (Graham would accuse me of directing an ad hominem attack against myself: "Saying that an author lacks the authority to write about a topic is a variant of ad hominem—and a particularly useless sort, because good ideas often come from outsiders." But this case points to one of the legitimate uses of ad hominem judgments: as a timesaving filter. The number of possible criticisms of anything is infinite, and we need quick ways to weed some of them out. One good way to do that is to ignore arguments from people who are patently unqualified to speak on a subject. This is how physics professors avoid wasting their lives refuting specious arguments from cranks. Using ad hominem judgments this way doesn't really count as disagreeing, though.)

I'm going to disagree with an essay on a more general topic: "You Weren't Meant to Have a Boss."

"The most powerful form of disagreement is to refute someone's central point," Graham tells us. "And that means one has to commit explicitly to what the central point is." The central point of "You Weren't Meant to Have a Boss" is, "It will always suck to work for large organizations, and the larger the organization, the more it will suck," because "humans weren't meant to work in such large groups."

How does Graham back up this contention? With direct observation, and with an argument from evolution.

Direct observation first: Graham saw some big-company programmers in a café, and they looked less alive than the startup founders he works with. (He doesn't directly say they look less alive; he says it indirectly: "Lions in the wild seem about ten times more alive [than lions in a zoo]. And seeing those guys on their scavenger hunt was like seeing lions in a zoo after spending several years watching them in the wild.")

One problem with direct observation is that it's hard to get a representative sample. It can be done, but you have to make it a deliberate project and put some time and thought into it, rather than just stumbling on a bunch of programmers in a café. And an unrepresentative sample is prone to distortion. There's no way to tell what's signal and what's noise. The programmers Graham saw in Palo Alto might all have a boss who's an asshole. They might be working on a particularly boring project. Maybe the correct conclusion to draw from them is "People who work for an asshole look less alive," or "People who work on boring projects look less alive."
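
To put made-up numbers on the sampling problem: suppose big-company programmers in general are reasonably lively, but the eight you happen to see in one café are all on the same dreary project. A quick simulation (every figure below is invented; only the shape matters) shows how far off the café reading can be.

    import random

    # Invented "aliveness" scores, 0-10. The population is middling on average;
    # one team stuck on a boring project (or under a bad boss) scores low.
    random.seed(1)
    population = [random.gauss(6.0, 1.5) for _ in range(10_000)]
    bored_team = [random.gauss(3.0, 0.5) for _ in range(8)]   # the eight people in the cafe

    print(f"population average: {sum(population) / len(population):.1f}")   # about 6
    print(f"cafe sample average: {sum(bored_team) / len(bored_team):.1f}")  # about 3
    # The cafe sample says "big-company programmers look half-dead"; all it
    # really measures is that particular team during that particular week.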

Another problem with direct observation is that humans have a tendency to see their biases confirmed everywhere they look. (That's the definition of a bias: something that keeps you from seeing straight.) Graham is a startup founder and an advocate for the founding of startups. He goes to a café, and he sees some non-founders, and he thinks they look less alive than the founders he knows. The fact that this observation confirms his preexisting belief in the worthiness of startups makes it less credible than if the same observation were made by someone who had no particular interest in startups.

(This is a case where an ad hominem argument is relevant and valid. When a writer uses his own observations as evidence, it's legitimate to question his observational ability. It's the equivalent of me saying, "This elephant only weighs 20lb," and you saying, "The scale you're using to weigh it is broken.")

Besides his experience in the café, Graham cites unnamed written sources and unspecified personal experiences, which he uses to bring evolution into his argument: "what I've read about hunter-gatherers accords with research on organizations and my own experience to suggest roughly what the ideal size is: groups of 8 work well; by 20 they're getting hard to manage; and a group of 50 is really unwieldy." It's notable how thin this citation is. But let's stipulate that his reading is correct, and that our forebears did their hunting and gathering in groups of eight or so.

When you break it down, Graham's argument from evolution goes like this: our ancestors worked in groups of eight or so, therefore humans evolved to work in groups of eight or so, therefore contemporary humans will be more alive and fulfilled working in groups of eight or so.

In a general sense, this is the logical fallacy known as the "appeal to nature" -- the idea that what's natural is ipso facto good or right. There is no reason to believe this: plenty of natural things are neither good nor right.

More specifically, it's a popular contemporary version of the appeal to nature: the idea that living in ways that fit with our evolutionary design will make us happy. (Graham describes startup founders and wild lions as "both more worried and happier at the same time.") This is a superficially convincing notion, but there's no reason to think it's true. Evolution has no particular interest in our happiness. A creature that's perpetually dissatisfied, always striving for advantage, wins out over a creature that's happy. (This is why Buddhist monks, who try to eliminate striving and attain happiness, spend decades performing meditations that to most people seem unbearably tedious and effortful: they're trying to override their brains' natural tendencies to striving and unhappiness, and that takes a lot of work.) There's no reason to think that primitive hunter-gatherers were any happier than we are, and even less reason to imagine we'll be happier if we imitate their management practices.

But rather than pointing out fallacies, a better way to refute Graham's evolutionary argument is by reductio ad absurdum. His argument goes like this: our ancestors worked in groups of eight or so, therefore humans evolved to work in groups of eight or so, therefore contemporary humans will be more alive and fulfilled working in groups of eight or so. What else follows from that argument?

Well, our ancestors worked in hunting and gathering. They didn't work as computer programmers. Therefore humans evolved to work as hunter-gatherers; therefore contemporary humans are more alive when they're foraging for food than when they're programming computers. (Suggested title for Graham's next essay: "You Weren't Meant to Have a Chair.") Our ancestors lived in a world that was shrouded in darkness half the day, therefore we would be happier without electric lights.

One difficulty with disagreeing with people is that you have to present their argument and your argument, and so your essay ends up being longer than theirs.

3/2/08

Is Philip Roth having cybersex? Surely that's the implication of these remarks, from an interview with Der Spiegel:

SPIEGEL: You have email and don't use it?
Roth: I use it with one person, one person only, because I don't... I don't want to be bothered.
SPIEGEL: May we ask who the one person is?
Roth: One person. I have to have some fun.

2/1/08

If Microsoft gets its paws on Yahoo, it'll be a disaster for both companies. The reason Microsoft hasn't managed to build a decent online presence in, what, twelve years is not that there are no smart people there. It's that Microsoft is constrained by the need to protect its shrinkwrap software business. Windows and Office bring in maybe $50 billion a year (revenues), at absurd profit margins. Google wants to replace those megabucks with the (much smaller) ad revenues from cloud-computing services like Google Docs. Microsoft's efforts at competing are half-assed and hamstrung, because MS doesn't want successful network services. (This is also why MS uses its crappy web browser's huge market share, gained by exploiting a monopoly, to retard the development of web standards: they want the web to suck.)

If the deal goes through, Microsoft's interests become Yahoo's interests, and one of the first great web companies will be conscripted into a rearguard action against the web itself.

1/1/08

Tim O'Reilly draws an interesting analogy between financial markets and internet services:

One of the real wake-up calls was the way that Wall Street firms moved from being brokers to being active players "trading for their own account." ... Bill Janeway [points] out that ... "now, the direct investment activities of a firm like Goldman Sachs dwarf their activities on behalf of outside customers."
And sure enough, there is lots of evidence that this process is already far advanced [in web services]. These sites, once devoted to distributing attention to others, are increasingly focused on consuming as much of the user attention as possible. What else do you make of Google's recent sally against Wikipedia, the so-called knol....
As Google's growth slows, as inevitably it will, it will need to consume more and more of the web ecosystem, trading against its former suppliers, rather than distributing attention to them.

Update: Y'know, on reflection this is one of those analogies that make less sense the more you think about them. Like, Wikipedia isn't Google's customer. It's not Google's competitor, either. As O'Reilly himself says, it's a supplier. A better analogy would be when a car company decides to stop getting a certain part from a subcontractor and starts making it in-house. Which, when you look at it that way, big whoop for everyone but the subcontractor.

12/29/07

AOL is finally shooting the Netscape browser in the head. Former Netscape developer Asa Dotzler says good riddance.

12/14/07

Microsoft's PlaysForSure brand -- the logo that identifies whether a particular non-iTunes online music service will work with a particular non-iPod mp3 player -- has now been renamed Certified for Windows Vista. Even though it has nothing to do with Windows Vista. I guess they wanted to capitalize on all that successful Vista branding. Oh, wait.

As Ars Technica puts it:

Microsoft's PlaysForSure has always been a model of how to run a DRM ecosystem: launch a new scheme with logo, convince device makers to sign up, launch your own online music store that uses said ecosystem, drop your music store, launch your own device which uses incompatible DRM, launch new music store with same incompatible DRM, then change branding of ecosystem logo. On second thought, perhaps there's room for improvement here.

Gotta love the private sector -- it's so efficient and market-driven.

11/7/07

Predictable: Type "onion" into Google, and the first result is The Onion.
Less predictable: Type "the" into Google, and the first result is also The Onion.

[Via AlterNet]

11/4/07

"'Keyboard shortcuts are faster' is a myth" is a myth: In 1992, Tog wrote:

The test I did I did several years ago, frankly, I entered into for the express purpose of letting cursor keys win, just to prove they could in some cases be faster than the mouse. Using Microsoft Word on a Macintosh, I typed in a paragraph of text, then replaced every instance of an "e" with a vertical bar (|). The test subject's task was to replace every | with an "e." Just to make it even harder, the test subjects, when using the mouse, were forbidden to just drop the cursor to the right of the | and then use the delete key to get rid of it. Instead, they had to actually drag the mouse pointer across the one-pixel width of the character to select it, then press the "e" key to replace it.

The average time for the cursor keys was 99.43 seconds, for the mouse, 50.22 seconds. I also asked the test subjects which method was faster, and to a person they reported that the cursor keys were much, much faster.

I have just duplicated Tog's experiment, also using Microsoft Word on a Macintosh. I used a 94-word sample and timed myself with Minuteur. Using the cursor keys took 93 seconds; using the mouse took 239 seconds.
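
If you want to rerun the test on your own machine, a throwaway script along these lines will prepare the sample and time you. (The sample paragraph and the timing method are stand-ins; substitute any text of about 94 words.)

    import time

    # Stand-in sample; paste in any ~94-word paragraph of your own.
    SAMPLE = ("Now is the winter of our discontent made glorious summer by this "
              "sun of York, and all the clouds that loured upon our house in the "
              "deep bosom of the ocean buried.")

    # Tog's setup: replace every "e" with a vertical bar, then time yourself
    # restoring them, once with cursor keys only and once with the mouse only.
    print(SAMPLE.replace("e", "|"))

    start = time.time()
    input("Fix every | back to an e in your editor, then press Return: ")
    print(f"elapsed: {time.time() - start:.1f} seconds")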

Tog's research is at least 20 years old. It may have been relevant when keyboard shortcuts and computer users were both less advanced than they are now, but those days are gone. And yet the estimable John Gruber linked to Tog's column last week, as though it were something for contemporary users and developers to keep in mind. Someone cites it in a comments thread here. Squelch this revanchist nonsense before it goes any further! Keyboard shortcuts work!

11/3/07

Just read (via DF) this 1989 article by Apple human interface guru Bruce "Tog" Tognazzini. In a nut:

We’ve done a cool $50 million of R & D on the Apple Human Interface. We discovered, among other things, two pertinent facts:

* Test subjects consistently report that keyboarding is faster than mousing.
* The stopwatch consistently proves mousing is faster than keyboarding.

This had a big impact on me. I've been a keyboard-shortcuts guy ever since my first job, where my boss would stand over my shoulder and correct me when she saw me reach for the mouse. Now the first thing I do in a new app is train myself to use the key commands, and I've created custom shortcuts in all the apps I use frequently (e.g. in Word, Command-Option-W for Word Count), and I use Quicksilver to launch apps, open files, search Google, send email, get lunch, basically everything. All of this keyboarding makes me feel very efficient. And now here's Tog himself bringing my world crashing down around me.

But when you think about it, it can't be as simple as Tog suggests. The blanket statement, "Mousing is faster than keyboarding" is, presumably, true in certain circumstances. But it can't be true always and everywhere.

I spend a lot of time writing in Word. (I know, I know, but I'm used to it.) I try to write 2,000 words a day, and although I don't always manage it I usually get close enough. Based on a random sample of my prose, that's about 10,865 characters. I enter almost every one of these 10,865 characters into a Word document using the keyboard. According to Tog I should be able to save time by finding the character in the Symbol dialogue box and clicking on it.
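
(The arithmetic behind that figure is just characters divided by words: about 5.4 characters per word, spaces and punctuation included, in my sample. If you want to check the ratio against your own prose, a few lines will do it; the short sample string below is a placeholder.)

    # Placeholder sample; paste in a longer chunk of your own prose for a
    # meaningful estimate.
    sample = "I spend a lot of time writing in Word. I try to write 2,000 words a day."

    per_word = len(sample) / len(sample.split())
    print(f"{per_word:.2f} characters per word, spaces and punctuation included")
    print(f"2,000 words is roughly {round(2000 * per_word):,} keystrokes")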



Maybe that's a facile example. Tog might say, "Of course, I didn't mean typing words. That's what a keyboard is for. I meant performing other actions."

So here's an example that's more on point: saving. While I'm writing my 2,000 words, I am a saving freak. I save my document reflexively. Whenever I'm not typing, I'm saving. I'm sure I take saving to a useless and neurotic extreme, but it's a harmless neurosis -- the computer can handle all that saving, and it removes a source of worry, and I never have those I just lost two hours' work things that happen to other people.

I do all this saving using the venerable Command-S. I did it just now, after typing that last sentence, autonomically: hands in the resting position, left thumb about an inch to the left (I'm left-handed), left ring finger down. Boom, saved. Not once do I think about the Command key or the S key, just as I don't think about the Shift key or the T key when I begin to type Tog.

I could, instead, use the mouse to go to the File menu's Save command, or to the Save button in the toolbar. I find it hard to believe that would be quicker, but perhaps I'm falling victim to Tog's first point and failing to accurately register the time it takes to hit Command-S. So let's abstractify a little. I can't say for sure how fast I am at hitting Command-S, but I'm definitely faster than I was when I started using a computer. I'm faster than the average computer user, just because I do it so often. Either the speed of mouse-saving is like the speed of light, and there's no way you can ever catch up with it, or at some point I'm going to be faster with the keyboard than with the mouse.

Abstractify one layer further: If a keyboard shortcut is used frequently enough, and the buttons used are convenient and memorable enough, and the mouse alternative is sufficiently complex (identify the Save button from all the buttons on the toolbar, find the cursor, land the cursor on the Save button, click, return hands to the keyboard), then the keyboard shortcut is quicker and less distracting. If I only saved once a day, and the shortcut was Control-Option-Y instead of Command-S, and Microsoft had made the Save toolbar button twice as big, and there were no other buttons next to it on the toolbar, then using the mouse would be quicker.
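
Here are some made-up numbers to show the shape of that trade-off. The timings are guesses and the practice curve is invented; the only point is that a well-placed shortcut starts out slower than the mouse and ends up several times faster once it's automatic.

    MOUSE_SAVE = 1.5   # seconds per save via the toolbar button (a guess)

    def keyboard_save(times_used):
        # Per-use cost of the shortcut: slow while you still have to think about
        # it, approaching bare keystroke time once it's automatic. Invented curve.
        return 0.25 + 3.0 / (1 + times_used)

    for n in (1, 10, 100, 1000):
        print(f"after {n:>4} uses: Command-S {keyboard_save(n):.2f}s vs mouse {MOUSE_SAVE:.2f}s")
    # The shortcut loses on the first try and wins handily ever after.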

And what about more data-dense applications? When I'm editing audio in Pro Tools and I need to move my cursor to a particular spot, there are 44,100 possible cursor locations per second of audio and maybe five minutes of audio represented on my screen. I can try to find that spot with the mouse, using repeated clicks of the zoom button, recentering, squinting at the waveforms, then unzooming back to the original view. Or I can hit the Tab key and, using Pro Tools's Tab to Transient feature, allow the software to find the exact spot I need. Is the mouse quicker then?
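
(For a sense of scale: 44,100 samples per second times 300 seconds is about 13 million possible cursor positions on screen. The sketch below is not Pro Tools's actual algorithm, just the shape of the idea -- the function name and threshold are mine: let the software scan forward for the next sharp jump in amplitude instead of hunting for it by zooming.)

    def next_transient(samples, cursor, threshold=0.3):
        # Return the first sample index after `cursor` where the amplitude jumps
        # by more than `threshold` from the previous sample.
        for i in range(cursor + 1, len(samples)):
            if abs(samples[i] - samples[i - 1]) > threshold:
                return i
        return cursor   # nothing found; stay put

    audio = [0.0] * 1000
    audio[437] = 0.9                    # a fake drum hit somewhere in the clip
    print(next_transient(audio, 0))     # -> 437, found in one keystroke's worth of work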

Tog wrote his piece in 1989, the year the first version of Pro Tools (then known as Sound Tools, which is a better name) was released. He can't be blamed for not knowing about high-resolution audio or video editing. Still, one wonders about the $50 million worth of testing he did. Did he test on anyone who'd spent ten years hitting Command-S as often as I do?

In fact, the answer to the question Which is faster, keyboard or mouse? is not Tog's one-size-fits-all answer (the mouse, and testing proves it!), nor the answer of my old boss (the keyboard, and get your hands off that mouse!). It's For what user, attempting to accomplish what task, under what circumstances?

Update: This 2005 paper (PDF) from the International Journal of Human-Computer Interaction comes down squarely on the side of keyboard shortcuts.

8/7/07

A while back I requested a new front-end for the ugly and cumbersome Azureus, which I described as "the only Bittorrent client that meets my (very reasonable) needs: OSX-compatible, allows partial downloading, handles things like tracker announcements and protocol encryption properly." Happily, that's no longer true. With today's release of version 0.80, Transmission is now capable of partial downloading. It's handsome and usable and not written in Java, and thus leaps into position as the OSX BT client of choice. Just thought you'd like to know.

8/5/07

It turns out that Fake Steve Jobs is a senior editor at Forbes named Daniel Lyons. Surprisingly, this was broken by the NYT rather than some obsessive tech blogger. Maybe we do need old media after all.

Update: FSJ himself says the same thing. Also, on the NYT's tech blog, Brad Stone (who broke the story) asks "Are you happy that the mystery has been solved? Or did we just ruin the fun for everyone?" In the comments, 21 out of 23 commenters pick the latter. "Ruined it completely. Sux big time!" writes MS. Obviously, this is a biased sample set, but these folks are, not to put too fine a point on it, total morons. Dennis O'Connor takes the prize for perverse logic with:

Regardless of your infantile need to expose FSJ, we will continue to enjoy his comments if he chooses to continue. He should quit and let the scorn of thousands be heaped around your ears for ruining a good thing.

But he has some stiff competition from Matthew J, who says:
with all of the real news that needs to be slethed by a talented reporter such as yourself, isn’t it more than a little sophmoric to cover this at your paper AND, at the same time, ruin a perfectly good bit of sport?

(Um, how is this ruining a bit of sport rather than participating in it and winning?)

I sympathize with these morons on one point: it was kind of neat when FSJ was anonymous, because you could pretend he was a real person, like e.g. the Earth-Two version of Steve Jobs or something. And now we know he's a fictional construct, created by a guy who happens to have a vendetta against the open-source movement. That's kind of a shame, because the pleasure of FSJ is the plausibility of its insights into Steve Jobs's head. I had thought, Yeah, I bet Steve Jobs really does think that the Free Software people are losers. And I still think he probably does, but the fun of speculation is dampened by the fact that this is obviously the author's POV too.

Still, that same observation reveals something interesting: it makes sense that a guy who engages in a long-term ventriloquism project like this one, who spends more than a year thinking "What might Steve Jobs have to say today?", will wind up writing about the topics that interest him, even if he does so through the point of view of his subject. Like if I decided to write a blog in the voice of Fake Steve Martin or Fake Stevie Wonder or Fake Stephen Hawking, I'd end up writing about that fake person's perspective on comic books and Apple. Something like this happens in most fiction, I suspect, although I have so far kept references to comics and Apple to a minimum in my own novel-in-progress.

Plus more: Daniel Lyons's personal blog, the one in his own voice that mostly covers open-source shenanigans, is a funny and interesting window into a world about which I know very little. Most of it is straight reporting/opinionizing, but here's a satirical entry that could have appeared word for word on FSJ.

7/25/07

If you haven't been reading Fake Steve Jobs lately, he's been on something of a roll. Here's Caroline McCarthy on the anonymous blogger's old-school charm:

In a culture captivated--obsessed, even--by the antics of high society, an anonymous satirist starts publishing over-the-top missives purporting to be from an insider in that privileged niche. In the process, the faux-mogul skewers political elites, entertainers, business titans, and ordinary people in a way that's at once outlandish and provocative, hilarious and appalling. It reeks of Swift or Dickens or Twain.

Plus: Andy Ihnatko forthrightly denies that he's FSJ:

I say this here and now, without a single wink or ironic note: I’m not him. I had nothing to do with the blog’s creation and have never had the slightest thing to do with any of its content.

6/25/07

Danah Boyd observes a burgeoning class schism between Facebook and MySpace:

The goodie two shoes, jocks, athletes, or other "good" kids are now going to Facebook. These kids tend to come from families who emphasize education and going to college. They are part of what we'd call hegemonic society. They are primarily white, but not exclusively. They are in honors classes, looking forward to the prom, and live in a world dictated by after school activities.

MySpace is still home for Latino/Hispanic teens, immigrant teens, "burnouts," "alternative kids," "art fags," punks, emos, goths, gangstas, queer kids, and other kids who didn't play into the dominant high school popularity paradigm. These are kids whose parents didn't go to college, who are expected to get a job when they finish high school. These are the teens who plan to go into the military immediately after schools. Teens who are really into music or in a band are also on MySpace. MySpace has most of the kids who are socially ostracized at school because they are geeks, freaks, or queers.

6/18/07

As a Mac user I mostly ignore Windows, but sometimes I stumble upon something that seems to sum up the entire Windows experience. Like this sentence, from a Lifehacker post: "That means the whole process of hunting down obscure error messages—especially those containing cryptic error codes—just got a whole lot easier."

6/11/07

Far from the tree

When I heard the Safari-for-Windows rumor, my response was, "It'll never happen -- Apple is a hardware company." Apple's business model is: (1) make objects; (2) sell them for a profit. They make OSX to sell Macs. They make iTunes (for Mac and PC) to sell iPods. How could Safari for Windows help their bottom line? If I had had more time yesterday I would have written a post to that effect, linked to Mary Jo Foley's blog, and looked like a moron.

So now I'm confused.

The main justification I've heard is that WinSafari is a kind of advertisement for OSX. As Engadget put it, "it seems the Apple folks plan to use it in much the same way they've used iTunes to grow the Mac fanbase by giving Windows users 'a glass of ice water to somebody in hell!'" In other words, Steve Jobs believes that PC users will try Safari and think, "This free browser is awesome -- now I'm going to spend $2,000 on a new computer to get other software that is presumably equally awesome." I find this hard to believe. Safari is a good browser, but it's not that much better than Firefox.

So what's Apple thinking?

My guess is that it has something to do with the new iPhone development standards that Jobs announced today. For those of you who don't follow this stuff as obsessively as I do: independent software developers (i.e. programmers who don't work for Apple) will be able to write programs for the iPhone, but those programs will be akin to "web apps" like Google Maps and Flickr -- they'll run in the iPhone's web browser, which (it so happens) is a version of Safari.

So what I'm thinking is this: there will be occasions when a developer wants to write a program that runs on both the iPhone and the desktop (e.g. a program that syncs data between your phone and your computer in some specialized way). For most purposes, the iPhone will integrate with your computer using iTunes, just like the iPod does. But these new iPhone programs can't run in iTunes, because iTunes doesn't run web apps.

If Apple wants to accommodate them, there are three choices: (a) build browser-type features into iTunes; (b) force developers to write apps that work on Firefox or Internet Explorer as well as Safari; (c) port Safari to Windows. Option (a) stretches the iTunes concept (already pretty elastic) past the breaking point. Option (b) would have worked for a while, mostly, but it risks sticking developers with compatibility issues going forward, which might have been a brake on iPhone software development. Option (c) allows Apple to build special features into this or future versions of Safari, just for developers of iPhone software.

So that's my guess: that the version of Safari on your computer will integrate with the version on your iPhone in some way. Time will tell.