Friday, April 13, 2007

At the risk of turning this blog into a running xkcd commentary interrupted by intermittent Borges book reports -- hey, actually, that would make a really cool blog.

Anyway, xkcd's latest webcomic depicts a guy in a hat (call him Hat) saying to the protagonist, "What if I had some ice cream? Wouldn't that be awesome?" The narrative then shifts to Hat's thought bubble, which contains a scene identical to the first, except that Hat now has ice cream and Protagonist is saying, "Great, you've trapped us in a hypothetical situation."

Let me explain what is going on here. Hat's idle thought does not magically transport him and Protagonist into that bubble. Hat proposes a hypothetical situation, which is equivalent to referring to one world out of the infinite variety of imaginary worlds. In particular, he is thinking of a world in which he has ice cream.

To be more precise, he is thinking of a world whose history is identical to the real world's, and whose present blesses Hat with ice cream. The Protagonist in the thought bubble remembers hearing Hat talk about ice cream, and now sees that Hat has obtained ice cream out of nowhere. He concludes that he is living in a hypothetical world, one where the law of conservation of mass is subordinate to the whims of Hat's appetite.

A pair of worlds with identical histories up to a point in time is equivalent to a single world that bifurcates into two timelines at that point. Hat's musings split the world in just this way: A real world in which he does not have ice cream, and a hypothetical world in which he does. Hat and Protagonist are likewise split into two copies each.

Of course, the real world and the hypothetical world do not interact. The hypothetical Protagonist only perceives a change in the world around him: From his point of view, reality has been altered, or, equivalently, he has become trapped in a hypothetical world.

This is essentially the problem faced by people who upload their minds into virtual reality simulators: A digital copy of the mind wakes up in the simulation and notes, with satisfaction, that he has been uploaded successfully. When the original biological copy wakes up, he is disappointed to find himself in his old body. (Unless the subject is being uploaded against his will, in which case the biological copy is the fortunate one.) If you really want to live the rest of your life in a computer, you have to arrange for your physical self to be killed once a digital copy of you is made.

Back to the comic strip: The copy of the Protagonist that the narrative follows is doomed to live trapped in a hypothetical world. Or is he? His solution is to summon his own hypothetical world, in which there exists a knife that can cut a path into other worlds, like Philip Pullman's subtle knife. This artifact makes its way into the hand of the Protagonist who summoned it, and then conveys him back to the reality he remembers. Except now there are three Protagonists in the real world: The real Protagonist, the hypothetical Protagonist, and the hypothetical Protagonist's hypothetical Protagonist. If this doesn't make sense, read the cartoon.

The idea of an object or person that can travel from imaginary worlds to the worlds that imagine them is deliciously paradoxical, and is explored in a short story by Borges that I'll discuss here someday soon. The magic in this cartoon, however, comes from the narrative, which makes a surprise departure from a boring situation and leads us through an absurd network of alternate realities.

Wednesday, April 11, 2007

Failure

When you play Super Smash Bros. Melee in single-player mode, the result of a match is immediately announced by the voice of the Announcer. If you win, he enthusiastically proclaims "Success!" If you lose, he declares

Failure

in a sneering voice so dripping with censure, so out of place in a lighthearted video game, that I find it funny. But now it's the loudest admonition echoing in my head.

Yes, I'm dealing with failure today.

I'll just say that I'm glad I'm not going into bioinformatics, because this would be an inauspicious way to start.