The Laboratorium
November 2000

Meanwhile, Back at the Ranch

Without giving too much away, if you're following Lotvs-Eaters at all, this would be a good time to be watching closely (and don't forget the message board). [If "Lotvs-Eaters" doesn't ring a bell, see here for an explanation.]

Let's just say that things are going to be happening a little more quickly from here on out . . .

This would also be an excellent time to tell all your friends about Lotvs-Eaters: it's an exciting time to jump into the story.

Ben is turning into an odd sort of stand-in for me, although not through any conscious intent on my part. It's just that the things he says seem oddly prescient when I apply them to my own existence in exile. The true connections are always somewhere else, it seems. 30'11'00

We Gather Together

I've put some of my writings on the esthetics of online media on the rants page, including some of the Medianstrip stuff. Knock yourself out.

Ninety-Nine and Forty-Four Percent Connected

Slashdot was all abuzz today with its usual chaotic self-congratulatory fervor on the news from two groups of mathematicians that the Internet could survive the removal of some very large fraction of its hosts and still remain generally connected. Which says, I think, not that the Internet really is hyper-connected and super-redundant so much as that the overwhelming majority of sites on the Internet just don't matter and could disappear tomorrow without hurting anyone's connection but their own.
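The intuition can be sketched in a few lines of Python. This is a crude toy model of my own devising, not the mathematicians' actual one: a handful of densely interlinked hub sites, a horde of leaf sites each hanging off one hub, and a coin flip for every site's survival. The hub core shrugs off the massacre; the only sites cut off are leaves whose loss hurts no one but themselves.

```python
import random
from collections import defaultdict, deque

def largest_component(adj, alive):
    # BFS over surviving nodes; returns the size of the biggest connected piece.
    seen, best = set(), 0
    for start in alive:
        if start in seen:
            continue
        queue = deque([start])
        seen.add(start)
        size = 0
        while queue:
            node = queue.popleft()
            size += 1
            for nbr in adj[node]:
                if nbr in alive and nbr not in seen:
                    seen.add(nbr)
                    queue.append(nbr)
        best = max(best, size)
    return best

random.seed(0)
# Toy "Internet": 20 heavily interlinked hubs plus 2000 leaf sites,
# each leaf linked to a single random hub. (Numbers are pure invention.)
adj = defaultdict(set)
hubs = list(range(20))
for a in hubs:
    for b in hubs:
        if a < b:
            adj[a].add(b)
            adj[b].add(a)
for leaf in range(20, 2020):
    hub = random.choice(hubs)
    adj[leaf].add(hub)
    adj[hub].add(leaf)

# Knock out roughly half the sites at random and see what still hangs together.
alive = {n for n in range(2020) if random.random() < 0.5}
survivors = largest_component(adj, alive)
print(survivors)
```

Most of the surviving sites remain in one giant component even after the purge, because the nodes doing the connecting are the few hubs, not the many leaves.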

Messages are like sites, too: some matter and some don't. This one isn't in that magical 1%. 29'11'00

Worst Idea Ever

An Australian company has launched a new service to replace URLs with numbers. Their claim, made in brazen defiance of both history and common sense, to be "first in the world to introduce a fully operational Internet number addressing system" is kind of mind-blowing. Apparently, nobody has ever explained to the Australians about IP addresses.

Not only is making the Internet run on numbers an old idea, it's a spectacularly bad one. URLs made the Internet take off like a moon rocket because they made it possible to start out looking for content in a way surprisingly close to actual human language. In a huge shocker of a development, you could type in something close to "necco" into your computer and have a chance of getting back information about the New England Candy Company. URLs have technical meaning, sure, but they're comparatively transparent to human inspection. URLs are cuddly and fuzzy, unlike those horribly cold and mechanical strings of digits you need to cough up when you want to direct-dial someone's extension at a company in a foreign country. Apparently nobody has ever explained to the Australians about putting your venture capital money in companies that might do something useful or make money.

I should note that Nascomms' religious fervor about the new number revolution doesn't extend to their own Internet presence. Look at the contacts page on their site: six mailto URLs and two http URLs, but no numbers, phone or otherwise.

What Might Have Been

This article has been popping up a lot today, and no wonder: could it be the case that there are "tetrachromats" walking around among us, people who see not three primary colors but four? A particular combination of genetic anomalies might mean that certain women would have four distinct cone photopigments in their retinas: the usual red, blue and green ones plus a special bonus one attuned to a color somewhere between red and green: a nice vibrant yellow, perhaps.

Would it be yellow, though, or something else entirely? The subjective experience of color is one of the classic imponderable philosophical puzzlers, a sort of training-wheels stand-in for the puzzle of consciousness itself. Our terms for color can be assigned rigorous interpretations based on wavelengths of light, but the experience of the colorblind throws a monkey wrench into any such scheme: they tell you that "pink" and "blue" to them have always seemed like two words for the same color, but that "pink" and "red" are perfectly distinguishable. In the face of such considerations, the mind starts to boggle. What on earth do we mean when we say that two colors are "close?" When you see the blue of the sky, is your experience of that "blue" anything like my experience of it, or have we managed to slap the same word on two totally different internal sensations? Does a tetrachromat see the familiar bands of the rainbow, or does their personal rainbow take a detour through octarine in the zone between "red" and "green?" Could the trichromat mind ever properly wrap itself around an "additional" color? Or is the human mind so hard-wired for tricolor vision that it sticks its fingers in its ears and says "nyah nyah nyah" when its octarine receptors fire? [Let me interject here a recommendation for Raymond Smullyan's sadly out-of-print 5000 B.C., the book that first made me realize philosophically how disturbing the very existence of a sense of color is.]

In a way, I'm disappointed that the extra receptor fits into an already-covered part of the visual spectrum. We can see roughly from red, up at 700nm, down to violet, at 400nm. This is slightly less than one "octave," so that, in theory, the standard representation of a color wheel ought to have a gap between red and purple. In practice, our brain kinda sorta fills it in. But, my goodness, what if we could see colors an octave apart? All it would take would be another 100nm into the infrared or the ultraviolet -- four equally spaced receptors instead of three, a 400nm range instead of 300nm -- and we'd be able to do with our eyes what we do all the time with our ears: perceive overtones of light. The amazing harmonic richness of music, that whole unfathomable dance of consonance and dissonance, starts from our perceptions of overlap out beyond the first octave. Our visual cortex is wired up to perceive only melody, but if we could see an octave, would we say that a color harmonizes with its doubled-wavelength companion?
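For the skeptical, the arithmetic behind the octave claim checks out (using the same rough wavelength figures as above):

```python
# Visible light runs from roughly 700nm (red) down to 400nm (violet).
ratio = 700 / 400
print(ratio)       # 1.75 -- a bit shy of the 2.0 ratio of a full octave

# Stretch the long end another 100nm into the infrared and the range doubles:
print(800 / 400)   # 2.0 -- a full octave, so the shortest visible
                   # wavelength would be the "overtone" of the longest
```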

Human sensory input, in fact, gives rise to all sorts of astonishing what-if scenarios, not all of them entirely implausible. What if our sense of smell was as systematic as our sense of touch, sight, or hearing, so that we could smell a scent and know that if you kept going from cumin towards cardamom you'd get allspice but with less pepper? What if our eyes weren't designed to be large water-filled sacs, so that our visual range wasn't required to be roughly the set of wavelengths neither wholly scattered nor wholly absorbed by water? What if taste and smell weren't linked so strongly to one another? What if we could echolocate? What if we didn't have proprioception (internal sensors that tell us whether our limbs are flexed or extended)? What if our sense of spatial orientation wasn't acceleration-sensitive?

From my perspective, the biggest scientific story of last year was the computer visualization, based on actual neural outputs, of what a cat sees. [Of course, we don't really know what the cat itself "sees," any more than we know what "blue" looks like to someone else, but still, it's quite an achievement.] This is a meaningful bio-electronic interface, and a pretty dramatic one. Combine this result with the one in which electrodes implanted in a monkey's brain allowed it to control a robotic arm 600 miles away, and you've got a pretty convincing case that the output half of the cyborg equation can be made to work. Which leaves only the input half: what would it take to let our brains "natively" process information not directly provided through one of the five traditional senses? Sure, you can project a heads-up display directly on the retina, but that's still overloading the sensory role of vision, tapping into the same limited bandwidth along the optic nerve. The nervous system uses all sorts of cool but usually lossy compression techniques to send image data from retina to brain, and I have a sense that there's only so much visual information anyone can process.

No (and, in case it's not clear already, I'm jumping off into one especially huge what-if right now): you're going to need to plug directly into the grey matter. But that gets you into plasticity trouble: if all sorts of seemingly random neural input started rolling into your brain, would you suddenly be able to piece it together into a smooth map of the electromagnetic fields in your area? I kind of doubt it: the sudden addition of a new modality of input into a portion of the brain already in use for some other purpose would either completely muck up your normal processing or be shrugged off as noise. I have this sense that if cybernetic input implants are ever going to work, they'll need to be wired up to the brains of the very young. You go about your day-to-day life for five years getting constant GPS information and your brain will learn how to interpret that information, will have this strong but oddly inscrutable sense of position. Ask a kid with a GPS unit what it feels like to have one and he won't be able to explain: it'll have become native to him in the same way that touch is native to all of us. So in the end, this particular thought experiment doesn't really tell us much, I fear: it's one crazy might-have-been to imagine what life with certain bio-chips would be like, but the actual experience would nonetheless be so eerily different that such speculation is more a task for artists and poets than for scientists.

I had a couple of conversations last year, right after the news about the cat, following out the societal implications of this particular theory I've advanced about the limits of mental implant technology. For one thing, if you take the plasticity argument seriously, anyone with an implant would be stuck with the technology level current at the time the chip was implanted. Once it and the brain have agreed on a common language, there's no way to pop the chip and upgrade the API. Given little trends like Moore's Law, the implanted would be carrying around amazingly obsolete pieces of tech in their heads, which makes for something of an interesting twist on steampunk fiction. Most science fiction I've come across that takes the cyborg possibility seriously either assumes infinite adaptability -- pop out one chip and pop in another and poof, you know kung fu -- or a glacial rate of technological advance -- you don't see Lando trading in Lobot for a newer model every eighteen months. It would be interesting to see some speculative sci-fi that follows this more neurobiologically-influenced way of looking at the prospect. The built-in limitations of the approach are rich with dramatic potential, I think: wars fought between modern-tech humans and electronically-enhanced but out-tech'ed cyborgs, societies where parents must make enormous gambles on picking the right electronic leg up to give their children, strange Y2K scenarios striking outmoded firmware sitting in people's brains.

And now for something completely different, as long as we're playing what-if. In the moon's motion around the Earth, the same side always faces towards us. Centered at 94 degrees west longitude and 12 degrees south latitude (in the obvious lunar coordinate system, given the preceding fact) is the Mare Orientale crater, about 700 miles across (the moon itself is about three times that in diameter), very nearly circular, and ringed by a huge bulls-eye from the impact. That is to say, one of the sides of the moon we never see (because it's rotated 90 degrees away from us) bears a very strong resemblance to an eyeball.

The mythology of the moon is peaceful and soothing. The Man in the Moon sprinkles moondust on nighttime dreamers: the moon is the calm protector of the natural and the feminine. Sure, the sun is an angry sky-god who burns with a fierce flame, but the moon brings healing and safety. Well, the friend who told this tidbit to me asked, how would human history have been different if every night the sun went down and the eyeball came up? How much worse would our fear of things that go bump in the night be if everything we did at night was seen by the great implacable eye in the sky? What paranoia would stalk our dreams, what fearsome offerings would we have given up to the unforgiving all-seeing eye, what ingrained terrors far above and beyond anything Jung ever imagined would be seared into the consciousness of humanity?

It's in the interstices that the might-have-beens live, and in these gaps you find also the clues to what still might be. 28'11'00

Sell, Sell, Sell!

Neal Stephenson is republishing his 1984 debut novel, The Big U. Stephenson's cult status among geeks and the book's seriously-out-of-print status have combined to push prices for a copy of the original cheap paperback edition up into the hundreds of dollars. One fellow on Amazon's zShops is asking $650. I'd have to say that it looks as though the market for copies of The Big U is about to tank in a way that'll make NASDAQ stocks look like good long-term investments. If you've got a copy and you're ever planning to get rid of it, I'd advise selling now before news of the republication spreads much further.

Of course, if you don't have a copy and can't wait until February, there's always good old-fashioned Internet samizdat.

Children’s Television Gone Horribly Wrong

I just discovered that Aardman Animations (note the British usage of the plural) has put a veritable treasure trove of their short films online. It's your duty to watch Pib and Pog right now (free, registration required). There's something about a British accent that turns slapstick into highly dignified entertainment.

You’re So Technical

I'd like to encourage you not to bookmark this page as "notebook.html". I've changed the default resource on the site so that navigating to it brings up the latest edition. If you remember the old "you really want to be looking at notebook.html" page with the annoyingly slow auto-redirect, it's gone now.

Underwater Excess

Neiman Marcus is offering a 118-foot luxury submarine. Please, please, tell me that this is a joke (that low, low price of $0.00 gives me some hope). By way of comparison, the drug-smuggling sub being built in the Colombian Andes was only 100 feet (although it probably wasn't going to be used for transatlantic crossings, the way the Neiman Marcus sub can be).

Stopping Superpower

A frightening article in this week's New York Times magazine section (called to my attention by Dave Krinsky) discusses the easy availability of .50-caliber rifles. Although a .50 can hurl a six-inch bullet several miles with enough momentum to go through a three-inch-thick manhole cover, you can get one with no more fuss than for any other rifle purchase, and once you have it, no governmental records are kept to track it.

I don't go in much for teleology, but this is a case where the argument from design holds together, I think. The telos of a firearm is to shoot at things, and the .50 distinguishes itself by being useful only for shooting at big, well-armored, and slow-moving things. The telos of a .50-caliber rifle is to fight against an army, one with tanks and fortifications. At 37 pounds and 54 inches, it's actually not much of a weapon of mass violence or indiscriminate terror. You don't pull out the .50 in a domestic dispute, you don't take your .50 when you go shoot up an office building. You need your .50 when they come for you with the black helicopters and the armored personnel carriers.

The .50-caliber rifle puts Second Amendment issues in a particularly harsh and merciless light, I think. The impracticality of the .50 means that it is useful only for those purposes firearms-rights advocates constantly trumpet: to secure the bearer against an invading and well-equipped force (of the sort that only governments and major mercenary organizations can muster), or for the pure recreational thrill of going out in the desert and shooting up rocks and old junk. It's a bit ironic that the most powerful weapon available today is the one most immune to standard gun-control arguments, but any debate over the legality and uses of the .50-caliber rifle really does come down to the question of whether you think individual citizens should have the firepower to effectively and violently resist the government.

That is to say, having thought a bit about the .50 and its dangers, I've realized that I am a genuine anti-purist when it comes to the Second Amendment. I don't want .50-caliber rifles in the hands of Joe Public, because I really do sleep better at night knowing that Joe Public can't take his county in Idaho out from under the jurisdiction of the United States of America. This is a measure of patriotism for me: I trust this country, in the large, and I trust it and its institutions not to engage in wholesale oppression of the sort that is the subject of paranoid fantasies. Several hundred years of history have told me that in a conflict between the Feds and the locals, it's usually the Feds who are in the right. There are two good reasons why the US didn't devolve into the "Election Mayhem" described by The Onion. First, we have a well-established political process and a well-established national culture of peaceful transfers of power and legal resolution of disputes. And second, when the National Guard gets called out, order gets restored -- and people know it, which is one major reason why the National Guard doesn't get called out all that often.

Put it this way: I'm not afraid of .50-caliber rifles; I'm afraid of what the people who own them might do with them. Guns don't stage heavily-armed insurrections, people do.

Shouted at Me From a Moving Car on Broadway

". . . love you, but everyone else thinks you're an asshole!"

Yes, sorting will give you an answer in time N log N, but can you find the missing number in linear time? 27'11'00
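In the spirit of the puzzle, here's one linear-time answer (the classic sum trick; other O(N) approaches, such as XORing everything together, work just as well):

```python
def missing_number(nums, n):
    # nums holds the integers 1..n with exactly one missing.
    # Expected sum of 1..n is n*(n+1)/2; subtracting the actual sum
    # recovers the missing value in O(N) time and O(1) extra space.
    return n * (n + 1) // 2 - sum(nums)

print(missing_number([5, 3, 1, 2], 5))  # 4
```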

Further Updation

I've pulled out some of my rambling and long-winded stories about the oddly funny disasters that have befallen me at one time or another and put them in their own archive page, now also accessible through your favorite sidebar.

Damn, Claudius makes it easy to create topical archives. A tip of the hat to Adam Mathes, whose design for Organizine (still in beta as of this writing) gave me a couple of the key insights behind Claudius.

More specifically (non-geeks can stop reading here, and jump straight to the next entry, which contains some amusing profanity, or the one after that, which contains some fun left-wing ranting), I've borrowed from Organizine's notion of attributes the idea that the metadata should reside with the data, rather than with the indices. Thus, that descriptive text you see on an archive summary page lives on the pages that it links to: the summary page is actually generated from a file that consists of nothing more interesting than what is effectively a sequence of #include directives. This is the other, related notion I got from the Organizine way of doing things: the data inherent to an archive page can be factored into an index -- which pages is it an archive of? -- and the formatting -- what do you want to do with the data from each of those pages? It sounds obvious to me now, but it wasn't at all obvious until I saw how Organizine handled such things and how clearly right these particular design choices were. This is about the highest praise I can think of for the intellectual elegance of a software system: its design seems obvious, but only in hindsight.

I've chosen, in setting up Claudius, to depart from the Organizine philosophy in some ways. My site here is pretty strongly-typed: I don't actually validate my XML against a schema, but in theory I could, and, more to the point, I've got a multi-layer SECTION/ITEM hierarchy which gives me more structure to manipulate when laying out pages, rather than just a bag-of-tags. Conversely, I don't have separate notions of "data" and "archive." An archive page consists of a bunch of <DOCLINK> tags where you'd expect to find <ITEM> tags containing actual data, but there's nothing to prevent me from mixing <DOCLINK>s and <ITEM>s in a single page (which I will start doing fairly soon, in fact). Again, more to the point, making archives first-class objects means that I can use the same stylesheet for everything, in keeping with my XSLT-all-the-way philosophy.
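A hypothetical sketch of what such an archive page might look like (the SECTION, ITEM, and DOCLINK tag names come from the description above; the attributes, file names, and content are my invention for illustration):

```xml
<SECTION TITLE="Disasters">
  <!-- Archive entries: each DOCLINK pulls its descriptive text
       from the page it points to, not from this index file. -->
  <DOCLINK HREF="disaster-airport.html"/>
  <DOCLINK HREF="disaster-moving-day.html"/>
  <!-- Nothing prevents mixing in an inline entry alongside the links: -->
  <ITEM>
    <TITLE>A Fresh Disaster</TITLE>
    <BODY>New content can sit right next to the archive links.</BODY>
  </ITEM>
</SECTION>
```

Because the archive page is a first-class document of the same shape as any other, the one XSLT stylesheet handles both.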

All of these calls fit in with my personal connection to the Lab: I'm comfortable hand-authoring XML, writing my own XSL stylesheets, and dealing with the tedium of remembering which pages are dirty and need to be reloaded onto the server. All in all, Claudius is probably not a content-management scheme for the masses. Or for anyone besides me, I suppose, because once you're in a position to understand how it works, you also understand how little there is to it, and you say "I could do that!" And off you go and do it, your own way, taking some of my thoughts and tossing out others to replace them with features more closely suiting your needs. And then you have a site of your own with a content-management system that you baby and babble about just a little too much, and we all win.

Massive Cleanup

Whew. Brought the archives fully back online. Big whoop.

Yes, but having the archives in the new format gives me the chance to start doing redesign and reorganization work again. For example, now that I've settled on this two-level "section/item" schema for my XML, I can consistently drop day-by-day headers into the archives. Plus, the new format is amenable to proper archiving, so I can actually pull my writings on a given topic together properly. I've only rolled out a couple of sections, but if you'll direct your attention over to the sidebar, you'll see a "Position Papers" link, which takes you to my new writings index page, conveniently sorted by subject. No new content, yes, I know, but the old stuff is a lot easier to get at now, which I think counts for half a point.

Do let me know about broken links if you stumble across any. Thanks.

"We seek him here, we seek him there, Those Frenchies seek him everywhere. Is he in heaven--is he in hell? That damned elusive Pimpernel?" 26'11'00

When Worlds Collide, pt. IV

Makeup, like comments, can hide only so much. 25'11'00

When Worlds Collide, pt. III

You just don't get it. Use your head. 24'11'00

Technology and the Long View

I've written some musings on humanity and the Internet as part of a discussion on the website of the show Closer to Truth, produced for public television by KCET. Perhaps nothing especially insightful, or even consistent, but I had fun writing it, even when a cache-expiry problem ate my first draft.

These aren't the droids you're looking for. 23'11'00

Presidential Ponderings

I think this drawn-out election is having the same effect on me that it seems to be having on the Gore and Bush camps: the more it goes on, the worse a person I'm becoming. When the Miami-Dade canvassing board voted to stop the recount, my first thought was "I wonder who got to them?"

Then, today, when it turned out that Cheney's chest pains really were heart attack number four, it struck me that according to the Democrats' preferred rules for chad, we'd have to classify Cheney as "dead."

I am continuing to be very impressed by our nation's court system. Every single decision handed down by a judge in the many election-related cases of the last two weeks has been, in my non-expert opinion, entirely correct. The lower courts have avoided interfering with counties' decisions, the federal courts have stayed out of state-level decisions, and the Florida Supreme Court took a very reasonable position on both the meaning and the purpose of election statutes. All is not entirely lost.

I'm still looking for the right metaphor to talk about the Gore/Bush fight. The feeling I get now is that their antipathy for each other is a hatred on such a titanic scale that the election itself was insufficient to work it out. Even when it seemed their grudge match would have to finally be settled, they found a way to prolong it and strap the American people into chairs with eyelids taped open to watch them tear hunks of flesh from each other. While I recognize that it's been wholly in Gore's tactical interest to appear magnanimous and in Bush's to be snippy, I do actually like Gore more and Bush less as a result of these last few weeks.

In other election news, two weeks of sifting through absentee ballots have given us a new senator here in Washington: it looks like Maria Cantwell has toppled Slade Gorton in a .08% squeaker (pending the obligatory recount). Interestingly enough, .08% is also the legal threshold for intoxication in this state, as the big yellow signs on the roadside every few miles like to remind us (although I do have a friend who botched that question on his driver's test). Total coincidence, I'm sure. Cantwell's victory has inspired me to deliver the following message to one particular co-worker: in your face, rich boy!

Sadly, I don't think he reads this page, and neither does Gridlock Guy, who I'm sure is quite thrilled at the prospect of a 50-50 Senate.

Close, but no cigar. Look down. And, more importantly, look up. 22'11'00

Pop-Culture Paradox

Douglas Coupland likes Dave Eggers' book.

Dave Eggers kinda likes Tom Frank's book.

Tom Frank does not like Douglas Coupland's book.

Pay no attention to the man behind the curtain. 21'11'00

One Medium at a Time

The Amazing Adventures of Kavalier and Clay is, in large part, a novel about comic books, the classic superhero titles from the Golden Age, half a century ago. The novel reflects well on comics; you finish reading it with a newfound admiration for the formal techniques of the comic artist, with a sense of what remarkable things the comic medium can support.

Then you remember that you've just finished reading a novel, not a comic book, and that everything you've learned and "seen" about comics has been conveyed through words and words alone. And this is why I think that The Amazing Adventures of Kavalier and Clay is also a novel that increases your respect for the possibilities inherent in the novel.

This is just one of Joe Kavalier's little bits of artistry, a false pass to hold your attention for a moment of distraction. 20'11'00

Weblogs As Art

There is a man in Oregon, retired now, who spends all his time at his typewriter, recording the minutiae of his life, committing to paper every detail of his days. He is a figure for our age, this man for whom existence has given way to the existential act of recording. This man is the Lucifer of webloggers, the ideal of fallen perfection against whose temptations the medium's authors steel themselves.

A personal weblog is a lie, comprising a promise and a betrayal. The weblog promises access, a direct line to the personal experience of the writer; the weblog betrays that access, severs the connection somewhere along that fragile line.

Some weblogs edit: the things they speak of are the exterior things, the public events. The writer walks invisibly through these landscapes without mirrors; the narrative steps gingerly around anything that matters. Others conceal: names are removed, identifying details blur, the points of contact with reality slip and are sundered. In these weblogs, there is a direct line to a psychic inner space, but precisely whose inner space it is has come somehow unfixed.

The self needs at least a little space in which to live, a few shadows that exist apart from the world's sunlit gaze. Scratch out the names and identifying details and what remains is fit for public consumption, lacking that crucial last link from weblog to reality. Leave in the details but leave out the context around them: this, too, may be shared with the world, as recognizably real people are being described, but not in a way that lays them bare for your inspection. Allow them both, and you may be sure that still the author has slipped the net, that what is written about is somehow only that which misses the point. Or perhaps the whole is pure invention, and only mimes the gestures of true confession.

An absolutely honest personal weblog is a contradiction in terms: if that promised access were to be achieved, in the instant of its realization it would eradicate anything recognizably "personal" about it. This is the strange secret of our Oregonian diarist: he has achieved the impersonal, he has flattened out his existence in the process of telling it to the point where it has ceased to be the experience of a person and has become something else entirely.

The gift of friendship is the gift of access, the backstage pass that takes you to places about which most people can only speculate. Intimacy is created not by honesty, but by secrecy, in the carving out of a shared psychic space whose nature is kept hidden from the world. A weblog is an inviting-in, a gesture of friendship's secrets extended to the reader, but also it contains a shutting-out, a closed door behind which the next entry's secrets are being created.

The autobiographer has a pre-prepared escape: the delay between penning and publication itself is enough. Identity shifts with time; a map of the psyche becomes more unreliable with every moment that passes from the instant the cartographer sets down the pen. A weblog does not so much progress as unfold: its immediacy and its continual extension combine to remove this wiggle room. The obstruction, the obfuscation, the deception: these must be written in, since they don't just happen.

A weblog is a form of performance; it recapitulates the dramatic journey of the play. The audience is excluded from the action, from participation in the events they see enacted before them. Weblogs do not speak to each other; readers and other webloggers enter into a weblog in the third person, as part of the great undifferentiated out there into which all links point. In the theatre, the lights come up on the audience at the end of the performance, as they are invited, through their applause, to bear witness to what they have seen, to indicate their acceptance of events as the actors have presented them.

What is important in the personal weblog, too, is this bearing witness: the writer has crafted a story of self and given it to the reader; the reader's role is to receive this story and to validate it. Even the most "unedited" and "direct" of weblogs operates according to some strategy of editing, some technique for the arrangement of events that contains its particular promise and particular betrayal. It is this choice of strategy that defines a weblog's style, that draws the reader in to become a willing conspirator. The weblog is a technique of self-knowledge, mediated by the reader.

Every entry in a weblog has a twinned significance: one meaning for the author and one for the author's imagined readership. What the reader sees is not necessarily what the author gains from that seeing. For poetic and other obscure weblogs, this observation is self-evident, but it holds true even for weblogs that appear more transparent. No entry in a personal weblog has ever fully explained why it was posted: no entry ever can, or will. The writer retains the memory of the betrayal built into every entry: this memory deepens the meaning, complicates and enriches the bit of self that has been poured out through the entry.

A personal weblog is a series of still images placed atop one another like a flipbook. The spaces between entries remain dark, penetrable only by conjecture. String them together, play them in order, steep them in the hidden themes of memory and mood that connect successive frames, and there you have a life: a stuttering animation whose full fluidity is visible to one person and one person only. The act of arrangement is the act of narration, the transformation of an impersonal happening into a personal story. The weblog as a medium is defined by this paradox: a public act produces a private narrative.

So, then: anything truly personal in a weblog is kept under wraps by the writer; the text carries the whiff of the personal, but remains careful not to actually reveal it. The contents of a weblog are deliberately arranged by the author, no matter how natural the result may appear. What results is a work of monumental self-contemplating ego, one that the reader can neither fully comprehend nor fully participate in. And yet, the reader still participates, still willingly submits to the promise's betrayal, still sifts eagerly through these leftover scraps from the author's table of existence. Why?

Because the weblog is art, and such things Art demands.

Pioneer Square Vignette

As I was fishing some change out of my wallet for a pair of panhandlers, one of them said to me with a sharp exhalation, "You an engineer!"

A T-Shirt That Ought To Be

Don't Blame Me: I thought I was voting for Gore.

Interdisciplinary Discoveries

This article (sadly, Springer isn't making the full text available to Joe Public, this link is to an abstract only) brings together Arrow's Impossibility Theorem and Turing's notion of computability. In it, H. Reiju Mihara closes off one of the possible escape hatches from Arrow's Theorem on the impossibility of finding a voting system that is both fair and rational. The requirement that the voting system be computable, it turns out, nullifies certain attempts to wiggle out of the rationality requirement.

On the other hand, as cool as it is, this result only applies to societies with an infinite number of people in them, which does tend to limit the practical importance of the work.
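The flavor of trouble Arrow's Theorem generalizes can be felt in miniature with the classic Condorcet cycle, where honest pairwise majority voting produces a preference loop with no rational ordering. A quick sketch (my own illustration, not anything from the paper):

```python
# Classic Condorcet cycle: pairwise majority rule yields an
# irrational (cyclic) social preference. Illustration only.

def majority_prefers(x, y, profiles):
    """True if a strict majority of voters ranks x above y."""
    wins = sum(1 for ranking in profiles if ranking.index(x) < ranking.index(y))
    return wins > len(profiles) / 2

# Three voters, three candidates, rotated rankings.
voters = [
    ["A", "B", "C"],
    ["B", "C", "A"],
    ["C", "A", "B"],
]

print(majority_prefers("A", "B", voters))  # True
print(majority_prefers("B", "C", voters))  # True
print(majority_prefers("C", "A", voters))  # True: a cycle, so no winner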

The Lotvs-Eaters Launch

Here at the Lab, we're proud to present In the Land of the Lotvs Eaters, a serialized story, taking as its medium the weblog.

Meet Ben Duggan. Ben's living proof of the rule that you don't have to have something to say to have a weblog. Ben himself would probably call it "just barely living proof." He's in a bit of a rut right now; he's more than a little disillusioned with the narrative of his life. He likes Saturday-morning cartoons; he hates people. He's not really a bad person, just a little, well, bitter. Ben has a worse-than-dead-end job, a noodge for a roommate, a useless college degree, no love life to speak of, a Playstation, a sarcastic streak a mile wide, and a weblog. And it's on that weblog, located at, that Ben's story, narrated in his inimitable style, is about to unfold.

What's over there right now is prologue: some background to get you into the swing of Ben's life, fill in a few of the key phrases in Ben's cynical argot, and generally introduce you to Mr. Duggan. The plot, such as it is, will kick in a bit later on this week, and will run for about a month, in fairly frequent installments -- basically, whenever Ben feels like yammering about himself for a bit.

Ben is, within the confines of his invented reality, a "real" person. I have a plot outline in mind, but I'm not writing ahead: every day, Ben drags himself out of bed and says whatever's on his mind, whether it's a rave review of Legend of Drunken Master or the way he completely wasted the weekend. He's got an email address, and he's got a message board, where he and his roommate make fun of each other and their guests. You're welcome to participate in the discussion, but please respect the narrative boundaries of the story. This story is not about being "meta." Well, at least not in that particular way . . .

As previously hinted, the Lab will be in power-save mode for the duration of the Lotvs-Eaters run. I've got a good feeling about this one: I've got some ideas I've been juggling for a while, and Ben's so-called adventures -- the voyages of Vlysses, if you will (hiss, groan)-- feel like the perfect chance to really develop those ideas and do something interesting with them. Please give Lotvs-Eaters a look; I hope you enjoy reading it as much as I'm enjoying writing it.

As one final advertisement for and introduction to the story, I'd like to reprint here something Ben wrote:

Twenty Rules for Life

  1. Your attentions are inherently offensive to others.
  2. They are not talking about you, not ever.
  3. Remember to major in something fun, because it is impossible to major in anything useful.
  4. Never make phone calls, because you might have to leave a message on an answering machine that doesn't let you try again until you don't sound like a stuttering idiot.
  5. Inability to work up courage is your brain's way of telling you that you don't really want to do it.
  6. Decide up front whether you'd rather be invisible or humiliated, so you know what to expect.
  7. The danger of setting low expectations for yourself is that you feel even more pathetic when you fail to meet them.
  8. Being smart creates more problems than it solves.
  9. Look forward to growing old, because when you become senile you will finally be able to forget all those painful memories you wish you could forget but can't.
  10. Job satisfaction is for the rich.
  11. Someday you will meet someone, and you will just know, and you will be wrong.
  12. That bitterness you feel is justified.
  13. Being able to articulate what the problem is won't make it go away.
  14. It feels good to be needed; it's a good idea to have a few really needy friends.
  15. Somewhere out there is a planet where Playstation is a marketable skill; your task is to find this planet.
  16. There is no memory so good that it cannot be poisoned by later events.
  17. Don't get uppity.
  18. Just because you can't live up to your principles doesn't make them bad ones.
  19. Really good friends will lie to you if they think it will make you feel better.
  20. Your computer does not love you back.

And oh, yes, one last warning. Loyal Lab readers will undoubtedly know that, at times, the content of this site has been a bit unsuitable for the young'uns. Ben's web page is the same way, only he has a potty-mouth, too. Leave the tykes behind when you surf on over.

Season of Star-Bellied Sneetches

People turn scary during elections. There are the ones who vote their pocketbooks, and proudly. There are the spiteful ones, the would-be agents of vengeance against their political betrayers. And then there are the gridlock fans, the ones who think that government is best which governs most paralytically.

It's time to lay things on the table, time to start asking the big questions again. I don't have answers, I just know that there are certain doors I need to thrust my foot in while I think things over. There are possibilities that must be kept alive, other ones that should perhaps be resuscitated.

I'm living like a magazine; my life has a lead time to it now. Turn the wheel to port, and she'll heel round eventually. The time horizon is coming back, I'm projecting outwards and forwards again. I think it happened without my knowledge, but I have that sense again, the psychological contiguity between the me of today and the me a month hence.

I know some perfectly wonderful people, but if I'm going to turn into them, what am I doing here?

Things are pulling closer, too: the here and now has a growing urgency. There's more to do than I have time for; I'm aware of the importance of minutes. I've got things to say, plots in motion, something bubbling on every burner. That's the good china up in the air, gotta be extra-careful here. Lotvs-Eaters is about to hit the point of no return -- any day now, I promise -- and that will be that, but that shelf keeps on stretching, mocking me, and this election is inspiring other things, good grief, how to fit them in, too.

And you may find yourself living in a shotgun shack, and you may find yourself in another part of the world, and you may find yourself behind the wheel of a large automobile.

Making choices, that's where it's at. That's what I'm doing, really. I'm doing whatever I have to do to make sure I get the right chances to make choices. Day by day, hour by hour: that's how I reassure myself that things are all right. At this instant, I say, I am aware that I have other options than this particular one. I recognize what I am forgoing, and I make this choice of my own free will, so I affirm on the dotted line. That's what it's all about -- it's not about the choices, it's about being able to make them.

Did you ever have that feeling, that maybe somewhere out there there's a bookstore and it's calling your name? Well, what're you going to do about it, whitey?

Buzz. Buzz.

Counting Towards Infinity

I have nothing but sympathy for the election workers in Florida. I also seriously want a hand-recount. Both of these attitudes are strongly informed by personal experience. Let me explain.

Freshman year in college, I was one of those crazy types who was into student government. Don't ask me how I wound up that way; it was a mixture of running with a progressive crowd, youthful idiocy, and pure random happenstance. In any case, the time came, in mid-spring, to hold a campus-wide election for the President and Vice-President of the Undergraduate Council.

This was kind of a first. In the past, the Council had always elected its own officers; thanks to a bizarre set of steps and missteps, it was decided that the student body as a whole would elect the Council's presiding officers, but wouldn't invest them with any authority beyond that conferred by holding the gavel. In short, a recipe for post-election disaster. In this case, though, we got disaster as part of the elections, too.

One of the messes that had led to the public uprising that created popular elections had been a series of ballot-box-stuffing scandals that had beset previous Councils, including some especially embarrassing incidents in which Council members attempted to thwart an anti-UC ballot initiative by simply throwing out some of the ballots from the boxes they were responsible for (voting was, at that time, conducted in dining halls around campus).

In response, and partly to jump on this new "computer thingy" bandwagon, the UC had commissioned a voting program to hold the elections for its representatives. Written by a fairly zealous freshman, the program reused the UI from his successful sectioning program (used to sort students into the discussion sections of large lecture courses), stapled to a hastily-whipped-up back end that did the tabulation.

Unfortunately, when it came time for the campus-wide elections, we realized that the existing program wouldn't work. For one thing, it was wildly insecure, as in, it didn't even try. Section assignments, well, nobody really cares about the breach of privacy involved in finding out that Monday at 5 was your first choice, but Thursday at 4 was your second choice. In the massive scrutiny and debate surrounding these first-ever popular elections, though, everyone was all of a sudden expecting that this election be "confidential," and "fair," and "trustworthy."

Even more seriously, the program simply wasn't going to work for the campus-wide elections. To elect its representatives, the UC uses the Hare Proportional voting system, a complicated system for electing multiple delegates that requires rank-ordered preferences and periodically reweights every extant ballot. To elect President and Vice-President, the newly-amended UC Constitution specifically required the Single Transferable Vote system, a slightly less convoluted scheme that also uses rank-ordered preferences, but manhandles the ballots in a marginally different way. Oh, and also, the old program had no concept of multiple elections: in order to use it, we'd have had to require people to vote twice, using two completely separate processes.
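For a single office, the transferable-vote mechanics boil down to instant-runoff counting: tally first choices among surviving candidates, eliminate the last-place finisher, and let those ballots transfer to their next surviving preference. A minimal sketch of that loop (my own reconstruction of the general idea, emphatically not the UC's code):

```python
# Minimal single-winner STV (instant-runoff) sketch -- my own
# reconstruction of the general idea, not the actual UC software.
# Each ballot is a list of candidate names in descending preference.

def stv_winner(ballots):
    remaining = {c for b in ballots for c in b}
    while True:
        # Count each ballot for its highest-ranked surviving candidate.
        tallies = {c: 0 for c in remaining}
        for ballot in ballots:
            for choice in ballot:
                if choice in remaining:
                    tallies[choice] += 1
                    break
        total = sum(tallies.values())
        leader = max(tallies, key=tallies.get)
        if tallies[leader] * 2 > total or len(remaining) == 1:
            return leader
        # No majority yet: strike the last-place candidate and recount.
        remaining.discard(min(tallies, key=tallies.get))

ballots = [
    ["A", "B"], ["A", "C"], ["B", "C"], ["B", "C"], ["C", "B"],
]
print(stv_winner(ballots))  # "B": C is eliminated first and transfers to B
```

The real systems add wrinkles this sketch ignores (multiple seats, ballot reweighting, tie-breaking rules), but the eliminate-and-transfer loop is the part that made a generic sectioning program useless for the job.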

Rewriting was out of the question. The program's author wasn't interested in the job, and everyone else who looked at the code was afraid to touch it. There were sections in it that -- we are not making this up -- read like:

candidates[candidate]->candidate = candidate;

This is basically where my role in the whole sad story begins. Being young and naive, I'd volunteered to be on the Election Commission overseeing the voting. There were three of us from the UC, and three students at large, and it turned out that I was the only one who knew anything about computers. Suddenly, I was the point man, off to go talk to the Computer Society about commissioning a new voting program. Mostly I was a go-between in those first few days, reporting back on how much money they wanted and so forth. Amazingly, we convinced the UC to pony up the bucks: enough to bring the Computer Society safely away from the fiscal insolvency that was a recurring problem for it, and also to buy a computer that could be physically secured to serve as the vote-server. People would make secure connections to the box and vote in encrypted sessions. No member of the Election Commission would retain the entire password for the server; the machine would be physically located in a locked office owned by the University's Computer Services people, who were pretty generally acknowledged to be real tightasses about security issues.

And the program? Gleaming, state-of-the-art software! A scripting-controlled client app would negotiate a secure session, then hand over control to a graphical curses UI that would let you read position papers, respond to survey questions, and rank candidates in an arbitrary number of "elections." A back-end database, updated with the latest info from the Registrar, would confirm that each student could vote only once. And a shiny new modular C++ vote-counter would allow you to plug in your choice of algorithms before feeding back to you the beautifully-formatted tabulation of votes.

Such was the plan.

The first thing to go wrong was that, come twelve midnight on the first day of voting, the program simply refused to work. Nobody could log on to vote. Of course, there were candidates all over campus, trying to pull stupid student-government publicity stunts by being the first to vote. The crack technical team assembled, looked at stuff, discovered that yes, the server wasn't accepting voting logins even though the test-run had been successful. About half an hour later, someone figured out the problem -- file permissions on one of the election data files, I think -- and voting commenced for real. The Election Commission -- already sick of the election, thanks to the 2000 signatures we'd had to verify to certify candidates' nominations for office -- met, agreed to extend the tail end of the election, three days in the future, by half an hour, and went home.

The next day, while adjudicating a dispute between one candidate and his former best friend, who'd accused him of putting up campaign signs too close to a polling place (with electronic voting, this now meant any network-accessible computer), we learned that some people still couldn't vote. Ha ha, hee, it looked like the Registrar's database of eligible student voters wasn't so accurate. We went over, had them re-run the query, this time being more careful about including the international students. A quick patch of the database on the server, and all was well.

Until the day after, when it seemed that some students -- those just returned from leaves of absence, I think -- couldn't vote. Ho hum, we knew how to deal with this one. The Computer Society people went over to the Registrar's, got a disk with the re-refined query output. Someone on the Election Commission had to meet them at the computer room to let them in, so down I went. And I watched as the crack programming team proceeded to overwrite the list of candidates with the list of eligible voters.

Whoops! Slippery little devil, ain't it?

We pulled the network plug from the wall and considered the situation. No backup list of candidates, and people were going to start noticing, soon. We retyped the file by hand. We had no clue about the order of candidates, so people logging in after that point would see a differently sorted list, but since we'd gone with a random order for the first cut, we figured it didn't make much difference.

The day after that, when the election ended at 5:30 PM, we gathered around for what we hoped would be a mercifully quick process. Some of us, with particular partisan leanings, were somewhat nervous about the outcome, but we kept our thoughts to ourselves. The Election Commission was split between unspoken supporters of the two major tickets -- although by the end of the election, none of us liked any of the candidates (with the exception of the bullhorn guy and a doomed sixth-party candidate) very much. In any case, being seen as impartial about the results of the election was the least of our worries. First, we needed results.

And that counting program? Well, um, ah, it wasn't ready yet. The lead programmer was still debugging a bit, the president of the Computer Society assured us. But he'd be done within the hour. We should start the process of getting to the vote files. And this was when we discovered three major problems.

First, we couldn't get at the files. There was something in the security setup that blocked us from getting the raw file over the network cable onto the tallying machine.

Second, despite many repeated promises to the contrary, the election program let you quit without recording a vote in one or both of the elections. You had to go through a fairly involved procedure, but it was quite possible, and in fact, some people had managed to pull it off. This made some of the ballots invalid according to the file format we were using to store them. Oh, joy.

Third, there were more candidates listed in the ballots than there were in the election. When retyping the data after the overwrite, we'd misspelled one candidate's name, and dropped the middle initial from a few others. Result: separate vote tallies for "Al Gore" and "Albert Gore," as it were.

Oh, and yeah, thanks to someone's undoubtedly brilliant design insight, the ballots stored the voter's identifying information in cleartext in them. Which meant that it was unethical to actually look at any of the ballots. Oh, no, our high-minded early-in-the-night ethics told us we'd need to repair the damage without looking at the ballots.

My friends, grep is also my friend, and it should be your friend, too. While most of the computer people were off trying to help their compatriot debug, a friend whose part of the system had worked flawlessly and I started monkeying with various -A and -B arguments to grep, piped and filtered in various ways. I managed to extract a count of the number of ballots missing Presidential votes, or Vice-Presidential ones, or both. And then we did some global search-and-replaces to patch up the variations in candidates' names to a consistent state. As for getting the files from point A to point B, we emailed them, in the clear, and may God have mercy on our souls.

At which point, getting into the evening, the tallying program still wasn't ready. The candidates, some in high dudgeon, had started to come round to find out the source of the delay. They had celebration parties on hold because of us; the only response we could usefully give was to pull aside members of their campaigns and suggest that they use the party supply of alcohol to try and calm down their candidates.

We demanded a hard estimate of the program's readiness from the Computer Society. Eleven, they said. Okay, we said, and we made an announcement, by email and by one EC member's booming voice, that we'd announce the results at midnight. There was grumbling, there was muttering, I scarfed some dinner.

Come eleven, the program compiled, but segfaulted. Come eleven-thirty, it was still segfaulting left and right, and the programmer was poring through core dumps to figure out why. The Election Commission met again, agreed that we couldn't yet announce a winner, and agreed to not promise any more firm announcement times.

They talk about long nights of the soul. What followed was a long night of body, mind, and soul. We gave up on the counting program, although its author didn't. The president of the Computer Society, acting the part of the classic technical manager, stared at the screen blankly, tried tapping out a few things rather feebly, and then went off to go talk to the press. We called in the original vote-program's author. He looked at the situation and started coding a Perl script to compute the totals. Candidates screamed at us; we had the big people on the Election Commission deal with it. One member of the EC, looking miserable, curled up in a corner of the room and tried to sleep. Two others had to go board a bus for an off-campus event they were due at in the morning. We called the alternate, woke him from his sleep, and plodded onwards.

Towards morning, the magic Perl started producing results. The totals weren't obviously wrong -- they added up properly at the start, and the numbers moved in the right direction -- but we really couldn't tell for sure. The morning campus paper got delivered, at which point it became clear that the folks in the Computer Society had sold us down the river. Our carefully neutral statements, when mixed with their blaming us for everything that had gone wrong, led the reporter to do the obvious reporter thing and paint the Election Commission as a bunch of confused incompetents. I started to seriously consider transferring to another school. Oh, yeah, this was also the point at which the last remaining Computer Society people basically said, yeah, bye, been nice knowing you, and went home to sleep.

We got the second tally done by using grep, tail, cat, and a jot of perl. In essence, we shredded the ballot file into separate files, one per ballot, and sorted in directories based on candidate. We counted files, figured out which candidate to eliminate, struck their name from every ballot, and resorted into directories. The final numbers matched those from the first count, which in turn almost matched those from the election. I did some additions and subtractions, and managed to persuade everyone else in the room that the missing ballots exactly matched the number I'd computed for those ballots missing one or both parts of the election. We checked the write-ins -- whom we'd completely ignored up to this point -- and convinced ourselves that the write-ins were numerically too few to have upset the (slightly fragile) STV algorithm at any stage. Then we felt hope, that maybe there might not be a lynch mob waiting for us outside the door.

In fact, there was no one at all there. It was early Saturday morning, one of the most beautiful mornings I can remember, dewy and bright, cool but warming in the sun. I was absurdly tired, and depressed, as stressed as I've ever been; I think the others were basically in the same state. We talked through the election, convinced ourselves that the count was accurate and that there hadn't been any fraud of the sort we'd ever have a chance of detecting, called the candidates, and went to sleep. The election was counted by two sets of Perl scripts written on the spot, with hand-tailoring by half a dozen people, all of whom were alone at the computer at some point or another. We did what we could; don't blame us for it. Have sympathy for the poor election worker; don't bring down your own election stresses on their heads.

A Postscript on Software Engineering

The C++ tallier was never completed. The UC threatened to withhold payment for the voting software; eventually, the Computer Society relented and agreed to recode the entire project from scratch. That program was deployed the year after, to no noticeable failure. I wasn't on the Election Commission or the project programming team, and by the end of that year, I was out of student government entirely, too.

My senior year, in the middle of the fall, I got a desperate phone call from the president of the UC. The voting program kept on crashing during the vote-counting for the general election. Was there anything I could do? I wanted to say no, but somehow I got suckered into going up to the office and having a look at it. The first thing I noticed was that all the old detritus from that first year was still lying around on the hard drive. Two complete backups of the ballots, the various perl scripts, our temp files with hand-typed lists of subtotals.

Digging into the problem itself, I found that the program crashed on trying to process ballots that contained only write-in votes. The ballot's list of candidates was maintained in a linked list, but trying to get the list head turned into a segfault when the list had been reduced to emptiness by throwing out eliminated write-ins. It turned out that it was actually a template-based smart linked list class, and that trying to get the head of the list triggered an assert failure for empty lists. Just one of those things. You can't blame the infrastructure guy who wrote the class: you should check for empty with an isEmpty() method, and this way the head() function satisfies stronger postconditions. Nor can you blame the algorithm-code gal who used the class: the operators were overloaded to make it look just like a pure C pointer-based linked list, so why shouldn't she have expected it to operate like one? You can blame the process that lets things like this sneak through, that doesn't communicate to the people on the two sides of an interface how that interface operates.
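The failure mode isn't specific to C++ templates; here's a toy sketch of the same contract mismatch (illustrative only, not the original code):

```python
# Toy sketch of the interface mismatch described above (illustrative,
# not the original C++). head() asserts non-emptiness -- a stronger
# postcondition -- while the counting code assumed a pointer-style
# list where an empty head is harmless.

class StrictList:
    def __init__(self, items):
        self._items = list(items)

    def is_empty(self):
        return not self._items

    def head(self):
        # Contract: head() always returns a real element, so callers
        # are required to check is_empty() first.
        assert self._items, "head() called on empty list"
        return self._items[0]

def top_choice_unsafe(ballot):
    # Pointer-style habit: just grab the head. Blows up once every
    # candidate on a ballot has been eliminated (a write-in-only ballot).
    return ballot.head()

def top_choice_safe(ballot):
    return None if ballot.is_empty() else ballot.head()

exhausted = StrictList([])          # all write-ins already struck
print(top_choice_safe(exhausted))   # None
try:
    top_choice_unsafe(exhausted)
except AssertionError:
    print("crash, more or less")
```

Both halves are defensible in isolation; it's the unstated difference between their contracts that produces the election-night crash.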

And you can also blame anyone who puts their trust solely in software. Don't just sit there and trust the machines. Trust the people who operate the machines, the ones who might know whether or not it's safe to trust the machines.

The Depth of Outrage

My position is, and has been since Tuesday, simple: recount, recount, recount. Recount in Florida, recount in New Mexico, recount in Iowa and Wisconsin, too. Recount in California and Texas, too, if anyone asks. Recount every single vote in the country, if there's any question about it. Do it carefully, slowly; do it right. Have three people look at every ballot, have the tallies checked and cross-checked by anyone who wants to inspect them. Take the weekend, take next week, take until the day before the electors meet if necessary. Do it right, announce the totals, hand out the electoral votes, winner takes all. And when everything is confirmed and definite, the loser concedes graciously and wishes the winner well, and America's next President can take office free of doubt, under no clouds, ready to govern.

We've got a system in this country. It's strange, perhaps, and silly, and slow, we're realizing. But it's the system, and everyone knew it going into this election, and goddammit, it works. People cast votes, election officials watch the process and count the totals. Most votes in a state gets the electoral votes, and 270 electoral votes make you President. All questions to be resolved by the local election officials, or, failing that, by the court system, on up to the Supreme Court, if necessary. That's how we settle disputes in this country: by the rule of law. That's why we've got an independent judiciary, to handle stuff like this. There is nothing in the current situation that reeks of crisis: we're just going to crunch through the issues and come out with a President.

I have a little disdain for the folks who are suddenly calling the Electoral College undemocratic. Not that they're necessarily wrong, just that, well, this election is going to be decided by the Electoral College. Shut up and deal, and abolish it for the next go-round if you want.

That said, the other folks who don't seem to trust the American system are the Bush camp, and their recent actions have me filled with outrage. Going to court to stop the recount isn't just absurd, it's actively dangerous. Not the "going to court" part: that's a decent, American, way to settle things, definitely a lot better than reaching for the guns. No, what I find repugnant is the idea that we should cut short the systematic determination of this election's outcome out of deference to various ill-defined psychological goals. "It would be good for the country to have this election over," Bush said, complaining that he is currently "in limbo."

We don't have elections in this country so that we can bring the campaigns to a natural sense of closure. We don't have elections so that the President-elect can pick his advisors between November and January. We don't have elections to determine the will of the people. We don't have elections to psych the American people up behind their new leader-figure. We don't have elections for the sake of televising the various formalized rituals that characterize one candidate's coronation and the other's concession. We have elections to decide which person will be responsible for discharging the duties of President of the United States, as specified in the Constitution.

The Cart Before the Horse?

The suddenness of the changes engendered by the Internet's arrival has frequently been taken for profundity.

On the one hand, the Internet is a technology for communication. The trouble here is that I don't see a lot more realized potential than untapped potential. The last two centuries' advances in communications technology have basically made, in theory, every significantly-inhabited point on the earth equidistant in communications terms. There's some room for extension -- higher mountain tops and further out at sea -- but we're not really going to bring New York and New Delhi any closer than they are already. There's room for bandwidth expansion, but what are we going to do with that bandwidth? Whether people will ever accept virtual reality as a substitute for physical presence is an issue of psychology and sociology, not technology. And if they do, will it be connectivity -- the Internet -- or simulation -- not the Internet -- that they choose? I don't know.

David Foster Wallace, in Infinite Jest, wrote the jesting retrospective history of the great videophone flop, and I think a lot of his points hold up pretty well: there are some distinct advantages to low-fi communications media, chief among them that one may actually be capable of more subtle expression over the phone, say, than in person, precisely because one's facial expressions are not on display. Technological possibility alone does not cause disruptive revolutionary adoption: the Chinese got along just fine with the printing press and gunpowder without having their world stood on its head by either.

On the other hand, the Internet is a technology for information processing. Since any computer can wallop data, we usually think of the Internet as combining data streams in novel ways, in allowing the rapid cross-referencing of information from multiple sources to extract interesting nuggets. Which is all well and good, but what does this really change in a fundamental sense? Businesses and governments will have better and more comprehensive information in making decisions, true, but behind almost every bad decision one finds not the unavailability of information but instead someone's willful ignorance of it.

The much-touted economic revolutions of the e-commerce era -- the Amazons and eBays and pure B-to-B plays -- all depend just as much on the existence of an advanced delivery and financial infrastructure as they do on the Internet. Amazon, in its bookseller role, offers you nothing you couldn't do for yourself with a copy of Books in Print, a telephone, and some patience. What it and its brethren are doing is optimizing the front-end, removing those slow and expensive people from the routinized customer-service tasks that computers can handle.

Which brings up my other major point of skepticism. The "knowledge worker" is supposedly the new hero of the Internet age, the one whose ship is coming in. And yet, paradoxically, it is these pure-information mavens, the ones who traffic entirely in abstractions and data, whose work is the sort of virtual twiddling that the machines were supposed to take over for us. In the long view, the knowledge worker is a passing phase, a symptom of the time when the heuristics and data-mining technology lagged behind the raw data-processing tech. We'll still need interpreters of data and inputters thereof, people to design and build the computer systems themselves, but the modern notion of a corporation as a modular and highly-adaptable highly-intelligent information-processing unit sounds an awful lot like what we're trying to train the computers to do.

Heap all the scorn you want, but the meatworld isn't going away: people gotta eat, people gotta have a roof over their heads, and even if the houses are built by machines, someone's still gotta build the machines and fix them when they break and keep an eye on them while they work. Humanity just isn't going virtual and disappearing in a cloud of bits any time soon. And if it does, then it won't be by virtue of any technology any of us could reasonably predict.

I think the real changes to watch for are the cultural and philosophical ones. Culture's malleability makes it a bit prone to change anyway, so it's perhaps not saying so much to say that the Internet is going to cause some cultural changes, but I think the Internet is set to have a major long-term cultural impact. The centuries-long transition to mass literacy caused by the printing press and the twentieth century's more compressed creation of a richly visual culture, I think, significantly affected both humanity's experience of the world and its self-perception. In a very real sense, the Internet and its fellow-traveler computer technologies are starting to raise serious philosophical issues in ways that have major practical import.

One might talk about the ethical issues raised by genetic and medical technology, but these cases pale, I think, before the questions raised by computer and network technology. Our notions of intellectual property are being shaken right now by Internet-enabled technology, but these upheavals are really only symptoms of the collapse of certain assumptions that we used to be able to lean on to avoid having to actually deal with the real philosophical issues: what, exactly, is an idea, and what is its connection with tangible reality?

Or, on another front, what does it mean to be human? We've had an easy time of it, I think, because the only fringe cases we've had to deal with have been exceptionally smart animals, people with severe brain trauma, and thought experiments. Turing pointed out the reworking of our ideas of identity and personhood that computers would require over fifty years ago. That no one since then has advanced at all beyond Turing's original questions, I think, indicates just how far we have to go.

If nobody knows you're a dog on the Internet, are we about to radically transform what it means to be dealing with people? If the Internet makes place irrelevant, what is going to happen to our notions of home, of place-based identity? If every identifying detail about you is on the Net somewhere, are you defined by the information out there or by some unknowable and inexpressible hidden psychic core? What will it mean to be part of a community when that community is no longer defined by any concrete expression or action, and how will we relate to others in "public" life?

I recognize that a lot of these questions hint at precisely the sort of large-scale tangible change I've been so skeptical of. But I guess that's because I see possible changes such as "the collapse of the sovereign nation-state" as flowing from the combined changes in individual perceptions of the world, rather than the other way around. Commonly agreed-upon notions are what hold together our social and cultural institutions. Extrapolating the external changes and only afterwards inquiring about the internal ones seems to me a case of putting the cart before the horse, precisely because those subsequent external changes are being caused by the internal ones.

The interesting changes that will come of the Internet will be the ones that involve humanity's understanding of itself. Everything else is just a side effect.

Laying Down History

What's happening in these few days is that we're laying down an iconography. We're picking out the details and the phrases that will fix these days in our minds. For every "where were you when" point in history, there are the accompanying details, the grassy knolls and bombs homing in on elevator shafts and schoolteachers and lone students facing down lines of tanks. So too with butterfly ballots, exit polls, and retracted concession calls. These are the symbols that will stand in for this week, the telegraphs of memory.

The Madness Continues

From dinner today: "Be careful with the menu; you might order Buchanan by mistake."

I've identified the feeling: it's the feeling of the late afternoon of a snow day. Everything's on hold, and you don't yet know whether the next day will be a snow day, too. There's that dread in the air that there may be school tomorrow, that the return to the unpleasant side of reality can't be postponed forever. Just a hushed waiting, and a feeling that what's going on isn't entirely real, and a lot of joking and jostling with your friends because you don't know what else to do when this sort of interruption springs out at you.

Political Perspectives

The Onion led today with the story "Bush or Gore: 'A New Era Dawns'". The article featured such lines as "Bush or Gore continued, 'And as a devoted family man with a wonderful wife and [two or four] wonderful children, I promise to make the White House a place Americans can feel good about.'" It occurred to me today that, incredibly enough, the Onion, in printing the absolutely safest possible story, got it completely wrong. The coin really did come down on the edge today.

I went to a lecture given today by Jeanette Winterson at Benaroya Hall. During his introduction of her, the head of the sponsoring organization thanked us, the audience, for tearing ourselves away from the election coverage. He then pointed out that the number of votes separating Bush and Gore in Florida was smaller than the number of people in the room. Try visualizing fifteen hundred people. Even set against my limited capacity to visualize two hundred and fifty million people, it's still stunning.

I've previously spouted on the subject of rational voting systems, and the impossibility thereof. There's no deep and philosophical sense in which either a direct-majority or an electoral college system is "better" than the other. They're both ways of collapsing a set of individual votes into a binary decision, but I don't especially think that a margin one way or the other of a smallish fraction of a percent in the popular vote is a philosophically meaningful basis for discerning some sort of will of the people, either at the national (popular-vote) or statewide (electoral college) level. A practical basis for actually choosing a government, sure, but one that's meaningful on its own, no, I don't really think so.
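To make the point concrete, here's a toy sketch of the two collapsing rules at work. All the numbers, state names, and "electoral" weights below are invented for illustration: the same set of ballots, aggregated two perfectly reasonable ways, names two different winners.

```python
# Three made-up states, each with vote counts for candidates A and B
# and an arbitrary "electoral" weight. A piles up a landslide in one
# state; B squeaks by in the other two.
ballots = {
    # state: (votes for A, votes for B, electoral weight)
    "X": (900, 100, 3),   # A wins X in a landslide
    "Y": (450, 550, 3),   # B wins Y narrowly
    "Z": (450, 550, 3),   # B wins Z narrowly
}

# Rule 1: direct popular vote -- just add everything up.
popular_a = sum(a for a, b, w in ballots.values())
popular_b = sum(b for a, b, w in ballots.values())

# Rule 2: winner-take-all by state -- the statewide winner
# collects that state's whole weight.
electoral_a = sum(w for a, b, w in ballots.values() if a > b)
electoral_b = sum(w for a, b, w in ballots.values() if b > a)

print(popular_a, popular_b)      # 1800 1200 -- A wins the popular vote
print(electoral_a, electoral_b)  # 3 6 -- B wins the weighted vote
```

Neither rule is "wrong"; they just answer different questions, which is exactly why a hair-thin margin under either one can't carry much philosophical weight.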

That said, I see two arguments against the electoral college, neither of which have I heard people making. First, you can make the same case against it that you can make against the states: we've drawn fairly arbitrary boundaries and tried to make them politically meaningful -- only they're less meaningful than most political boundaries, simply because even gerrymandered districts tend to have some relationship to local political differences. [For example, Washington and Oregon make sense as two states, but the dividing line should be the Cascades, not the Columbia River. That division would actually give us two reasonably coherent polities, instead of the strong urban-vs-rural dynamic that drives state politics.] Second, you can make the same case against the Electoral College that you can make against the Senate: it overemphasizes small states. Especially if you think (as I do) that the states are silly units anyway, there's no good reason to skew the voting power in that particular direction.

It's enormously strange to be watching and waiting. Nobody knows what's going to happen. Watching the coverage last night, it was clear that nobody knew what was going on -- the stations were contradicting each other, the vote totals flashed on the screen were full of obvious internal inconsistencies, some large fraction of what I learned was pure rumor and guesswork. Who knows where all this is going? And yet, also, nobody seems very concerned, which is what boggles me. Where are the every-fifteen minute updates to the news sites? Why was MSNBC the only network that was always running election coverage last night? Why did NPR cut away at 3AM? The candidates are radiating calm, but where is the nervous seething that they're trying to still? I get the sense almost that America is apathetic even about its political crises: the coverage has started to run to the "ha ha, look at the silly foreigners sending premature congratulations" variety.

Put it all this way: I'm calm, but I feel somehow that I shouldn't be.

Another Precinct Reporting In

Welcome to collective brain meltdown, America. I have no idea what to say.

A Bend in the River

When at last you speak, you speak of the inevitability of change, as though that had ever been the issue. Change is coming, you say, it's useless to fight and futile to predict. You're pleading, you're explaining, trying to make me understand about change. And this is when I understand that you aren't even talking to me.

I've been quiet. I've been busy. I've been earning my keep, proving my worth with the purely pointless. Roulette wheels, bowling scores, traffic cameras, airport codes, children's songs, quiz shows: been there, done that, figured it out. No jackpots for me, but if this is what being first loser feels like, I'm okay with that.

Extremity breeds change: call it hitting bottom, or being taken away from your everyday cares, or the sheer necessity of action, or a dose of perspective, or a gravitational slingshot. Call it whichever you please, but when I came back across the bridge and slipped off to sleep, it was all quite clear.

At work, I'm on a march towards zero, a gruesome hand-to-hand fight with the bugs, and that line has its perfect mirror elsewhere. There are balances here to be maintained, conservation of creativity, constraints on the overall net flux of life force.

We war criminals like to rewrite the past; we slip away and make new names for ourselves. There are things you don't talk about, things you don't think about, not even in the silences of the evening. Doctored photographs, new identity cards, that was a long time ago, in another country, and when everyone in the village is complicit, memory stands on its head and looks to the future.

Times are tough in dot-com-ville, and the Lab is cutting back on its operational budget. We could stay here where we are and wait for the marketing momentum to die out, but where's the fun in that? That's not what I wrote on the napkin when I had the idea for this place. A laboratorium is a place for experiments, for mixing strange and scary chemicals, for proving and disproving theories. If you're not going to play with fire, then why'd you put in the fire extinguishers? Don't even get me started on the eye wash station and the fume hood.

Strange angels, singing just for me. Their spare change falls on top of me.

The Lab as we know it isn't going away, it's just not going to be my top priority, shall we say? It's all about the projects.

I wish I had something positive to offer up right now as proof of my sincerity, but things just aren't at that stage yet. Lotvs-Eaters is getting close, watch this space for updates, and that's going to be where it's at for a while. I'm not turning my back on weblogging or the Lab, no, precisely the opposite, but goals in the large necessitate sacrifices in the small, and, of necessity, I'm going to be a bit terse for a while.

Silence doesn't speak. This I believe, which is why I'm saying this much now.

Never explain. This also I believe, which is why I'm not saying more.

Existence is a con game, it's what we can get away with while the universe is distracted for a moment, and I have next to no idea what I'm going to try to pull under cover of this next bit of stage business, but I guess I'm going to find out, then.

I understand about change, believe you me. And if you spoke to me again, I know just what I'd say.

It's Election Day tomorrow. Vote Democrat and hope.

Tucked Under My Windshield Wiper


YOU GET MORE WITH GORE. eight more years of being lied to. And nothing has been done under CLINTON & AL GORILA. In fact the only thing has been done is leave the investors and the business people alone to do their jobs,business people did - look at the market. Clinton - Gore did not leave them alone because they chose to, but because they were busy defending themselves at the court house because of their to many scandals they have created, by lies, deceit, sexual assault against us.

You know, I think Gore is better than Clinton. He is a better lier, he is natural, he is a pathological lier, and that means a lot. It means he can deceive easily more people. He made all kinds of promises eight years ago, and he hardly kept one. Now he promises again. Clinton had to lie in order to defend himself and escape punishment by the Congress and to escape 'empeachment'. Gore lies and makes empty promises because he loves to, it is in his nature. It is the way he grew up. He calls himself a "Democrat". He is a democrat like I am from Venus. Ralph Nader from the Green Party is been more of a democrat than Al Gorial Gore for the last eight years / to the present, but by sticking to the truth, without making empty promises and more than anything else, being honest.

In Fact (talking about Gore) no one knows what his agenda is. He is more dangerous than PAT B. because no one really knows what he is up to .., in giving up to the big corporation gangsters and the Holly Wood gangsters, from whom Gore took big $ and promised that if he'l be elected he'l be there for them. He took the money from them in order to finance his campaign against Bush, but there is a price to pay, a price he will pay which leads to other scandals.

The truth is that crimes and violence has been up in the last eight years of Clinton - Gore administration. Rape of girls and women too. Thanks God I haven't been raped so far.
Truth can be painfull --> You know, I think I still prefer to be lied to for 8 more years (one think: I got used to). And then, I just know, 8 more years with Gore but as President, and nothing will change, just like the last 8 years when he's been vice-president
But the truth is painful, so I'll just believe like every body else or like whatever the media has to tell us. I know the media is corrupted but ....., most of us ar,

A Grist Bibliography

Here's my version of a special treat for you Grist readers who clicked on the link to come over here: an annotated bibliography of my favorite computer books. Some of these books are for people who want to learn more about technology and others are for computer people looking for a broader perspective, but all of them are good reads.

The best book I know of on the computer industry remains Tracy Kidder's Soul of a New Machine. It's over 20 years old, but it remains one of the most understanding and sympathetic accounts of the kind of environment in which computers are produced. More recently, Fred Moody's I Sing the Body Electronic is a sharp and perceptive take on the sometimes chaotic development process at Microsoft.

Silicon Valley is a scary place, with some very scary attitudes. Michael Lewis's The New New Thing veers off on tangents at times, but it's a very understanding take on the crazed economy of wealth-worship in the Valley and its sometimes tenuous connection to the "underlying" technology and businesses. Paulina Borsook's Cyberselfish is a more opinionated take (although her excellent writing makes the vitriol go down pretty easily) on the scarily libertarian culture of the Valley. She got a fair number of negative reviews, mostly of the form "This book is soooo 1995," but sadly, little has changed since then. Reading these two books makes it impossible for one to take seriously either the tech titans of the Valley or their critics.

As for why people go into computers in the first place, David Bennahum's memoir Extra Life and Douglas Coupland's novel microserfs both do an excellent job of capturing the sense many programmers feel that computing can be a human activity, full of joy, wonder, and connection to other human beings. Bennahum gets right the human drama of discovering computers; Coupland gets the human comedy right.

If you want to know about privacy (or the lack thereof) in the Internet age, check out Simson Garfinkel's Database Nation. If you want to know about security (or the lack thereof), look at Bruce Schneier's Secrets and Lies. Both of them know what they're talking about, and the picture they paint isn't pretty. Go lobby your elected representatives for EU-style databasing-disclosure laws now, and while you're at it, ask them to fix the commercial code so that software vendors can't disclaim liability quite so easily as they do now.

Last, and greatest, is Lawrence Lessig's Code, on the legal and social issues involved in regulating cyberspace. Lessig makes a very strong case for his claim that "code is law" -- that the design decisions made when programming computer systems wind up having the effect of laws that regulate the kinds of interactions people can have when using those systems. It's a completely obvious point, but it's amazing how many people don't get it (read the book, and you won't be one of those people). If we want the Internet of the future to have certain features, we need to make sure those features are designed into it from the start. Privacy, property rights, freedom of speech -- we're going to need to work to protect all these things, and the time to start is now.
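Lessig's point is easy to demonstrate in miniature. Here's a toy sketch (all the names and the "board" design are invented for the example) of the same message-posting feature written two ways. No statute forbids anonymous speech on the second board; the code does.

```python
def post_open(board, text, name=None):
    """A board whose code permits anonymous posting."""
    board.append((name or "anonymous", text))

def post_verified(board, text, name, verified_users):
    """A board whose code requires a verified identity to post."""
    if name not in verified_users:
        raise PermissionError("unverified users cannot post")
    board.append((name, text))

open_board, strict_board = [], []

# On the open board, the anonymous speaker gets through.
post_open(open_board, "hello")

# On the strict board, the same speaker is silenced -- not by any
# law, but by a design decision made when the system was written.
try:
    post_verified(strict_board, "hello", "drifter", verified_users={"alice"})
except PermissionError:
    pass  # the architecture did the regulating
```

The two boards embody different speech regimes, and neither one needed a legislature: that's "code is law" in a dozen lines.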

Rage, Rage Against the Dying of the Light

Quoted from a Salon article on the Shrub:

"When did you graduate?" Bush asked her, as she recalls. She told him. That's when Bush told her that Yale "went downhill since they admitted women."

"I said, 'Excuse me?'" Novick says. "I thought he was kidding. But he didn't seem to be kidding. I said, 'What do you mean?'"

Bush replied that "something had been lost" when women were fully admitted to Yale in 1969, that fraternities were big when he'd been there, providing a "great camaraderie for the men." But that went out the window when women were allowed in, Bush said.

"He said something like, 'Women changed the social dynamic for the worse,'" she says.

I normally don't just clip stuff other websites have said, but this one's provided as a public service. Consider it one last attempt to warn America.

Fun With Minor Celebrities

On the one hand, as with all celebrities, they have no idea who you are. On the other hand, unlike major celebrities, they're not used to the idea that they aren't expected to have any idea who you are.

It's lots of fun to say hello to them by name when you pass them on the street.