The Laboratorium
July 2000

This is an archive page. What you are looking at was posted sometime between 2000 and 2014. For more recent material, see the main blog at http://laboratorium.net

Takin’ Orders From Record Company Clowns


Some years ago, some friends and I were returning from a satisfying meal of ribs, lemonade, cornbread, and endless piles of pulled pork at Red Bones, where all dinners are epic. Down in the Davis T stop, there was a musician on the platform, playing guitar and singing in a hoarse voice with a very faint touch of some sort of unidentifiable British Isles accent. The music didn't really jump out at me at first, but one of the songs he played caught my attention with a lyric about "staying up all night studying modern physics." Intrigued, I decided, right as the train pulled in, to buy one of his tapes. I wound up missing the train as a result and watched my friends gliding away, laughing at me. The musician, who went by the stage name, as it were, of Winterboy, was a bit impressed at the (unintentional) sacrifice I'd made for his sake, and played another "sciencey" song for me, this one called "Work and Homicide," which featured a whole raftload of really bad computer metaphors mixed in with one really good (if incoherent) image: the "cursor of Damocles."

I took the next train back, popped the tape in my player, and discovered that it was horribly distorted, warbling oddly and varying in speed from low synthetic rumble to chipmunk twittering. I pulled his business card out of the liner, called the number thereon, and left him a message explaining the situation. The next day, he returned my call and left me a message giving his T-stop itinerary for the next few days. So I hopped the T, went down to Park Street, found him working the crowd on the Red Line platform, and made an exchange, my bum tape for a good one. And I had my Winterboy tape.

His music? Well, I'd best describe it as "earnest." It has a strong streak of psychological self-actualization (he was a social worker before he got into the whole music thing) -- realizing one's true identity is pretty much the point of most of his songs, expressed through a variety of strained metaphors. But for all that, there are occasional flashes of genuine coolness here and there: "Work and Homicide" is actually a pretty catchy song, even if the lyrics are groanworthy, and a few others on the tape have stuck in my head over the years. At some point in the interim, he linked up with a drummer and bassist and got himself a blurb from the executive producer of the L.A. Music Awards. So Winterboy is definitely moving upwards in the musical world, at a pace that should put him on top of the charts by sometime early in 2378.

He did, however, through his web presence, point me at the Becky Chace Band, and I'm in perfect agreement with Winterboy's taste in recommending them. Their CD isn't out yet, but I've been listening to a few cuts from it at mp3.com, and their sound makes for pretty good toe-tapping programming music. Not necessarily groundbreaking, but catchy music, well-written and tightly performed, with heartening syncopation, solid harmonic structure, and genuine energy. The downside, of course, is that they're still a regional band, which means I'm going to be waiting quite a while before they play Seattle.

In that same category, we also have Grey Eye Glances, who have a serious record contract and can be found at your local industry-tool distribution outlet. A bit less of the straight-ahead rock, a bit more of the gift of melodic grace and winning inventiveness. I heard them playing at a free outdoor concert I happened to wander by, and I wound up ultimately buying both their major-label albums in order to track down the ballad they played at their concert that had me standing there in rapt attention. (For reference's sake, it's "Angel," off of Eventide, but the clips on their web site don't include the really great parts of the song, which builds as it goes.) They're from out of Philly (somehow, I don't see them playing the Republican convention), and I'm sort of hoping that their forthcoming album will send them on a national tour, but I'm also not holding my breath for them to show up any time soon.

Which, I suppose, is why the good lord made Mah Jong.

Bad Names For Apartment Buildings



The Golan Heights
The Minnesotan
The Towers at Babel
The Citadel
74 Skid Row
The Grand Vizier
Alfred
The John Stuart Mill Apartments
The Glaxco Building
The Tenemental
The Benedict Arnold

Failing to Answer the Question


Got into a conversation yesterday about autism, specifically what the actual clinical definition is. Apparently, there's a debate raging over whether "mild autism" exists. One group maintains that people who show some autistic symptoms, but not to the degree associated with Oliver Sacks-level cases (i.e., people who exhibit some of the difficulty in dealing with other people and some of the characteristic physical tics of the autistic), should be classified as "mildly autistic." The other group holds that this is incorrect: they claim that since the "mildly autistic" can be cured through therapy (and sometimes medication), their condition doesn't qualify as autism, autism being incurable.

I think the only useful reply to someone making that sort of claim is a good solid pimp-slapping. Not because they're necessarily wrong -- I'm no expert in the field, and I'm certainly willing to believe that the differences between autism and mild autism outweigh the similarities -- but because that kind of argumentation is so entirely unproductive. We concern ourselves with X. We define X in a certain way. We therefore ignore thing Y which does not match our definition for X. Which is all well and good, except that our exclusion of Y, unless we have done some extra work in the interim, says nothing about X or Y in and of themselves, only about our definition of X. That is, we have said nothing new about Y by drawing a line that excludes it, and it's better not to go around acting as though we'd learned something. If your definition of autism stipulates incurability, then you can't make interesting points about curability -- you're only allowed to leverage this definition in order to speak about other attributes. It's an issue of circularity, of having enough equations for the number of unknowns you're working with.

I first felt the need to go off on this rant while reading an account of the endless back-and-forth in the world of Austen critics on the whole lesbianism issue, only here I think both sides are being equally fatuous. One camp wants to label Austen a lesbian, on the strength of her strong bonds to her female friends and the discourse of "love" and "affection" among her women, noting how strongly certain traits fit with our contemporary understanding of certain lesbian personae. The other camp, vehemently "defending" Austen, points to the general conditions of English culture and claims that such things are typical, that Austen is in no way a standout, that the modern understanding of "lesbian" is only applicable in a modern context. They're both correct, so far as they go, but in order to score a few political points, they each wind up making basically the same interpretive error. The story here is not Austen herself; the interesting questions all concern the nature of the society she lived in. It's horribly incorrect to label her a proto-feminist lesbian writer and then hold her up as a shining example of courage and ahead-of-her-time attitudes, but it's no less incorrect to think of her as straight (in modern terms) either. Having set up a system of definitions for dealing with Georgian English sexuality, you can't then assume that a term taken from that system -- "lesbian" or "straight" -- carries over into modern usage. You could, quite legitimately, supply a set of definitions that cross cultural and temporal boundaries, a set of standards useful for looking at different social contexts. But such a set is useless for scoring the kind of cheap political points the opposing camps of Austenites are looking for -- by bringing past and present within a consistent set of definitions, you rule out the ability to shiftily transfer into a normative realm that sits outside your definitions -- which is why they don't bother with the exercise.

Also matching this pattern are such classics as the ontological argument, synthetic ethics, most replies to Turing's test, and the classification of ketchup as a vegetable. To the extent that there is a question, one's definitions may clarify or obscure it, but they will not ever actually answer it.

Don’t Eat That; You Don’t Know Where It’s Been


According to the web logs, Googlebot found the Laboratorium sometime about a week ago and has been gradually indexing the site since then, at a rate of about ten pages a day or so -- it's fairly clear that it's doing a breadth-first search, since every time it comes through and deals with a fresh batch, they're all referred by the same one of my pages. I was kind of struck to realize, though, that Google seems to go live with the changes more or less as soon as it notices the pages -- I can turn up Laboratorium subpages on Google now if I enter the right combination of search terms. For example, my candy report page is the number one site for a Google search on "Cadbury's flake", beating out even www.cadburys.co.uk. And my media rant shows up if you look for "Rush Limbaugh mp3 vcr", perhaps because I mentioned Rush in passing as yet another media figure caught up in a fight for control over the distribution channels.
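To make the batching pattern concrete, here is a minimal sketch of a breadth-first crawl (in Python, over a made-up link graph standing in for this site): every page in a given wave is discovered via links from pages in the wave before it, which is exactly the referred-in-batches pattern the logs show.

```python
from collections import deque

# Hypothetical link graph: page -> pages it links to.
links = {
    "/": ["/candy", "/rants", "/music"],
    "/candy": ["/candy/flake"],
    "/rants": ["/rants/media", "/rants/search"],
    "/music": [],
    "/candy/flake": [], "/rants/media": [], "/rants/search": [],
}

def bfs_crawl(start):
    """Visit pages breadth-first, yielding one wave at a time."""
    seen, frontier = {start}, deque([start])
    while frontier:
        wave = list(frontier)
        frontier.clear()
        for page in wave:
            for target in links.get(page, []):
                if target not in seen:   # index each page only once
                    seen.add(target)
                    frontier.append(target)
        yield wave

for day, wave in enumerate(bfs_crawl("/"), 1):
    print(f"batch {day}: {wave}")
# batch 1: ['/']
# batch 2: ['/candy', '/rants', '/music']
# batch 3: ['/candy/flake', '/rants/media', '/rants/search']
```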

What blew my mind when reading my logs, though, was the search that turned up my old musings about search engines. The second paragraph of that mini-essay reads

The Web, of course, never content to do anything by half measures, is mind-boggling when it comes to the vast realms of mindless entertainment it proffers. Given that it probably owed much of its early existence to the sudden availability of one-click porn it engendered, it has had its feet planted firmly in the realm of the recreational from the outset. But that's old hat these days -- what floats my boat are the entirely new forms of entertainment now available, forms never before available, from the dawn of time down to this blessed day. You all understand random-walk web-surfing, you understand the voyeuristic fun of reading the drivel people decide they need to present to the world, and I'm sure some of you understand the exhibitionistic thrill of writing the drivel you decide you need to present to the world -- I know I do. Here on the Internet, the very state of having no life becomes the raw material from which the lifeless carve their amusement. Well, a couple of weeks ago, some friends of mine and I, sitting around flecking bits of metaphorical mud at each other, accidentally sculpted our own little AltaVista de Milo.

Seems harmless enough, no? Well, it turns out that the mildly close juxtaposition of "exhibitionistic" (in the third-to-last sentence) and "couple" (in the last sentence) is enough to make the page show up if you type "couple exhibitionistic" into Google. You see, someone did type that into Google, and followed the link to my page, and was, I'm sure, quite confused not to find the naked flesh of multiple people on display. What really blows my mind is the final part of the referrer string, though: "start=90". Translated out of GooglURL-speak, this means that the Laboratorium is on the tenth page of links Google turned up for that search. Hit number ninety-five, in fact, as I confirmed by typing in the search myself. Ninety-five! Confirming everyone's suspicions about Internet porn-hounds, it would seem that whoever found the Laboratorium through this rather indirect route has a bit too much time on their hands (and probably other stuff, as well, but it's best not to think about that) to be clicking through that many links. That the end result is a page about the irrationality of search engines is the detail that makes the joke.
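(For the curious, the arithmetic of that referrer string is easy to check; here's a sketch in Python, with the referrer URL reconstructed from the details above -- the exact format Google used in 2000 may have differed slightly.)

```python
from urllib.parse import urlsplit, parse_qs

# Reconstructed referrer of the general shape described above.
referrer = "http://www.google.com/search?q=couple+exhibitionistic&start=90"

params = parse_qs(urlsplit(referrer).query)
query = params["q"][0]                        # the search terms
offset = int(params.get("start", ["0"])[0])   # index of the first hit shown
page = offset // 10 + 1                       # ten hits per results page

print(f"searched for {query!r}, landed on results page {page} "
      f"(hits {offset + 1} through {offset + 10})")
# searched for 'couple exhibitionistic', landed on results page 10 (hits 91 through 100)
```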

Today Will be My Moving Day, My Moving Day, My Moving Day


Started packing. I'm feeling intensely silly for having packed up the books and CDs first, thereby leaving myself without reading matter or digital audio entertainment. Oh, well. I'd be feeling much sillier still if I'd started with the clothes, I guess.

Hide and Seek


I've added an important element to maintaining the aesthetic unity of the Laboratorium. Should you wonder what it is, or how to find it, allow me to reassure you that it's easily accessible. Happy hunting.

On Timing, and My Lack Thereof


I am moving into my new apartment the weekend of the 5th of August. I am taking a long weekend the weekend of the 19th of August and going to San Francisco. So guess when the office move is scheduled for? Yes, that's right, they're giving us off the 10th and 11th of August while they move our stuff from the old building to the new, making that weekend -- a weekend, I should note, when I am neither moving nor planning to go out of town and hence might really like to have those extra two days -- a four-day weekend.

Mechanical Woes


My answering machine is a piece of crud, and I'd be quite upset at it if I hadn't deliberately bought the cheapest one in the store, one from a company not generally thought of as making telecommunications equipment. When it loses power, whether from me tripping over the cord or from something genuinely interesting, like power outages induced by trees falling in gale-force winds, it undergoes complete and total amnesia. It forgets the stored messages. It forgets the outgoing greeting. It forgets that its entire purpose in life is to be the best damn answering machine it can be. It just sits there, a couple of red lights flashing at me confusedly. There's something a bit cute about its befuddlement, the way it sits there like an idiot waiting for me to push a few buttons to make it remember how to pick up the phone when it rings.

This much I'm used to dealing with; my greetings have gotten progressively shorter, stranger, and more resigned in tone as I have to rerecord them every few weeks. I've stopped bothering to teach it the time; all my messages arrive at 12AM Sunday now. This morning, it actually up and crashed. It stopped working as a phone -- the handset couldn't get a dial tone, although anything else I plugged into the phone line -- my modem, my old phone, even a banana -- could. Good old fashioned power-cycling did the trick, at the cost of yet another outgoing message lost and gone forever, one I'd slaved over for maybe all of a minute.

The New York Review of Each Others’ Books


The (loud booming voice) NEW YORK REVIEW (little squeaky voice) of books is one of my favorite magazines: its articles range from the boringly obscure to the profoundly insightful. For every Roger Shattuck there's a Garry Wills, for every "Gould/Lewontin: an Exchange" there's a Louis Menand stunner on the US's political culture. I've been reading through the 29 June issue, and although I don't think any of the articles in and of themselves are especially memorable, there were a fair number of details here and there that caught my eye.

James Traub has some very sharp things to say about the U.N. and its limits in a world less and less defined by national boundaries (see below for an extract). Ian Buruma, writing on Hollywood and its fascination with Tibet, had the following to say:

Those who felt discontented with their own complicated lives were consoled by the idea that in one isolated spot lived a people who still held the key to happiness, peace, and spiritual salvation, who had, as it were, by some miracle of nature, been spared the expulsion from the Garden of Eden . . . . [Orville Schell, author of Virtual Tibet] expressed a fleeting sense of nostalgia for an earlier China, austere, remote, high-minded, inaccessible, xenophobic, poor. Mao's China, after all, was a kind of Tibet for would-be refugees from Western civilization too.

And Robert Darnton, discussing the spread of rumor in pre-Revolutionary France, opens with this declaration:

The media loom so large in our vision of the future that we may fail to recognize their importance in the past, and the present can look like a time of transition, when the modes of communication are replacing the modes of production as the driving force of history. I would like to dispute this view, to argue that every age was an age of information, each in its own way, and that communication systems have always shaped events.

Yet Another Example of the Porousness of Certain Borders


As a programmer, I find that one task that keeps coming up in my professional life is that of properly setting up abstraction boundaries. In writing up code to carry out some complex task, the imperative of accumulated folk wisdom is to strive to divide that complicated job into some sort of relatively simple interaction between relatively simple parts. The philosophy is the same one animating mechanical engineers to minimize the number of moving parts in their designs: everything that ought to move could, in some scenario, fail to move, and then where would you be? Intellectually, though, this subdivision has other implications for programming: the mental layering involved in this black-boxing is generally (although not universally, and I've heard some good arguments against) held to be a good thing. The point is that from any perspective outside one of these components, the internal structure of the component itself should be irrelevant. Whatever hidden gyrations it goes through to carry out its business with you are its own business, nor should it care about your own gyrations. The interface is a contract, but it is also a wall, and if the prisoner on the other side tapping out messages in Morse were to be replaced by some other inmate who taps out the same messages, no other features of their life, their past, their tortured thoughts, should concern you.
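(In code, the prisoner-swap idea might look something like this minimal Python sketch; the Tapper interface and both inmates are invented for illustration.)

```python
from typing import Protocol

class Tapper(Protocol):
    """The contract: something beyond the wall that answers a message
    with a message. Nothing else about it is visible."""
    def tap(self, message: str) -> str: ...

class OriginalInmate:
    def tap(self, message: str) -> str:
        return message.upper()    # one set of hidden gyrations

class ReplacementInmate:
    def tap(self, message: str) -> str:
        # different gyrations, same messages through the wall
        return "".join(c.upper() for c in message)

def converse(neighbor: Tapper) -> str:
    # Code on this side of the wall sees only the interface;
    # swapping inmates changes nothing here.
    return neighbor.tap("hello through the wall")

assert converse(OriginalInmate()) == converse(ReplacementInmate())
```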

The fear at the back of your mind, though, isn't that one of your components is taking advantage of its abstraction barrier to torture small children without your knowledge. No, once you accept that walls are good things, the question is really whether you've put the wall in the right place. Should it be ten feet further over, or maybe rotated by thirty degrees? Does the caching code belong with the protocol handler or the renderer? If we put the retry logic in the controller, we save on the double round-trip for failures, but that means that the controller is making certain assumptions about the batching logic. People fight holy wars over this stuff, and few things can be more frustrating to a programmer than needing to hack on a codebase whose abstraction boundaries weren't set up cleanly: there's never a good way to do what you want and you wind up making small changes to a hundred places, rather than one medium change to one place.
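(Here's a toy sketch, in Python, of one of those placements: retry logic hidden on the far side of the transport boundary, so the controller never learns how failures are handled. All the names here are hypothetical.)

```python
import time

def with_retries(call, attempts=3, delay=0.1):
    """Wrap a flaky operation so callers never see transient failures."""
    def wrapped(*args, **kwargs):
        for attempt in range(attempts):
            try:
                return call(*args, **kwargs)
            except IOError:
                if attempt == attempts - 1:
                    raise          # give up only after the last attempt
                time.sleep(delay)
    return wrapped

# A made-up transport that fails twice before succeeding.
calls = {"n": 0}
def send_batch(batch):
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("transient network failure")
    return f"delivered {len(batch)} items"

send = with_retries(send_batch)
print(send(["a", "b", "c"]))   # controller-side code stays oblivious
```

Move the wall the other way -- retries up in the controller -- and you save the wasted round-trips, at the price of the controller knowing about batching.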

This sense of disquiet carries over from programming into life -- I get this sense in my gut every now and then looking at something from the news, wondering whether the particular problem or plight that sounds like the result of historical forces or some callous decisions is really just the well-disguised result of ill-established categories. I got thinking about this reading James Traub's New York Review of Books article about Sierra Leone, entitled "The Worst Place on Earth." Towards the end of his profoundly depressing article, he observes that one of the reasons for the collapse of the UN's peacekeeping efforts in so many places is that

For one thing, peacekeeping wasn't designed to stop warlords like Foday Sankoh -- or anyone else for that matter. It was designed to help carry out agreements among states . . . But countries don't go to war with one another as often as they used to. We live in an era of collapsing states: and now governments declare war on factions, often ethnic, as in Kosovo; or factions try to murder their way to power, as in Liberia and Sierra Leone; or in the absence of any state at all, warlords fight each other for supremacy, as in Somalia.

This, I think, is a really good point. The philosophical underpinnings of the UN are of agreements among autonomous nation-states, and in some sense it is possible to see the UN's failures exactly where these assumptions break down. During the Cold War, the UN generally had to step back from anywhere the US and USSR were closely involved: the intensity of the interests of the superpowers tended to make smaller domino-countries blur around the edges and lose their individual voices. A lot of the ills of modernization, I think, can be chalked up to the inadequacy of nation-centric theories of development in an economic environment dominated by multi-national corporations. And then there's Traub's point itself, that the UN is profoundly ill-equipped to deal with what seem to it like nations waging war on themselves. In fact, the bias towards nations accounts for one of the oddities of power-sharing arrangements and internal struggles for capitals: the power to speak with the institutional voice of a nation is an intangible prize but a valuable one. Diplomatic recognition, like code books and personal seals, is one of those interesting informational spoils of war. In programmer terms, the question is whether the nation interface is the right level of abstraction to deal with, well, whatever it deals with.

Here at home in our own anti-abstract nation, political parties have been making me scratch my head in the same way. Parties are granted some pretty strong privileges by the government: the whole primary-general system is designed around a system of parties, and a party's performance in one election determines a great deal about what resources the government will allocate to it in the next. What makes this highly unnatural is that the parties themselves are not the units about which the electoral process makes its decisions: we vote for candidates and the elected officials govern, but it's the parties who structure the selection process. Proportional parliamentary representation appeals to me for other reasons (which I won't get into here), but it also possesses an intellectual clarity our current system lacks, in that the structure of the process matches the options available to the voter. Or, in the other direction, why not eliminate the parties as formally-recognized units, and stop having our primary system cater to their interests? The bizarre gyrations the Reform "party" has gone through, to me, indicate the intellectual poverty of the whole party concept. The parties as voting blocs would probably still survive, and most elections would pit a Republican and a Democrat, but the face of soft money would be much changed, and some of the distorting effects of our de facto two-party system would be ameliorated. Again, whether or not you think the barrier should be moved, the point is that we have put up a barrier here and that this choice of place must be understood to be a somewhat arbitrary one.

Another fought-over boundary is the division of corporate boundaries along a vertical supply chain. I've seen a whole bunch of articles about e-commerce that dwell on the nature of the e-tailer and its relationship to the physical fulfillment process. The e-business is free of the messy business of actually stocking and shipping items that plagues real-tailers! The e-business can only be built upon the massive distribution infrastructure built up by UPS and FedEx and a nationwide supply of warehousing know-how! The e-business must exercise precise control over its fulfillment and massage the process to custom-match its customers' demands! The e-business should be a true virtual business and should outsource everything that weighs more than an electron! People talk about these questions as though they were deep and fundamental issues affecting an e-business, but this is the wrong frame: these are issues that pertain to an ecology of companies, not to any company alone. When I go to a web site and order something and it shows up at my doorstep, certain things need to take place. There are computer-related tasks: the interface presented to me, the billing and communication with me. There are industrial tasks: someone actually needs to manufacture the World's Best Salad Spinner. And there are distribution tasks: someone needs to bring it to my door and await my signature. The division of these tasks among corporations, from my perspective as a consumer, is entirely irrelevant. Decisions can be made differently, with different consequences, but I don't think that any given model is necessarily right or wrong. If an e-business controls its warehousing, then it has the ability to respond to issues on a small scale and to optimize its process for the particular patterns of its business; if it leaves the distribution to someone else, it sloughs off a lot of headache and cuts its operating capital requirements. But to the extent that the system has to stock and deliver items and to take orders, the abstraction boundary embodied by the extent of a corporation is an unnatural imposition, an artifact of something not intrinsic to the problem. Talk about "core competencies" all you want; there may not be much of a difference between two companies and two divisions of the same company when it comes to how they do at satisfying their customers. Where things get ugly is in how they deal with their competitors -- but again, the barrier between two competing companies is another one of those abstraction boundaries that might not, perhaps, be sited in the "logical" place.

In that last argument there's one other scientifically-inspired habit of thought peeking out: the conservation argument implicit in the switch from talking about companies-that-do-things to things-that-get-done-by-companies. The point was that the set of tasks was the same, whether we grouped them by what sort of tasks they were or by what companies carried them out. To a mathematician, it was a rearranging of the terms in a sum; to a physicist, an application of Gauss' Law; an algorithmist might recognize the insight behind amortized analysis. To the programmer, this back-and-forth flip is second nature -- every day we need to switch from thinking about components and their interactions to thinking about the flow of a computation as it jumps from one component to another. Source files and class definitions reveal the former mode of thought; program traces and debugger sessions exhibit the latter.
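(The rearranging-the-sum point fits in a few lines of Python, with made-up tasks and costs: group the same work by company or by kind, and the total is conserved either way.)

```python
from collections import defaultdict

# Hypothetical tasks: (company doing the work, kind of work, cost).
tasks = [
    ("e-tailer", "computer", 3), ("e-tailer", "distribution", 1),
    ("factory", "industrial", 5), ("courier", "distribution", 4),
]

by_company, by_kind = defaultdict(int), defaultdict(int)
for company, kind, cost in tasks:
    by_company[company] += cost   # grouping one way...
    by_kind[kind] += cost         # ...and the other

# Rearranging the terms of a sum: the total is the same either way.
assert sum(by_company.values()) == sum(by_kind.values()) == 13
```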

There. I knew there had to be a reason why I got into this business.

The People Passing By


At the Bite, we were sitting on the edge of the fountain, chowing down, when a pair of local youths approached us. One was holding out a Frisbee with some coins collected in it. The other had her hands pulled inside her sweatshirt sleeves and was holding on to a fork with each. Her accomplice explained that she'd lost her hands and that if we could spare even one single cent to help her obtain prostheses, our help would be most appreciated. As she was bewailing the pathos of lacking opposable thumbs, I saw my opportunity. I volunteered the skewer from the chicken I'd just eaten, which was the same thickness as the tines on the forks. She went dancing around shouting about her new thumb, while I fished out a penny and dropped it in the Frisbee. And they went on their way.

Then, after the Mah Jong show, I was walking down the Avenue back to my car when I passed in the opposite direction to a bunch of people carrying what looked like flyers, giving one to each person they met. When they passed me, I dutifully took one, and noticed that it was oddly creased. It turned out not to be the ballot initiative literature or the Christian proselytism I expected -- instead, it was the transcript of a game of paper telephone. You know: first person writes a sentence, second person sees only the sentence and draws a picture of it, third person sees only the second person's picture and writes a sentence describing it, and so on, alternating words and pictures around the circle. This particular one managed to get from "When I was a kid, my momma wouldn't buy me shoes. I had to wear cracker boxes on my feet." to "The Olympics brought out the amateurs and professionals alike. Shari just used her hips to sway with the crowd's shouts of joy and admiration."

I love this town.

Clapping and Stamping


Took in the Mah Jong show this weekend at the Rain Dancer, on the advice of Jon from the office, who was raving about them on Friday. It was a truly inspiring event. Mah Jong self-describes its music as "Totally hilarious all original funk/disco dance music," which isn't far off the mark. They play super-high-intensity funk with a very energetic stage presence and a decidedly offbeat lyrical twist. As a band, they basically have two settings: "high-energy" and "higher." This is a band whose two frontmen have to use headset mikes because they bounce around so much -- and this is on a stage maybe ten, twelve feet wide.

Mah Jong consists of three white guys, two of whom are grad students in philosophy at UW, and Yoshi the guitarist. Yoshi doesn't say anything, he just stands there and plays incendiary funk guitar and grins occasionally. The bassist and keyboardist make up for Yoshi's reserve with their full-steam-ahead assaults on Mah Jong's crowd-pleasing standards, songs like "The Heimlich Maneuver" (complete with lyrics and dance moves that provide a demonstration), "1040ez" (yes, about the tax form), and "This Cavern is Very Exciting" (imagine Ren's cousin Sven playing disco funk, if you can).

They didn't play "Gubmint Cheese," their anthem about US dairy subsidies, but I can't resist quoting it here:

Comes from the food bank in big long block
Clogs your arteries makes your heart stop
I don't know whether this is real cheese or not
No expiration date anywhere on this box

Bite of Seattle


It bit. It was like going to an enormous food court, but where each of the individual restaurants has a much smaller selection. I have much better memories of the experience from the last time I went, four years ago. I guess in the intervening years I've passed some kind of inflection point on the curve of new food experiences, so that now I actually know what to expect from most of the sorts of places that showed up to the Bite, and the thrill of discovering genuinely new kinds of food has gone away as my worldliness in the ways of eating has grown. The eight-dollar "critic's choice alley" was mostly seafood, so I gave it a pass, and wound up having a fairly awful blue cheese salad, some just barely okay chicken skewers, some decent garlic fries, and a disappointing chocolate fondue. The only really good thing was a Hawaiian shaved ice, which was basically a Sno Cone with higher-quality ingredients. On the positive side, the company was good, and the day was cloudy so the crowds were bearable.

Try This at Home


Steve taught me a great game, one that requires, as he put it, "being in a silly mood." You take a Trivial Pursuit deck and pull out two cards, holding them together as though they were one. Then you look through the question-and-answer pairs this combination of mismatched sides creates, and you discover some howlingly funny "questions." It works, he observed, because the categories match up, so that the answers are usually the right sort of thing, but often hilariously wrong. What two languages are the official languages of Vatican City? Why, India's, of course. What does the number of protons in a nucleus determine? Wind! And what talk show host is famous for his red suspenders? Clint Eastwood.

Songs of Innocence and Songs of Experience


With age, there come certain disappointments, and high among them is failed nostalgia. It's a real letdown to be reminded of something you deeply loved as a kid, only to discover that not only can you no longer abide it, but you can't really understand what you ever saw in it. The particular feeling of happiness associated with it has become inaccessible. I had this experience today with the Johnny Horton song "The Battle of New Orleans." Let me tell you, I used to love that song, especially the part about the gator filled with cannonballs. And now, nothing. Horton has a bad voice and doesn't sing very well, the tune is actually a bit annoying, and the lyrics are nowhere near as catchy as I remember them. Sigh.

Say it with me: ewwwww


Sometimes I get obsessed by particular gross things I could do to myself, but fortunately don't. All today, I kept on thinking about putting a quarter in my mouth and flipping it over and then eventually choking on it. Don't ask me why. It was just there, poking around in my head and fluffing the pillows. Last month, it was a similar deal with a dime and my nose. Let me state, for the record, that I have utterly no intention of sticking coinage in any of my orifices. Perhaps it's the revulsion I feel that makes me dwell on these ideas. Back in elementary school, I went through a phase when I was afraid to color with crayons. Why? Because they had bright colors, which reminded me of fruit, but if I tried to actually eat the crayon, it would be disgusting. So every time I looked at a crayon, I'd get this waxy taste in my mouth. This actually escalated to the point of being a serious hassle, and one of my teachers had to do the whole authority figure thing to force me to start coloring with crayons again -- I was trying to do absolutely everything in pencil (in fact, with the same oversize pencil, which I had found in a desk at some point earlier in the year and become strangely attached to). Yeah. Weird.

Fools’ Gold


The 17 July issue of The New Yorker contains an article by James Collins on a class of investors he calls "gold bugs," who hold that, in the final analysis, "gold is money, and only gold is money." Everything else is either a speculative bubble not properly built on a gold foundation, or the wonderful results of a financial system properly backed up by good ole gold. Gold is the only investment that will hold its value when the Big Crash comes; the accumulate-and-hold approach to investing in gold will bring better and more guaranteed long-term profits than any other possible investing strategy; the world would be a lot better off if all of this funny business were properly tied back to the transfer of gold it really represents.

In a word, these people are nuts. Gold is no more the "real" money than anything else we use as currency, and in some ways, it's less so. Thomas More summed up the anti-gold case pretty well in his Utopia:

It is certain that all things appear incredible to us, in proportion as they differ from our own customs. But one who can judge aright will not wonder to find that, since their constitution differs so much from ours, their value of gold and silver should be measured by a very different standard; for since they have no use for money among themselves, but keep it as a provision against events which seldom happen, and between which there are generally long intervening intervals, they value it no farther than it deserves, that is, in proportion to its use. So that it is plain they must prefer iron either to gold or silver; for men can no more live without iron than without fire or water, but nature has marked out no use for the other metals, so essential as not easily to be dispensed with. The folly of men has enhanced the value of gold and silver, because of their scarcity. Whereas, on the contrary, it is their opinion that nature, as an indulgent parent, has freely given us all the best things in great abundance, such as water and earth, but has laid up and hid from us the things that are vain and useless.

Aside from a couple of quibbles (man's inability to live without iron is literally true only in a rather narrow technical sense, having to do with hemoglobin), More basically lays it on the line. I'd like to pull out some of these themes a little further to make clearer what he calls "the folly of men."

First off, the claim that gold, existing in and for itself, has any sort of intrinsic value, is complete hokum. Value is a human concept; it's only meaningful in relation to some person and their wants and needs. If intelligent life disappeared off the face of the earth tomorrow, the contents of Fort Knox would have no meaningful value, high or low. It takes a person, a person who eats and drinks and likes to sleep out of the rain and who likes pretty objects, in order to say that food or drink or iron or gold has some value. These values may vary -- you value food less after a meal, and your artistic tastes change with time, and so on -- but the point is that value derives from the valuer, from your decision that you'd rather take a bath right now than go to a movie, that you'd rather have a hammer than a nail. Even the fallback argument, that gold is intrinsically valuable to people, is highly fishy. That's the Midas story: you can't eat gold, you can't build a house out of it, you can say all you want to it but it won't say much back. It's shiny and can be worked into nice shapes, and that's about it. I just don't see an absolute case here.

As for the argument that gold is the only true money, well, that stems from a fundamental misunderstanding of the nature of money. Monetary value -- the kind of value reflected in the statement that "only money backed by gold is worth anything" -- is socially constructed, and there's nothing to show that society will necessarily construe gold to have monetary value, and nothing else. Money, as Homer Simpson says, can be exchanged for goods and services. That's what money is, that's all that money is, and gold enters into that definition nowhere.

Let's perform a little thought experiment. You and I, let us say, have met on the street, each of us carrying a sack of stuff. We open our sacks and display our goods, and being covetous of our neighbors' possessions, we decide we'd like to engage in a little friendly exchange. I want your teapot in the shape of Moe Howard's head, and you want my Star Trek commemorative plates. You're all set to make the trade, but I don't want the teapot as much as I want the plates, so you offer to throw in a Chewbacca tie, at which point both of us agree we'd be better off making the trade than we are at the moment, so we shake on it, hand over the swag, close up our sacks, and go on our way. We've just gone through a perfectly acceptable barter transaction, built upon our respective values. Just to check, nope, no gold.

Where money comes into the picture is as a bookkeeping device, a way of simplifying the discussions we take part in to reach our final deal before we head along our respective paths. The process of fitting together packages of goods into mutually-acceptable trades is a complicated one, combinatorially ugly, requiring us to weigh in our minds the values of strange and arbitrary sets of stuff: would I rather have the teapot and the tie and a bag of jelly beans, or the plates and the statue of Leonardo da Vinci made entirely out of elbow macaroni and the laser-powered elephant repeller? But if I introduce into my head some sort of abstract count, say that the teapot is "worth" eight and the tie "worth" two and the plates are "worth" nine, then (assuming independent valuations), I can factor my end of the bartering into a simple addition problem, and I'm willing to go ahead with any trade that gives me more of these abstract units than I give up. If you make a similar calculation, we don't need to be trying to separately compare every possible trade we could make. We can make partial trades -- the teapot for the plates, but I feel that I'm a bit behind on the trade, and you're a bit more ahead than you need to be to be happy -- and then separately make other trades to settle accounts -- that tie is worth less to you than your excess of these abstract units, but it would make up for my shortfall. Again, gold has nothing to do with it.

To the extent that we can compare our counts, we might as well bargain out loud using them. I give you the plates for nine credits, then you give me the teapot for eight and the tie for one, and we close up our sacks and go home happy. To an outside observer, the individual trades don't make any sense -- why did I just hand you the plates? -- but when we part ways, we've done exactly what we would have done under a barter system and the trade makes perfect sense. These "credits" are a fiction the two of us share for the duration of our trading session. They're meaningless outside this context, which is why they satisfy this seemingly bizarre conservation law in which the total net quantity that changes hands is zero. Whoever has a surplus needs to trade them in for goods and services before we part.
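(The whole session fits in a few lines of Python, using the numbers from the story; my valuations are the only inputs, and the credits net out to zero.)

```python
# My private valuations, in abstract units (the numbers from the story).
my_value = {"teapot": 8, "tie": 2, "plates": 9}

def acceptable(get, give):
    """Under independent valuations, I take any trade that nets me
    more abstract units than it costs me."""
    return sum(my_value[g] for g in get) > sum(my_value[g] for g in give)

print(acceptable(get=["teapot"], give=["plates"]))         # False: 8 < 9
print(acceptable(get=["teapot", "tie"], give=["plates"]))  # True: 10 > 9

# Settled in credits: plates go for 9, teapot for 8, tie for 1.
# Net credits changing hands: the conservation law at work.
assert 9 - 8 - 1 == 0
```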

We could, perhaps, agree to use some sort of convenient marker to stand in for these abstract units, carved pieces of wood, say. This has the immediate advantage of concretizing our trades and making for easy visualization of what needs to happen in order for us to reach an overall agreement. It also has the more long-term advantage that we could extend our "session" beyond this encounter. I could hang on to those extra pieces of wood, and give them to you for the fish you just caught, the next time we meet. Or, if we got someone else to go along with this convention, maybe I could give them to someone else for his fish, and he might give them to you for a Three Stooges poster. Again, the individual trades look insane, but the whole follows a perfectly sensible pattern. The tokens are worthless in and of themselves, and are valuable only as they participate in a closed system of exchange, going around in circles opposite to the ones the goods are going around in. The tokens stand for abstractions, and those abstractions in turn are meaningful only as potentiality, the potentiality of being exchanged for things that really have value to you and me and that other guy behind the tree. This is the point at which the gold bugs make their key mistake, I think. Conflating the monetary value of gold-as-token and the (lesser) "intrinsic," decorative, value of gold, they assert that the abstractions of currency derive their value from their representational relationship to gold, the backer of that currency, a leap which is wholly wrong. The thinking is that on that day when everyone "cashes out" of the imaginary system and converts back into the concrete tokens, those who are left holding the gold have the value. And sure, they do, to the extent that they can make pretty rings and chains, but the monetary value of the gold is entirely gone, it disappeared in the moment when people stopped crediting the monetary abstractions.

It's all a huge shell game. This fact makes many people deeply uncomfortable, but their discomfort does not make it any less true. The whole system of exchange works because we expect that it will work. I'll accept your carved pieces of wood today because I expect that tomorrow someone else will accept them -- my expectation that the system will work tomorrow is what leads me to do my part to make it work today. On your end, that I accepted them from you today makes you more likely to take someone else's tomorrow -- your current experience with the system working makes you more likely to perpetuate it in the future. It's terrifyingly circular, especially when you realize that this is exactly the same mechanism that pyramid schemes and all those other illegitimate investment vehicles rely on. My reaction to reading Irrational Exuberance was to realize that the question is not whether the stock market is overvalued, but whether the stock market can be said to have any meaningful objective value. Money and investment literally are confidence games -- collective enterprises held up only by people's confidence in them. What we deem to have value has value, and this deeming need not be causally correlated with anything that would be "valuable" in the absence of this collective deeming. It's the human investment that matters.

For an example, consider Sony's decision to ban eBay auctions of items from their massively multiplayer online game, EverQuest. People were playing EverQuest, acquiring virtual items of some worth within the game world, selling them to the highest bidder on eBay, and then arranging to meet up in the online world and hand over the items. Leaving aside the philosophical and aesthetic issues of these crossings to and from cyberworlds (a topic which fascinates me for other reasons, but this rant isn't the place), people replicated in this online economy all sorts of interesting features of the real one: I think that the "currency" of EverQuest gold pieces passes every single test for a "real" currency. It's built upon people's collective senses of personal valuations, people are willing to exchange it for things they value more directly (magic items which enhance their game-playing experience), and the system of value it establishes is held up by people's confidence that the EverQuest economy will keep on churning.

Some enterprising folks found the "border" to this world and set up currency exchanges, ones which act just like real ones. The EverQuest money stays in the EverQuest world, and the US dollars stay in the meatworld -- each a happily-functioning economy with its own circular flows -- but the exchanges enable a broader joint economy, in which the money I got from my job (for my labor) goes to an enterprising young fellow who plays too much EverQuest for a cloak of protection he bought for six thousand virtual gold pieces which he acquired from spending lots of hours playing the game, and he then uses my money to order in a pizza. It's a kind of ultimate alienation of labor from the products of that labor, which in this case don't even exist, and I'd love to see what the Marxists make of this one, but more relevantly, given the many strange financial instruments and e-cash schemes and electronic scrip and online banks and so forth kicking around these days, I challenge you to say that those six thousand virtual gold pieces are any less real than the money which gets direct-deposited to my checking account and then electronically transferred to my credit card to pay off my order at an online merchant. The EverQuest world isn't as sophisticated, nor is the economy quite as reliable, and the trading to and from its currencies is trickier (especially now that it constitutes a black market of sorts), and the confidence holding it up is less rock-solid, but these are distinctions of degree, not of kind.

Or, consider the devaluation of the NASDAQ this spring, in which many billions of dollars just vanished, as though into thin air. As the stock prices tumbled, the collective valuation of the NASDAQ, as measured in the total number of shares outstanding times the marginal exchange rates between those shares and US dollars, plummeted. Again, people are uncomfortable that this money just "disappeared," and some of this discomfort spills over to an existential uneasiness about stock and other "paper" wealth, which seems to come and go with no accounting for itself. I think that the key observation here is that the total "value" of the NASDAQ was, and still is, and always will be, far more than the amount of money that could ever possibly be withdrawn from it. You simply couldn't sell all the shares; in fact, you can't remove more than a very small fraction of them before the price, and with it the "value" of the remaining ones, starts to drop. The value is accumulated in the system, in its liquidity, in people's willingness to believe that others will be buying and selling NASDAQ stocks tomorrow, and the day after. When you and I close up our sacks to go home -- never to meet again -- the "value" of our carved pieces of wood evaporates, and we become that much "poorer." It's the same reason that we don't really become any richer if each of us carves more sticks, or why governments can't just print all the money they need to buy absolutely everything they want: it's the exchange-value we care about, and the system is going to tolerate only so much abuse around the fringes before the value this abuse cannibalizes just disappears. What happened to the NASDAQ was a certain decrease in the overall confidence of its investors, a retreat in their expectations of what kinds of exchanges they'd be offered in the future in exchange for their abstractions. And this decrease in expectations alone causes a present decline in prices, which undoubtedly fuels even more of the decrease in expectations. There really is no way around these positive-feedback systems; they're intrinsic to any financial system, the sword by which it lives and therefore also dies.
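(That feedback loop is easy to caricature in a few lines of Python -- a toy model with invented parameters, not a claim about actual market dynamics: an initial dent in expectations lowers prices, and the decline itself erodes expectations further.)

```python
# Toy positive-feedback loop: price tracks confidence, and falling
# prices scare people, eroding confidence in turn. All numbers invented.
confidence, price = 1.0, 100.0
confidence *= 0.9     # an initial 10% dent in expectations
feedback = 0.5        # how strongly a price drop feeds back into confidence

for day in range(5):
    new_price = 100.0 * confidence          # price just reflects confidence
    drop = (price - new_price) / price      # today's fractional decline
    confidence *= 1 - feedback * drop       # the decline itself spooks people
    price = new_price
    print(f"day {day}: price {price:.1f}, confidence {confidence:.3f}")
```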

The point is that any system of money is held up by this trust, indeed depends upon it far more than it depends on any "objective" criteria for its stability. Those objective criteria are important only insofar as they get the system rolling, provide it with a useful initial critical mass. This is how gold got its start. It was shiny, it was pretty, it was wholly superfluous and therefore something of a symbol of luxury, and its supplies were sufficiently restricted -- no matter how much anyone tried to alchemize some up -- that it was difficult to inject more gold into the system. The intrinsic decorative value gave it a bit of a foundation even when trust relations collapsed, and the scarcity enforced the closed-system assumption necessary to produce trust in gold as a record-keeping token of the monetary abstraction. But it's ridiculous to hold gold responsible for all the early success of money; something else would have taken its place -- and often did -- if gold wasn't up to the task, and it was the confidence relationships which made the system work, not the shiny yellow metal.

The gold standard, even at its prime, was a non-issue, a debate over the tightness of fiscal policy shoehorned into a slightly inappropriate form. Gold backing was a tool of confidence, a way of spreading reassurance about the worth of money -- but the reason it worked was not that the gold backing itself was valuable, but because it was a public tying-of-hands. With fixed quantities of gold kicking around, a gold-backed currency was subject only to certain limits of manipulation, a bank backed by gold deposits could engage in only so much funny business. You could, in theory, demand a complete reckoning at any point in time. To actually do so would destroy the system, but the possibility of the chopping block and subsequent cavity search, one might say, keeps the goose laying those golden eggs.

But what about the mythical dimensions of gold, the accumulated legends and cachet surrounding it? Well, on one level, they're just another positive-feedback confidence game: gold is valuable because it is fetishized by so many, and it is fetishized because of its absurdly disproportionate financial value -- except that, as I've been arguing, these "disproportionate" values are anything but, and that if we're going to take all these other currencies at their obverse value, we might as well take gold seriously, also. Gold has very effective PR agents, true, but given this whole argument, does this mere fact invalidate their claims? Yes, and no.

There is a very subtle distinction to be made here. Gold is not fundamentally different from any other financial instrument, its worth is no more or less socially constructed. But the claims made for gold are different, because gold is claimed to be unique among commodities and currencies: gold is the One True Commodity, the Currency of Currencies, the Monetary Messiah, a claim which I argue is demonstrably false. The value of gold, though, is held up by this rhetoric in all its falsehood. The confidence scheme underlying gold is more cowardly, more ashamed of its own nature, than other similar schemes; one runs into internal contradictions sooner if one tries to work within the self-referential system of the gold bugs than if one casts in one's lot with the T-bill or the blue-chips. To formal logicians, the distinction is exactly that between Rosser's theorem and Löb's: currencies and logical formulae founded on the truth of their own success have slightly more going for them than ones which predicate the falsity of their logical foundations. I don't have anything against gold for its own sake, but I think auric exceptionalism fails just as badly as American exceptionalism, and for much the same reason: things are more similar than people like to admit, and the absurd is always closer to hand. Better to deal with the elephant in the living room than to steadfastly insist that elephants don't live on this continent.

A bit of a postscript. I realized while writing this what it was that I think ultimately kept Cryptonomicon from being the novel it could have been (or perhaps symbolized the novel it was): Neal Stephenson's adolescent fascination with the gold bug position spills over into every part of the novel and keeps it from ever really rising above the adolescent. As every character gets caught up in the wild chase after an obscene quantity of gold, and as the scenes of high adventure involved in this chase grow increasingly over-the-top, the novel feels more and more like something out of a book of adventure stories I read as a teenager, which had all the classics, like "The Most Dangerous Game" and "Leiningen Versus the Ants," except that it's two orders of magnitude longer and written for geeky teenage boys instead of just teenage boys. Something about the gold cachet is wired straight to the Treasure Island instinct, and in the same way that the obsession with the gold bug theory of monetary value undercuts the intellectual force of Stephenson's thoughts on data havens and cryptography, the obsession with gold itself undermines any maturity he tries to build up in the pacing of the novel's events. Everything devolves into an almost stereotypically masculine gee-whiz rush for the gold, complete with cartoonish violence, and Cryptonomicon winds up being just a ripping good long yarn, instead of the great breakthrough geek novel it might have been.

Massive Cussing


Lost two hours' worth of work on that extended rant about gold due to an unexplained system crash and some sort of brain-dead caching scheme that hadn't actually propagated my changes out to disk, no matter how many times I'd hit "save." I should be able to reconstruct it tomorrow, but man, this is demoralizing.
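(For what it's worth, the usual programmatic defense against exactly this failure is to flush and fsync on every save; a minimal Python sketch, with a made-up filename:)

```python
import os

def durable_save(path, text):
    """Write, flush the application's buffers, then ask the kernel to
    push its own write cache out to the disk itself."""
    with open(path, "w") as f:
        f.write(text)
        f.flush()             # empty Python's buffer into the OS
        os.fsync(f.fileno())  # force the OS to actually hit the platter

durable_save("rant-on-gold.txt", "It's all a huge shell game...")
```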

Site Tweaks


Added a Geegaw link to the navbar. Moved a couple of files that used to be served up externally (both the HTML and Word 95 versions of "That Miracles are Ceased," along with all the PostScript-formatted documents). Considered revamping the color scheme, decided against. Added a minor little treat for all you IE users out there. Was going to write about gold, but then I wound up rereading a bunch of the second Harry Potter novel, and then I had to go and compare scenes with related scenes from the fourth, and whoa, did the time ever slip away from me.

Escape Hatch


In a stunningly under-reported turn of events this past week (the only good account I've been able to find was this story), Orrin Hatch, senator from Utah and musician, decided to threaten the record labels with rewriting US copyright law to explicitly lay out generous definitions of "fair use" for digital media content. Hatch, the principal architect of the Digital Millennium Copyright Act, castigated the labels for failing to make their music easily available online. If you haven't responded to the DMCA carrot, he basically said, perhaps you'll respond to the stick of a "clarification" which enshrines in law all sorts of practices -- making a copy of your CD to listen to in the car, trading tapes with your friends, ripping your CD at home and listening to the MP3 file at work -- that you've been wailing and moaning about forever.

Vermont's Patrick Leahy put the argument in terms of the political pressure 20 million Napster users could bring to bear, suggesting that if their Napster access were cut off and no reasonable alternative provided, the Senate would be dragged kicking and screaming into the Internet age by the groundswell of protest email. I think Leahy's wrong on this one -- if you asked me to name a more politically inactive constituency than that of typical Napster users, I'd have to go either with the severely impoverished or those too young to vote. My sense is that Hatch isn't particularly responding to grassroots political pressure -- the free-music camp just doesn't have the kind of muscle that turns gears in Washington, nor has there been the kind of crazed all-over-the-place faddism that generally otherwise inspires Congress to act, even when there isn't money per se at stake. I think Hatch is going personal on this one, both as a small-time musician and as a legislator who feels betrayed on the deal implicit in the DMCA.

And though I'm not really a fan of the recent American trend in which every abstract issue must be reduced to the most crassly individual terms and every act rooted in purely personal considerations, principles be damned (cf. The Patriot as a case in point), in this instance I have to say I feel some admiration for Hatch, who in ordinary circumstances is one of my least favorite politicians (going back to his shameful behavior during the Clarence Thomas confirmation hearings). His personal involvement has, I think, led him to one of those wonderfully simple anti-Jephthaic realizations: if you don't like the way the law is making things evolve, well then, just change the law and fix things. A lot of discussion of the whole Napster situation has focused on social decisions about Napster itself and tried to calculate the respective consequences of deciding to encourage it or to prohibit it. But this is going about things exactly backwards: we should be asking what sort of a media regime we want, and then going about determining which decisions will bring us closest to it and which will leave us furthest from it. What better place to start, in fact, than with the fair use clauses, whose notorious ambiguity, almost everyone agrees, has been the source of so much trouble? The greatness of Hatch's suggestion is that he's treating fair use as a means towards social ends rather than as an intrinsic good with a particular worth that must either be saved or sacrificed. In the world of books, fair use by itself is valueless: what is valuable is the culture of reviews and exchange and cascading scholarship and increased attention paid to books that fair use enables, and a similar goal-directed approach has something to offer to digital music.

Hatch's actions also put in perspective the real interests of the record labels. On the one hand, as much as the DMCA was supposed to get them to open up their vaults to the Internet, you do have to feel a little sorry for all the low-level executives who looked at the Internet and saw, quite rightly, that the instant the catalog hit the web would be the last instant the label would ever have genuine control over that catalog again. They figured that they wouldn't be fired for stalling the process while they desperately tried to overcome what Larry Lessig calls the "architectural" constraints of the 'Net, whereas they might very well be fired if -- as seemed quite possible -- digital music distribution turned out to be the pirates' haven it reeked of. The law -- the DMCA -- did what it could to help them along, but the technology wasn't really there. And now, what a horrible surprise: right when they were planning on going back to Capitol Hill to ask for a little more help from the law in closing those ugly truck-sized loopholes that Napsterian and Gnutellan programs create, they find out that Washington is looking to them to make the next step. Such a simple mistake to make: they thought that Washington was promising to back up their solution to digital copyright, and that the DMCA was the first fruit of that commitment. And Washington was promising to back up their solution, but by that "solution," Washington had something more specific in mind: the DMCA itself. Come on in, Hatch has been saying, the water's fine and I won't let you drown, but I will start shooting at you if you stand around on shore much longer.

The DMCA, you see, is written to back up systems of content distribution and control with the force of the law, no matter how flimsy those systems are technically. The movie industry has taken the leap of faith: they went ahead and used CSS to assert "control" over DVD distribution, and now they're asking Washington (via the courts) to make good on its end of the bargain. Their situation is, admittedly, somewhat easier, just because high-quality movies are so much more gargantuan than medium-quality music, so the consequences of a cracked system are somewhat less catastrophic. But the record labels, I think, see very clearly a future in which the easy availability of ripped content forces them to be a value-add in some way other than the mere providing of otherwise unavailable content. Better indexing, more exhaustive backlists, higher-reliability servers, interoperability with portable devices -- all of these have possibility, but they're all also much closer-to-the-edge games to play, ones with smaller margins for error and less to do with the music itself -- and also, I think, less room to play hitmaker and turn a 500,000-unit album into a million-unit one. Which is why Hatch's threat may be very close to what Doctor Love ordered. For smaller bands willing to forgo some of the network effects and (highly fickle) publicity machines of the majors, the barriers to entry keep falling: once you get past the production phase (which technology has other ways of helping with), there's less and less you actually need a record label for.

I'm talking off the top of my head here, with some of these speculations. But something important is going on here, so if you commit only one more act of websurfing today, surf on over to the article and think about what the consequences might be.

Concert Report


Friday saw Travis playing a gig at the Showbox, and saw me seeing said gig. Must be something about the Showbox's acoustics or the sound setup, but I was pretty impressed at the rumble-to-noise ratio. The bass -- especially during the set of opening act Leona Naess -- was a physical force, and the steady beat of the low drums flowed through you like living mechanical waves crashing against the shore. There's something really great about being bathed in music like that: it's hard to describe the feeling exactly, but I experienced something akin to what I imagine goes on inside an ultrasound washer: the sound shakes the sediment loose from the walls of your veins, and you can feel tension and glum thoughts just flowing out of you with a bit of a tingling chill. I was just standing there, smiling and nodding and tapping one foot and not generally looking very into the concert, but I assure you I was loving it.

But, as I was saying, what made the experience so great was that the aural portion of the evening's entertainment was carried out at quite reasonable levels. At least where I was standing (on the main floor, middle of the crowd, off towards one side, which can be a death spot in terms of being overwhelmed by the bank of speakers pointed directly at you), the pain-in-one's-ears issue was a non-issue, and pretty much the entire evening went by without anyone maxing out their amplification, which made me very happy. Clipping is a pet peeve of mine: whatever kind of statement you want to be making at that many decibels is your business, but once you start really seriously clipping, your statement becomes increasingly indistinguishable from anyone else's. There's a very good reason not to push your electronics quite to the limit: you start sounding like shite when you do.
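
To see why in miniature: a hard clipper just pins every sample it can't reproduce at the rails, so two quite different signals, driven far enough past the limit, come out looking nearly identical -- flat tops, flat bottoms, and only the zero crossings left to tell them apart. A toy sketch in Python, making no claims about the Showbox's actual signal chain:

    import math

    def hard_clip(samples, limit=1.0):
        # Anything beyond what the amplifier can swing gets pinned at the rails.
        return [max(-limit, min(limit, s)) for s in samples]

    t = [i / 200.0 for i in range(200)]
    pure = [3.0 * math.sin(2 * math.pi * x) for x in t]
    rich = [3.0 * math.sin(2 * math.pi * x) + 1.5 * math.sin(6 * math.pi * x)
            for x in t]

    # Driven at three times the limit, both spend most of their time
    # glued to +/- 1.0 -- near-square waves, whatever they started as.
    print(sum(1 for s in hard_clip(pure) if abs(s) == 1.0))
    print(sum(1 for s in hard_clip(rich) if abs(s) == 1.0))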

Travis themselves have a decent stage presence. Their frontman, Fran Healy, delivered a really discombobulated "You're the greatest!" tribute to Seattle, mentioning Nirvana and Frasier, before tearing off into a pitch-perfect cover of the Frasier theme song. He also spent the breaks between most songs taking towels offered him by stagehands and wearing them over his face. They're pretty good live; no major departures from the album versions of their songs, but they made me like a bunch of songs I hadn't been so keen on before. "Slide Show," in particular, really soars live. Healy had the crowd sing one of the verses of "Why Does it Always Rain on Me?", a task the crowd was just barely up to. For encores, they did covers of ". . . Baby One More Time" and "The Weight." The former still isn't a song I like, but I at least believe them when they say the cover is unironic; the latter is a great song, and Travis is just about the ideal band to cover it. It rocked.

It’s not Supposed to Make Sense: That’s Why They Call it Law


Dave relayed to me a story a couple days ago, a story originally told to him with the moral "why engineers hate law school." The professor comes into class, passes out the day's case, and takes the class through a long and detailed exegesis, putting together fact after fact and precedent after precedent to demonstrate why a particular piece of case law applies to another class of cases. The class goes home exhausted but enthused, marvelling at how straightforward and clean the legal process can be, starting to understand how the law proceeds, reasoning carefully through each step of a question and resolving it with the benefit of the evidence. The next day, the professor comes into class, passes out the exact same case, and proceeds to repeat the exercise, putting together a second ironclad argument to show that the piece of case law is entirely inapplicable. The point is that although engineers are seduced by the logical reasoning and apparent rigor of the legal process, they aren't so well-prepared to deal with the law's contradictions and its willingness to decide questions in demolition-derby fashion, seeing which of two arguments holds together better after repeated high-speed collisions. If you go into law expecting each datum to support one side or the other and looking for consistency, you're being set up for a massive disappointment.

I mention this in the context of the recent $145 billion verdict against the tobacco companies. As noted in most coverage of the trial, the long-time defense of the companies has recently started to fall apart. That defense took the following form: smoking isn't harmful and there's no evidence linking smoking to cancer and other diseases, so we're certainly not engaged in knowingly selling dangerous products. Besides which, smoking is harmful and everyone has known this forever, so people who smoke do so in full awareness of the health consequences. We didn't do anything to you, and besides, it's your own fault. Self-contradictory, sure, but let's look at the counter-claim: smoking is incredibly harmful and you've known this for years and deliberately been selling a product that kills its users. But smoking hasn't really been known to be harmful, and we certainly had no idea it was so dangerous when we started smoking or we'd never have taken it up. But now it's too late, and you owe us lots of money, since nobody knew what everybody knew.

To recap, public knowledge that smoking kills supports the tobacco companies (because people who smoke know the risks) and it supports the plaintiffs (because the tobacco companies know the risks of the products they sell). Further, public ignorance of these risks supports the tobacco companies (how were they to know cigarettes were so dangerous) and it supports the plaintiffs (how were they to know cigarettes were so dangerous). Part of this whole rhetorical mess is just that people have been playing both sides of the Jesuitical street for years and are now reaping what they have sown -- it's hard for the tobacco companies to run from years of publicly denying the health risks of smoking, but decades of Surgeon General's warnings are an equally inconvenient piece of history for government lawsuits against the tobacco industry. It can be very hard to back away from your past arguments -- even if you don't subscribe to them any more -- just because whatever case law you've built up on your side depends on those arguments, and trying to switch horses in midstream is a legally dicey strategy.

More to the point, we are holding the law to alien standards if we ask that each piece of data weigh in for one side and one side only. Each fact, each relevant piece of evidence, supports certain lines of reasoning and argues against other lines, emphasizes some factors and mitigates others, and there is no particular reason that the arguments it supports should belong exclusively to one side of a case. Cases wind up looking riotously contradictory because of the selective filtering each side's briefs perform. Within a brief, it's quite reasonable to put arguments in dilemma form: if the evidence says one thing, we are in the right because of so-and-so, but if the evidence instead says this other thing, we are still in the right for some other reason. This is just a reasonable covering of all bases. And in comparing opposing briefs on a topic, each side will "conveniently" mention those aspects of a detail which support their side while omitting those aspects which help the other side. This isn't strange or bizarre at all. Evidence that the defendant was in a highly emotional state may suggest that he was sufficiently bothered by the dispute to kill over it, but it also suggests that anything he did wasn't premeditated. But you'll never hear a trial lawyer stating both these implications in the same breath, for the reason that at trial, lawyers are advocates. It's the judge and jury's job to perform the counterbalancing. And once they've struck that balance and reached a decision, however narrowly, well, then they're back in the position of trying to make that contested decision appear inevitable and incontrovertible. Think of Brown vs. Board of Education: that unanimous decision was, in part, a way for the Supreme Court to abruptly reverse course without looking confused and legally shaky.

The recent case in which this whole playing-off-of-opposites has come up time and time again is the Microsoft case. Both sides are "guilty" of this selective amnesia, of this trying to have it both ways at once. High prices for Windows are evidence of monopoly power, but low prices for Internet Explorer are evidence of strongarm tactics to preserve that monopoly. Or is it that the high prices indicate that Microsoft is behaving like a normal business in trying to make money and that the low prices are evidence of intense competition in the browser arena? Windows and IE are too intertwined to break apart, but Microsoft used this tight integration to batter its competitors? Or is it that they're sufficiently separate to split the company up, because Microsoft was a good citizen and maintained a Chinese wall between its divisions? Netscape was a threat, wasn't a threat, and Microsoft killed it off, or didn't kill it off, and this means that IE was an inferior product or a superior one, all of which supports whose case, exactly? The Microsoft case would be a classic example of everyone involved playing off both sides against the middle, if only there were a middle available to point to.

My overall point is that courts of law are actually not a great place to go looking for intellectual consistency, nor should they necessarily be. Dave also pointed out the remarkable extent to which the law changes across the centuries, however much it may seem fixed and stable at any point in time. Law has to be prepared to give and take with the flow of society and custom and to make graceful retreats here and there. The gross inconsistency that the legal process engenders at trial is actually evidence of a well-functioning system, one that has room in it for contradictions, encourages differences of opinion, and is willing to listen to alternative points of view. A courtroom in which the accepted legal value of each datum of evidence is agreed to by all sides is a kangaroo courtroom: it's generally only at show trials that prosecution and defense agree on the relevant legal precedent. If you want to see a legal system functioning with fewer blatant contradictions, in which the participants adhere more carefully to a shared set of assumptions and interpretations, in which the meaning of evidence is crystal-clear -- well, then the burden falls upon you to distinguish your legal system from that of Stalinist Russia, where the courts never had any trouble interpreting evidence or settling questions of precedent. The evidence always indicated guilt of treason, punishable by death or by exile, clearest thing in the world.

A Useful Corrective


Chase has a different opinion about uber.nu than I do. I'm not persuaded, but he makes a pretty good case. In particular, he draws some distinctions that I think I blurred -- between Uber's columnists, and the differences in the Uber attitudes towards various other web sites. I still think the sarcasm just flies off the screen over there, though: far more so than at your average website.

Three Modes


Went to hear Paulina Borsook read from her book Cyberselfish, about which more some other time, quite possibly after I've actually read it. During the question period, though, she got into a minor debate with an audience member on the topic of high-tech charity. Her position was that within the main influential hotbeds of high-tech libertarianism -- basically the Valley -- philanthropic activity is at a minimum and is not culturally an important priority. The interlocutor disagreed, citing a couple of reasonably well-known examples. Borsook's reply was that these were the same exact examples everyone who challenged her on this point trotted out, and that it was actually quite difficult to find examples beyond a particular small finite set. She maintained that these were exceptions, rather than the trend; he disagreed; she said that he was entirely entitled to hold that opinion, but she thought he was wrong. And left it at that.

This left a bit of a sour taste in my mouth, for reasons I couldn't entirely pin down. I realized only later that it wasn't the subject of the debate that bothered me, but its form. Borsook's final position -- which I think the guy in the audience basically agreed with -- was that this was a question that didn't admit of resolution based on the data points available to them, that they were entering the realm of opinion and interpretation where facts could not quite go. And this seems suspect on evidentiary principles. Historians reach this sort of conclusion all the time, that some broader social statement is or is not the case. If they aren't trapped by the example-or-exception epistemological quagmire that Borsook alluded to, why should she be? It took a bit of pacing and some further thought to come up with a better answer. Put succinctly, in the terminology I worked out in the car on the way back, she's a journalist, not a scholar.

I'd like to suggest that a great deal of human participation in intellectual debate can be assigned to one of three categories: art, journalism, or scholarship. I'm going to be Humpty-Dumpty-esque in this discussion; it's best to pretend that these three terms are random collections of syllables for which I'm providing definitions, rather than well-understood designatory nouns about which I'm making some kind of argument. Also, please understand that none of these categories is ever wholly pure, that they always exist in some sort of mixture. They're just three competing and mutually-exclusive goals that any expression must answer to, never wholly satisfying or ignoring any.

Art is that which answers to some set of intangible standards, which follows first and foremost the demands of the aesthetic. It may bear some relationship to reality, may represent or misrepresent actual people and events, but the point is that the primary standards by which it is created and judged do not include accuracy of representation. Fiction is art, but so is synthetic philosophy, so is theology. When push comes to shove, it's adherence to some abstract code that makes for art. To the artist, truth exists apart from reality.

Journalism is that which represents reality faithfully and specifically. The perfect journalist is invisible and voiceless, presenting an absolutely faithful rendition of some aspect of reality for the reader, or observer, or listener. The journalist gathers data but does not interpret it. Photos from the war zone are journalism, but so is socialist realism, so is biography, so is law. The point of journalism is to be completely accurate in showing the details of whatever it takes as its subject. To the journalist, reality is truth.

Scholarship is that which seeks an accurate description of reality in full generality. The scholar wishes to elide details in the interests of capturing the common summary that explains every detail. The ideal scholar reduces an immensely complicated situation to a succinct representation that interprets and makes sense of that situation. Physics is scholarship, but so is history, so is psychology. Scholarship answers to reality, but it demands for itself the right to describe reality in terms other than reality's own. To the scholar, truth lies within reality and beneath it.

Any of these three positions, taken alone, is grotesque. But it is definitely possible to situate methodologies (and the people who employ these methodologies) closer to one or another of these poles. Borsook, it is fairly clear to me, is a journalist -- and so was her interlocutor. She goes to events and reports on them; her quest is for the perfect set of details which make an ironclad case for her conclusions. To a scholar, this method of reasoning is repugnant: it amounts to jumping abductively from one example to another. But I'm not sure that a scholarly approach to this problem -- trying to figure out aggregate measures for characterizing the philanthropy of a social group and to develop instruments for taking those measures -- would satisfy the Borsooks of this world. They could point out, fairly legitimately, that computing the total amount of money given to charity per capita per year, say, is a perfectly useless way of trying to speak about the attitudes of people, of whether they are selfish or not. Go to San Francisco, knock on the door of a VC firm, write down the words that spill from their mouths, the buzzwords and the jargon and the anti-government rants of the humvee drivers, and you will see in a way that no summary could possibly equal.

I think this strong journalistic slant of Borsook's also explains a bit of her hostility towards the bionomic movement, whose drive towards transcending the physical she calls "self-loathing." I quizzed her a bit about people who treat cyberspace as ontologically real, a la John Perry Barlow's Declaration of the Independence of Cyberspace, and she took much the same stance, calling them nuts. One way of looking at the matter is that, according to this tripartite logic, these people are artists, artists whose artistic ideals elevate particular ideas that have a strong connection to the mental aspects of reality, but only a very weak connection to its physical aspects. Mutual suspicion is the order of the day. Eric Raymond's recent attack on Borsook (and Michiko Kakutani, of the New York Times), with its references to the high-minded ideals of Open Source, to "freedom itself," and to the principles underlying the "gift economy," is a prime example. Raymond's argument, like much of his writings on Open Source in general, is a familiar form of utopianism: start with some abstractly justified moral principles and work from there towards the wonderful real-world consequences that just happen to fall out of doing things in accordance with this idealized code of conduct. If one keeps in mind that art, most often, has some real-world embodiment, then Raymond is preaching the gospel of art, and reiterating the artist's "you just don't get it" complaint towards the journalist who notes the inedibility of marble sculpture.

It's a bit of a stretch, I admit. Even after typing up just this much, I can see daylight through some of the holes in the model. But I still like it as a rough characterization, and it ties together some other thoughts I've had (see, for example, "Cynicism and Journalism," "The Critic and the Journalist," and "Copenhagen Controversies," from my Slate page) about the varying natures of various intellectual endeavours. I'm interested to see what will come of shaking this particular tree some more.

Ideate: A Retraction


Nina, more literate than I, has confirmed the legitimacy of "ideate." The term has a respectable history in psychology, where it's a perfectly useful technical term with, it would appear, roughly the force of its dictionary meaning, "to form an idea," or, transitively, "to form an idea or conception of." In fact, the same dictionary check which supplied me with these definitions dates the word to 1610. This doesn't rule out the possibility that the term was first coined by shady jargonistic businessmen, but if that happens to be the case, it does put a bit of a different spin on matters. I don't think the word's long provenance necessarily excuses its use as a dot-com business buzzword, but "ideate" itself is innocent of the charges I levelled at it yesterday.

From the We are Not Making This Up Department


As relayed by Kyle Niedzwicki, it appears that the Chinese are deploying an army of ducks in order to deal with locust infestations. Xinjiang province, China's Wild West, is suffering from a severe locust infestation, the combination of a similar infestation last year in neighboring Kazakhstan with severe man-made ecological problems (see The Flying Desert and To Kill a Sparrow for more details). Having discovered that an individual duck will consume up to 400 locusts a day, the Chinese have trained large hordes -- 700,000 large, in fact -- of ducks to associate the sound of a whistle with feeding time, where locusts are on the menu. Like everything ever done by China since the Revolution -- posters, dams, armies, social upheavals, and national causes -- the story is amazing for its sheer scale. Click on the link to see the picture, if nothing else. Readers of The House that Jack Built or attendees of any Passover seder will be on the lookout for the inevitable next phase, in which it becomes necessary to import large packs of wild dingoes to deal with the duck infestation. After the dingoes, mountain yaks, to be followed by ferrets, rhinoceri, the People's Liberation Army, and the Angel of Death.

Look on My CSS, Ye Mighty, and Despair


Cleaned up the CSS a bunch, to the point where I almost understand the spacing algorithm I'm using. Not that I actually felt like taking the risk of tampering with it to try and regularize some of the inconsistencies, but I was able to sweep away a lot of the unreferenced detritus. In the process, I caught a couple of bugs, including one in the display of BLOCKQUOTEs that I suspect may have had a hand in various Netscape crashes recently reported to me. I also pulled all of the color-control properties into one section of the stylesheet, to make it easier to manipulate that aspect of the page. Didn't abuse this power, though, other than to nudge the date color into something a bit more obtrusive and to add a bit of a greenish kick to links.

Hmmm. Abstruse, obtrusive. So, by analogy: abstract, obtractive. I like this one. I think I'm going to start using it in everyday conversation. Once, that is, I figure out what it sounds like it means. Came across a truly horrific coinage today: "ideate." I'm not sure precisely what the intended usage is, although I suspect that it's the mutant offspring of "idea" and "iterate." If it is, then I'd claim it's linguistically repulsive for at least two distinct reasons, one etymological and one phonetic. Dave and I were having a conversation about the language of Internet startups and industry players. After talking about how ridiculously easy it would be to create venture capital mad libs, Dave started rattling off a list of words companies use to describe their working environments, concluding with "oxygenated," a word that's not really in heavy rotation in the Internet corporate world, but really, by all rights, could be. That's what the tech industry is about, fundamentally: the conversion of meaningful technical terms into meaningless buzzwords. The process is exothermic, and by trapping the released heat, San Francisco plans to solve its power-generation problems well into the middle of the next century.
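
Dave was not exaggerating about the mad libs, either. A proof-of-concept in Python -- the vocabulary is a small sample of the genre, and any resemblance to an actual business plan is the whole point:

    import random

    VERBS = ["leverage", "ideate", "incubate", "monetize", "synergize"]
    ADJECTIVES = ["scalable", "end-to-end", "mission-critical",
                  "oxygenated", "next-generation"]
    NOUNS = ["solutions", "paradigms", "mindshare", "infrastructure",
             "communications strategies"]

    def pitch():
        # One template is plenty; the genre does the rest.
        return "We %s %s %s to deliver %s %s." % (
            random.choice(VERBS), random.choice(ADJECTIVES), random.choice(NOUNS),
            random.choice(ADJECTIVES), random.choice(NOUNS))

    print(pitch())  # e.g. "We ideate oxygenated mindshare to deliver scalable paradigms."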

Of course, both ideate.com and oxygenate.com are taken, both by firms specializing in "communications strategies." Shrug shrug question mark.

My other thought for the day was the slogan for Mad Max: Beyond Wireless: "One hundred companies enter! One company leaves!"

A Bit More Context


I threw out a disparaging reference to Media Virus yesterday without proper grounding. While the following explanation is based on my unreliable memories of a pretty hasty reading from three years ago, it should make the point of my reference a little clearer. In that volume, Douglas Rushkoff uses an extended metaphor from immunology to talk about the spread of ideas through various media and the way in which "stories" acquire lives of their own, often against the desires of those who originally released those stories. The discussions about the ebb and flow of ideas are similar to memetic arguments, but with a very different slant, one that plays up the chaos and unpredictability in the spread of real-life virii. Rushkoff focuses on rumors, on fads, on subversive messages that stick in the brain and ride piggy-back on other, more officially-sanctioned messages. Rushkoff's focus is on "infection" as a methodology for social change.

What's to like? He has some very good examples of the spread of ideas, of the ways that they jump from one medium of transmission to another (although I forget the specific examples, the first half of the book is all examples, and I remember it as being pretty good). I liked also his willingness to discuss the interrelationship between different media and how these relationships affect the diffusion of thoughts along and between channels. And on a very high level, I agree very strongly with the basic methodology: look, analyze, understand, use. If you want to influence a major corporation, you need to be able to exploit various media, to take advantage of large and strange cultural forces to do your heavy lifting for you. Social change as jujitsu is a neat idea.

And what's not to like? Basically, that having said this much, Rushkoff then says nothing more of interest, but takes a long time not to say it. The second half of the book purports to be something of a primer on creating media virii for fun and non-profit (if my memory serves correctly), but is really quite short on the specifics. All of his suggestions are along the lines of "say something really clever in an untraceable medium in such a way that your clever saying will be propagated along with your adversary's message, wherever it goes." Which is perhaps a great goal, but perhaps not also something you can just go out and do. And, more importantly for my current thoughts, Rushkoff never really goes up against the self-reflective aspects of contemporary media, never properly discusses the perpetual-motion publicity machines that attention loops create. It's one thing to send out humorous fake press releases; it's something else to create a story which will continue to work to your benefit once it gets caught up in a vortex of attention, once the story itself becomes the story. These vortices are like IMP-loops in Corewars -- the IMP being the one-instruction program that does nothing but copy itself forward through memory, converting whatever it runs over into more of itself: they subsume the identity of anything which crosses their path, and the real challenge is to make that endless instruction loop branch off to an address of your choosing.

A Brief Descent into Meanspiritedness, but Not Without Purpose


Those folks over at uber.nu really bug me. The bitterness that fuels that site is almost beyond belief, the deep contempt they hold for everyone else on the web, especially everyone better-known than them. If the site has a message, this is it: we are just as funny and smart as various other people you probably know more about, but not only that, we're more cynical also! And since those other people are famous for their cynicism, let us be famous, too! Shower that fame, that money, that gratuitous sex upon us! And yet we are not famous, oh woe. Perhaps we are deep losers, as loserly as those we mock. Yes, we are stupid and unimaginative, but no more so than Suck and Feed and McSweeney's, see how brilliantly we lay bare the shabbiness of everyone, sparing not even ourselves. That part about the fame and the money and sex is just a joke, just a way of imitating their irritating style, we're the only ones beyond that foolishness. So why are those jerks getting all the attention?

Round and round it goes, the cycle of bitterness and loathing. Very Dostoyevskian, actually. In a simpler day, there would have been an easy dismissal for these folks: that they were self-obsessed parasites barnaclinging to the hulls of genuine cultural ships. That case can't be made, though, for the reason that the sites and personalities the Uberites are going after are themselves cultural lampreys, after a fashion, sites that specialize in a self-conscious tearing down of anything more well-established than themselves. To be precise, the very reason that the Uberites are exempt from this criticism -- that their targets are also self-conscious media snipers, and therefore not on any higher plane of existence requiring ethical defense -- makes clear why this argument has always been a bad one: McSweeney's and Suck are, by and large, worthwhile endeavours. There is room in the media landscape for these strange predators, picking off the weak and the elderly from the mass media herds, and the argument that Uber's very form renders it suspect falls apart when seen in this light.

Rather, I think, the problem is that McSweeney's and its ilk are doing something at least marginally interesting and useful and that Uber is just so much cultural navel lint, because of this fact: the content of its criticism is identical to that which it criticizes, which makes the criticism itself pointless. Uber is not a value-add; there's nothing new to see here, just some slacker-ass whining. The self-deprecating ironic detachment-cum-ironic-mimesis that motivates the first-generation sites is something new; there is a break between imitator and imitated, something qualitatively different. This is Dave Eggers' contribution to modern arts and letters, I think: he brings an unsparing introspection to his own self-obsessed relationship to modern existence. There's a hall-of-mirrors empty quality to A Heartbreaking Work of Staggering Genius, yes, but this is also part of the point: he also advances the interesting -- and I think genuinely novel -- thesis that this self-obsessed desire for self-fulfilment through the reflected introspection that fame offers is the natural limiting tendency of life these days. The entire population of the world and their collective attention and overlapping connections ("the lattice" is the term Eggers uses) is to become the required mediating term in his self's knowledge of itself.

It's damn hard to add anything to this line of analysis, to say something more in the same vein. The mcsweeneys.net site adds something new: it drives towards this synthesis, holding the ironic Self and the serious Other together by engaging in straight-faced parody, perfect imitation, reaching for every joke it can while refusing to admit that there is a joke. You learn something about humor, about the world, about style, about self-consciousness, by reading mcsweeneys.net. And among parodies of McSweeney's, the mcsweeneys.org site adds something new: it picks up on the self-annihilatory tendencies latent in the mcsweeneys.net site, the rhetorical gap between McSweeney's and the Eggers self-analysis, and it completes McSweeney's statement, pushes things forward to the point where authorship and identity come unbound, where the perfect sincerity that McSweeney's professes destroys itself utterly. By reading mcsweeneys.org, you realize something about mcsweeneys.net, about everything you learned or thought you learned from reading it.

So while it is possible to add something to this whole "discussion," as it were, Uber.nu adds nothing, other than, perhaps, a dash of the color red. If you rip into a regular celebrity in this way, you perhaps are saying something interesting, you are bringing outside perspective and self-awareness to a situation perhaps devoid of useful introspection. If you rip into Dave Eggers in this way, you are saying nothing interesting, because Eggers himself has already said it about himself. And this is ultimately the problem: it's more fun to read Eggers than it is to read the Uberites, whether the subject is the world at large or Eggers himself. He's a better writer, more perceptive, more articulate, and fundamentally, far far more original. Say whatever else about Eggers you will, it's more than just an issue of relative fame that Eggers doesn't need to write about the Uber.nu people in order to find something interesting to say -- whereas they can't even find something to say when they write about Eggers.

More generally, this is an important issue, and one I'm trying to put together a new set of thoughts on: what is the right way forward? There is no way to make sense of modern media, of modern culture, of modern politics or economics or society or anything, which does not acknowledge and deal with issues of irony and sincerity, of self-consciousness and parody, of the strange dynamics of self-affecting systems and the uneasy interplay between positive and negative feedback. My old thoughts on the matter, while still close to my core of belief, aren't so useful in the new context of the world. People like Eggers and House of Leaves author Mark Z. Danielewski and Weird Like Us author Ann Powers are so redefining the landscape that a lot of my old thoughts about transcendence in the small through an embrace of irony seem, well, sort of outdated.

My latest thinking -- and this is all extremely tentative, let me say up front -- is that now is the time, more than ever, for extremely clear-headed thinking. Not because we need to embrace simplicity and repudiate irony (sorry, David Foster Wallace, I still don't buy your claims); there is no way back, never has been. But rather, because we've been brushing up against the perfect vanishing-points of Eggers-level ironic introspection, I think we're as close to the black hole as we need to get in order to pull off this gravitational slingshot. It's time to start paying very close attention to the exact workings of this stuff, of the dynamics involved, of the levels of representation and self-representation. Because if we can sort out enough of the details -- and these things are, I'm discovering, almost painfully difficult to think through, due to all the loops and false loops and self-reference -- we can start coming up with rutters through these tricky shoals, start codifying and understanding the tactical use of irony, start trying to use these amazingly complicated but also amazingly powerful forces to accomplish that which needs to be accomplished. Douglas Rushkoff's Media Virus is the book I've read that comes closest to expressing what I'm thinking of, but his actual examples and recipes for action were orders of magnitude too simple. The point is that working to cause positive change in a system which is profoundly self-affecting in a thousand different ways will require extraordinarily subtle analysis, and I think we're reaching the point where that analysis is both profoundly necessary and perhaps almost possible.

Further Potter Musings


Out of deference to readers who don't want plot spoilers, I'm sticking this off on a separate page. Also, in one of those great bump-set-spike deals the universe occasionally pulls, over at Slate, this week's Book Club is discussing, yes, surprise of surprises, Harry Potter and the Goblet of Fire. Jodi Kantor's opening entry, although by and large perceptive (and quite honest about the Potter reading experience), made a startling misstep near the end, in writing off the "moral lessons" of the series. This was actually an issue I'd been thinking about for a while (well, okay, since I read the book). I've added my thoughts to my mini-archive.

Chocolate-covered Puzzle Clusters


For those out there who like this sort of stuff, I'm putting up a link to some puzzles I recently wrote. Now that the competition they were for has been held, whatever trade secret intellectual property they were vested with has been nullified, and it's safe for me to open them up to public consumption. As puzzles go, these are sort of a light snack, pleasant but not especially filling. It's quite hard to make up really good puzzles, I'm discovering. On the one hand, it takes some kind of God-given creativity (a creativity I quite decidedly lack), just like any other creative endeavour. On the other hand, putting together a good crossword or a pictorial word search, say, takes a hell of a lot of work. The Set puzzle in this batch was a real bitch to put together -- two full evenings of staring at cards and methodically looking for sets -- and that was a walk in the park compared with what I know some other people have gone through to put together puzzles. And on yet a third hand, getting puzzle difficulty right is a real challenge, much larger than I realized. My admiration for the Kenny Youngs of this world, who can put together brutal combinatorial puzzles that have exactly 3 or 4 deep false trails (no deep false trails and the solution is entirely straightforward; more than a dozen and you're better off using a computer), has grown a lot from this, my first real run at designing puzzles.
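
For the curious, here's the property I was grinding through by hand: three cards form a Set exactly when each of the four attributes is all-same or all-different across the three cards -- or, if you code each attribute as 0, 1, or 2, when every attribute sums to a multiple of three. A quick Python sketch of the check (the encoding is mine) that would have saved me those two evenings:

    from itertools import combinations

    # A card is a 4-tuple (number, shading, color, shape), each coded 0, 1, or 2.
    def is_set(a, b, c):
        # All-same sums to 0, 3, or 6; all-different sums to 0 + 1 + 2 = 3.
        # A two-and-one split never hits a multiple of 3, so one test covers both rules.
        return all((x + y + z) % 3 == 0 for x, y, z in zip(a, b, c))

    def find_sets(cards):
        # Brute force over all trios -- fine for a 12- or 15-card layout.
        return [trio for trio in combinations(cards, 3) if is_set(*trio)]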

Update, As of 7 AM


Holy fucking bejeezus. It's astounding. Harry Potter goes epic.

The Wizard has Landed


It's 12:45, and I've just returned from the bookstore, where I obtained my copy of the fourth Harry Potter book, Harry Potter and the Goblet of Fire. It wasn't quite a madhouse, but it was quite something. The line started at the cash registers, snaked around a bunch of shelves, cut a meandering way through most of the store, did some switchbacks in the cafe, and finally, in one final heroic spurt, made its way out through the other entrance to the store. There were kids of all ages (except the pre-literate ages) and a fair population of teens, both Harry Potter look-alikes with glasses and beefier types asking the employees whether the bookstore had copies of various tracts of WWF hero-worship. Mike and Louis and I got there around 11:25, in plenty of time to guarantee us copies, and spent the time until midnight hanging out and cracking various jokes about the situation with a fifteen-or-so-year-old fellow in front of us in line. Mike suggested that we grab extra hardcover copies of Angela's Ashes, slap on hastily-made color Xeroxes of the Harry Potter cover, and sell bootlegs outside, extracting a premium from those too harried to stand in line (or to check the insides of the books they buy). In keeping with the true spirit of cheap knockoffs the world over, the Fepsi sodas and the Lorex watches, I thought we should call ours "Harry Porter." Louis thought that it would be a pretty good joke if the book consisted of several hundred blank pages, followed by a mocking note from J.K. Rowling. We also wondered whether we should be looking out for mall pirates -- "Your Potter or your life!" -- looking to make a quick turnaround profit in the Harry Potter shortage we're anticipating over the next couple days. It was a zoo there, but it was a fun zoo -- the bookstore staff were going around with candies and offering juice and coffee, and the promise of imminent Potter meant, I guess, that all the kids were on amazingly good behavior. I wouldn't have thought you could put that many elementary schoolers in a room together and not have any of them making a deafening racket for some reason or other.

The book itself is huge. It's as thick as my copies of 2 and 3 put together, and that's honest size -- same dimensions and font as the earlier entries in the series. It's a veritable tome. It's also green, which I'm having a little trouble adjusting to, but the thing's sheer weightiness is its most impressive feature. This is pretty damn neat, when you think about it: not only have America's kids gone wild over a book, they've gone wild over a 700-page book. Did you ever think to see this day? The scene at the bookstore was the sort of thing usually associated with World Series tickets and hotly-anticipated action movies, and it was heart-stirring to see the same frenzy -- albeit with a calmer, more family-oriented, friendlier vibe -- coming up in the book world. And how were America's kids dealing with that final half-hour wait? The teens behind us were reading a magazine -- and not in the "read a line, giggle about it for a minute, read another sentence" way either, they were actually reading it -- and all around kids were looking at the shelves carefully stocked with children's books in front of them, picking up books and looking at the back cover text, perusing the first chapter. About half the families I noticed left with other books besides the Potter. It was such a heartening moment. And if the fourth book continues the geometrical increase in quality and maturity that the first three have demonstrated, we'll be witness to something even more astonishing: a literary event affecting millions of youths that lives up to the hype surrounding it. I really hope that the book justifies the wait, for the sake of the future readers of America and the world.

I knew, once I realized what would happen at midnight this past midnight, that I would have to be there, that I would need to be standing in line at the exact instant, that I'd need to be part of the mob mentality, to partake of the actual physical event. And I know now what my larger mission for the night is. I don't need to be anywhere until nine in the morning. It's time to start reading.

Good Things, Small Packages


Muriel Spark's The Abbess of Crewe, roughly the size and weight of a CD jewel box, which I found for $3 in a used book store (the Fremont branch of the very estimable Twice Sold Tales), is a wonderful little gem of a book. Written in 1974, it's a Watergate novel, and somehow Spark came up with the inspired idea of making her Nixon figure an abbess. The book spirals around Watergate and its themes in a graceful manner, eventually converging back on history with a brilliant couple of connections in the last few pages. Spark's language is also wonderfully arch.

Winifrede, land of the midnight sun, looks at the Abbess, and presently that little sun, the disc of light and its aurora, appears in her brain like a miracle.

. . . nor was she present in the refectory at eleven for lunch, which comprised barley broth and a perfectly nourishing and tasty, although uncommon, dish of something unnamed on toast, that something being in fact a cat-food by the name of Mew, bought cheaply and in bulk.

"What are scenarios?" says Winifrede.

"They are an art form," says the Abbess of Crewe, "based on facts. A good scenario is a garble. A bad one is a bungle. They need not be plausible, only hypnotic, like all good art."

Also in the extremely cool column, Dylan and Allie presented me with a private-printing copy of the complete collected Stick, a comic strip drawn by the one and only Jeremy Smith. Before this wonderful gift, I thought of Smith's work with some fondness -- it was usually one of the better inducements to picking up the paper in the morning -- although my memories of it were that it was somewhat uneven. The collected volume belies those memories, though. Dylan pointed out that the under-one-roof aspect of the compendium brings out one of Stick's salient features: the way that the plot lines would build in a lackadaisical way for weeks. At the time, a day or two of missed delivery would be enough to derail the plot, and the dribbling out really cut down on the sense of connection from one strip to the next. Reading them three-to-a-page, I'm cracking up now, because the strips work really well in teams. The running jokes -- jokes that took years to develop in the original run -- hang together now, and his gentle goofy alternate universe of stick figures and superintelligent chickens gets the extended treatment required for one's sense of humor to click into line with Smith's silly sensibilities. In Stick's words, "A man's gotta moonwalk when a man's gotta moonwalk."

Attention Must Be Paid


Sorry about the lengthy off-the-air session back there. One might think that with a four-day weekend, my attention to the Laboratorium would skyrocket, but apparently that's not the mechanism at work here. If pushed, I'd guess that what's actually taking place is that the Laboratorium is the means I use to deal with mental churn, and that when the external stimuli are cut back, the safety valve doesn't need to be opened as much. Part of that unused Laboratorium energy has gone into postings to the Fray over at Slate. I've collected those musings into a mini-archive. I've also been working on a bunch of puzzles, but since the competition they're to be used for hasn't happened yet, I can't actually reveal them.