It's the beginning of a series about reading legal writing, targeted at your typical coder. My big thesis is that legal writing is in some ways a programming language and that the skills involved in learning legal writing are similar to those involved in learning to code. Check it out, yeah?
One interesting feature of the protest is that the rally beforehand was held on the town green, which is privately owned. Usually, private ownership of "public" space means trouble, because the rent-a-cops who lock down these spaces don't answer to the same Bill of Rights as their government-issue counterparts. But in this case, the green is overseen by a trust whose members are more in sympathy with the unions than with management. As a result, they sometimes approve events the town wishes they'd turn down and they provide a limited degree of sanctuary for people who want to express their grievances with the university. You know, it's a union town.
Anyway, at this rally, I was doing my absolute best not to get arrested, since being arrested cuts down your credibility as a legal observer. Of course, I wasn't really needed as an observer, first because the police were so down with the whole union solidarity thing, and second because the streets were lined with other folks there to show additional solidarity of the not-being-arrested kind. So my role was really mostly limited to wearing my "Legal Observer" T-shirt and giving people who asked the phone number of an organizer.
So there I am, participating in the lining and the solidarity and the not-getting-arrested and the hey hey, when the Lubavitchers roll up.
Usual note up front: if you don't recognize a noun, Google it.
As I was saying, the Lubavitchers roll up, in their suits and other trappings of ultra-orthodoxy, and start going up and down the lines of people on each side of the street. It's Sukkos, so they're carrying palm trees and esrogs, which make for decent conversation starters. If the exotic foliage and obscure fruit don't work, they drop back to the more prosaic but direct technique of staring you full in the face and asking "are you Jewish?" I notice that they don't ask this question of the schvartzes.
One of them, a bit shorter than me and disturbingly serious of mien, pops me the question. I don't hesitate at all in saying "no." He walks on, and only later do I realize what I've said.
I didn't fast on Yom Kippur this year, for the first time since I achieved my majority. I didn't have a big meal with apples and honey for the new year; I didn't even think about going to services. This time last year, I was more actively Jewish than ever before in my life; this year nothing.
It's always been cultural. I'm not religious in any meaningful sense; I'm probably your classic unbeliever. My mother has told me that the only kind of book that I showed utterly no interest in as a child was Bible stories. I get uncomfortable around displays of religious enthusiasm; I feel distinctly out of place in houses of worship.
Still, of all the faiths of which I am not a member, Judaism has always been the faith of which I would be a member, were it not for the whole atheism thing. I have enough Jews in my family that the Nazis would have come for me; the Jewish calendar has always been a steady background hum in my life. As I've said, call it cultural. Being a self-identified urban New York intellectual has a certain resonance with modern American Judaism; I have a sense of solidarity with other Jews, a sense that when pressed, I'd line up on their side for the game of kickball. Besides, some of my best friends are Jewish.
You turn to such things in times of trouble, perhaps. After last September, I didn't find faith -- if anything, I recoiled from what faith led people to do -- but I did grab tighter hold of that sense of community. My housemates at the time helped, Sam Sr. -- that remarkable, imperious, generous, manipulative, brilliant contradiction of a man and source of endless surprises of Jewish lore -- in particular. And the symbolism of the holidays spoke to me, too: I wasn't about to pray as part of my atonement, but I could see that atonement was something I and the world desperately needed. I didn't carry around an esrog, but I wouldn't have lied to someone who did.
No more. If, as I've thought, the only true basis of my Jewish self-identification was cultural solidarity, the events of the last year have utterly wrecked my sense of solidarity. Blame Sharon, blame the people who keep him in power, blame all the Israeli Jews who've closed ranks to protect a specifically Jewish national identity, blame the American Jews who'd rather shut down civilized discourse than countenance criticism of the barbarity their cultural solidarity would excuse. This was never what I meant by thinking of myself as a Jew, but the concept is polluted now, and it's not one I'm in any position to reclaim.
Don't get me wrong. I hate what the Palestinians have been doing, too. The whole of the Holy Land is filled with people acting in bad faith and with murderous motives. Suicide bombing is as wholly unacceptable as the murderous reprisals it incites. But I am not and have never been a Palestinian; there is nothing there for me to renounce or walk away from. Somewhere in the last year, I left my Jewish identity on the floor of a large and empty room, a room I am not certain I could ever find again.
It's kind of funny; I went through a phase of mild guilt over being blond and blue-eyed. It resonated with me that people who looked like me had done horrible things in the name of looking like that. But my body's been fixing that one; my hair is darkening with age. Cultural identity works the same way, perhaps: at some point when I wasn't paying attention, I gave up on Judaism and started seeking my community elsewhere. It wasn't something I decided about; it just happened. It's in the nature of such things that this fact makes the decision truer, I think.
This month, when people wished me a happy new year, I didn't admit to them that I'd forgotten it was that time of year again. Something died and disappeared, and it wasn't until I said that it didn't live here any more that I even thought about its absence. I know that the loss of a vague and ill-defined sense of cultural adherence to a religious tradition is hardly to be counted among the tragedies of this last year, but I feel the loss nonetheless.
Next year nowhere.
No amount of human experience -- not even the experience of having been repeatedly and utterly convinced that a mood would last forever -- seems to suffice to overturn this belief.
I was browsing through a book I had no intention of buying when one of the little paper/metallic security inserts dropped out. Rather than let it lie where it fell (this phrase is a legal allusion, so subtle and yet also so lame that I feel compelled to point it out as going far beyond my usual standards for lame allusions), I pocketed it. As expected, I set off the alarm bleeper on my way out, although empty-handed. I looked around in feigned confusion, waited for the alarm to stop, and then walked back into the store with a puzzled look on my face, thereby setting off the alarm a second time.
Now, if one person pulls this stunt, he's an annoying butthead with nothing better to do than annoy everyone else in the bookstore. But I don't want to be an annoying butthead, so I was thinking that if two people pull this stunt, well, okay, they're both annoying buttheads. But, you know, if a million people pull this stunt, then we're talking about a whole movement, and maybe they'll stop dropping insert tags into books. I figure that setting off the alarm while empty-handed is the exact inverse of shoplifting, and a much better way of poking back at the surveillance system the tags embody.
Anyway, it's still in my pocket, and at this point, I'm curious to see what other stores' security systems I can set off.
So I decided to do something about this problem. I went to the Kinko's downtown, slapped a casebook down on the counter, and told the guy it was too heavy and I'd like to do something about it.
He was skeptical. Copyright, he said; I can't just copy it for you. No, no, no, I replied, I want you to leave the pages alone. I'd just like you to rip off the cover and rebind it in smaller units. It took a few hours, but for $25 I converted my casebooks into volumes that could be mistaken for good old reliable paperbacks.
I'm expecting Peter down at Kinko's to have maybe a hundred people coming in over the next few weeks asking for something similar. This is going to be huge.
- J.S. and the Special Demurrer
- The Hairy Hand
- In Contempt
- Warrant (whoops, that one's taken already)
- Black Acre
- Heirs of the Body
- Supremacy Clause
- In Terrorem
- Habeas Corpse
Because they had prece-dental value!
So far as I know, homo sapiens is unique in having a choice of positions; dogs, for example, have only one, eponymous, option. Is this flexibility an evolutionary feature that plays a role in motivating us towards reproduction? Or is it a spandrel, an accident of anatomy evolved for other purposes? If so, is there a "natural" coupling configuration to which only our unruly brains have devised alternatives? Or is my premise simply incorrect, in which case, which other species go at it in all sorts of ways?
No, I don't mean that he'll be to popular music in this decade what Dylan was to music in the 60s. No one could; the cultural economy of "popular music" has changed in ways that fairly effectively bar a repeat performance.
Nor do I think that Eminem is Dylan's musical heir. Trying to trace a line of influences from the one to the other is an exercise in radical attenuation.
Instead, I'd like to suggest that Eminem has latched on to Dylan's personality: he may well be the best modern exemplar of the Trickster figure that Dylan embodied back when, well, back when he was still Dylan.
Eminem is -- as was Dylan in his day -- a lyrical prodigy who is nonetheless jaw-droppingly inarticulate when not actually performing. He has Dylan's dismissive sneer, Dylan's corrosive cynicism, Dylan's casual, pervasive misogyny. Dylan has always had a talent for mockery, for turning the character flaws of others into art ("Like a Rolling Stone," anyone?); this talent has sometimes seemed inseparable from his ability to befoul the lives of those who make the mistake of loving him.
Would the original Slim Shady please stand up?
Both articles, after all, are about the art of media manipulation. Bumble Ward spends her day on the phone lying to reporters, parcelling out interviews, and generally trying to impose her clients' very specific agendas on the press, and by extension, the celebrity-obsessed public. Likewise, Pee Dee Dubya's over-the-top persona is not so much a creation of the media as a creation devised for the media: he's been lying in wait for years and has found precisely the right moment to strike to get the pandering coverage of his antics that he wants.
Friend and McGrath's articles might charitably be described as "deadpan," but the pan in question gives every indication of never having been alive to begin with. There's no sense of shock, no sense of awed horror, no sense of amused titillation, nothing at all. Reporters writing about the systematic hoodwinking of their colleagues might be expected to show a little more, well, professional interest, but questions of honesty, of responsibility, even of curiosity about the gravity-defying illogic of content-free sports and entertainment "journalism" are wholly absent from these articles. This is the most boring behind-the-scenes tour ever.
I also meant for it to be about three paragraphs long and take about twenty minutes. I was behind on my schedule of assigned reading for the night when I started; I have now updated my status indicator from "hosed" to "soooo hosed."
Hey! We'll have none of that in here! I am talking to you. Yes, you, in the peanut gallery. I'm not surprised to hear that you think any meta-theory about constitutional interpretation at all is too much meta-theory about constitutional interpretation. And after that little outburst, I think we all know what you think on the matter. Thank you ever so much for sharing.
As I was saying, something about the constitutional theory in the tradition exemplified by the assignments to which I've been exposed bothers me. It passes too quickly over the question of the binding value of a document two hundred years old. The appeal to democratic values is a good one, but that appeal is an appeal to our present respect for democracy and the values it protects, not to the historical respect for democracy embodied in the American Constitution.
Put another way, the argument from democratic principles can't handle the critique that if we can find a better way of being more democratic we ought to follow it, no matter what the Constitution itself says on the matter. Why should a genuinely democratic consensus be stifled merely because it can't muster the heightened level of support required to pass and ratify a constitutional amendment?
This isn't a real question. I'm passing over some good pragmatic arguments for respecting prior consensus; I want merely to insist that pure democratic theory has a serious problem with the continued validity of democratic consensus across time.
There's a similar problem in the philosophy of personal identity: denying yourself pleasure right now in the interest of greater pleasure later only makes sense if "you" will be somehow the "same" person in the future. The identity seems obvious at first, but the gap can be surprisingly hard to close. What if you knew that you would suffer total amnesia between now and then, so that you'd have no recollection of your earlier sacrifice? What if you could impose intense pain on yourself ten years ago in exchange for a cinnamon jellybean right now?
Douglas Adams got here first, but the philosophic problem is real. Derek Parfit put it particularly well: pretty much any argument that you can make about your obligations (or lack thereof) to other people can be turned into an argument about your obligations (or lack thereof) to your "future self" or your "past self." It would be great to be able to make promises binding on your future self (economists can explain why that is, precisely), but, gosh golly gee, wasn't your past self a real bastard, making all those binding promises way back when? Why couldn't he have gone to the gym a bit more often instead of playing so much Nintendo? Selfish bastard.
Try another variation: sure, you thought it would be a great idea to get a tattoo when you were eighteen, but now that you're forty-three, that smiley face in bell bottoms sure looks dumb. And you know that your 1977 self was utterly convinced that smiley faces in bell bottoms were the ultimate in fashion and would never go out of style, but now that you know better, why shouldn't you go see Dr. Zizmor for that laser tattoo removal?
This same temporal argument cuts against the Constitution. If we're honestly convinced that we understand an issue better than the Framers, why on earth should we be bound by their imperfect understandings? (Remember: this question is meant for rhetorical purposes only; use for any other purposes will void the warranty.)
The professor today said something very sharp about the decision in Marbury v. Madison.
We interrupt these babblings for a brief message from our historical sponsors. The issue at stake in Marbury was a commission (as a D.C. justice of the peace) made out to William Marbury by the lame-duck Adams administration in 1801. Marbury's commission got lost in the last-minute scramble (the Federalist White House staffers were too busy removing the quill pens from the desks, one supposes), and the incoming Jefferson administration refused to deliver it or recognize Marbury's office. Marbury sued directly in the Supreme Court to make them fork it over, although the case didn't come up until 1803 (some things never change). The opinion, a political masterpiece by the newly-appointed Federalist Chief Justice John Marshall, refused -- on a convoluted technicality -- to order (to "issue a writ of mandamus" in fancy lawyer-speak) the Jefferson administration to fork over the commission. The decision is famous because Marshall, in order to come up with that technicality, more or less gave us our modern doctrine of judicial review, in which the Supreme Court can just chuck acts of Congress on the waste heap if it feels like it (and can wrangle the Constitution into saying that the waste heap might be appropriate for this particular bill).
Right. The professor pointed out that the opinion in Marbury v. Madison basically said that Marbury's commission was just a piece of paper. The metaphysical reality of Marbury's right to the office, such as it might have been, had nothing to do with the piece of paper and which rug it was or was not swept under. Similarly, in saying that Marbury had a right to it, but not a right the Supreme Court would make anyone do anything about, the opinion itself openly admitted that it was just a piece of paper, too. Maybe some lower courts would treat its holdings with respect, but the opinion itself had no mystical power to make anyone do anything.
This was as far as he went. But the light in my head went on when he said it, and I'm willing to go further. Even if Marbury v. Madison had ordered Madison to engage in the requisite forking over, it would have remained a piece of paper. Madison's blatant refusal to fork anything over, under, through, or around would have made the opinion's paper-thin authority quite literally evident.
Even in the doubly counterfactual scenario in which Marshall ordered Madison to fork and Madison complied in said forking, the judicial opinion would have remained a piece of paper. Its persuasive weight -- its ability to make someone do something -- would have derived from the power struggles and ideologies swirling around it, not from anything intrinsic to the paper itself, or even the words imprinted on it. Power struggles and ideologies are related to the papers around which they swirl -- and the nature of that relationship is both an interesting and a critically important question -- but in no truly meaningful sense can the papers be said to be the cause of the struggles around them, any more than the eye can be said to be the cause of the storm.
All opinions are paper, all commissions too. Money is paper with numbers and pictures of dead guys; the Constitution is an old piece of paper with a bunch of signatures at the bottom.
So what does this have to do with my original point, the one about identity across time and why we should care what a bunch of guys in funny wigs and short pants thought? Well, I feel, in a way that I can't yet perfectly articulate, that, from the perspective of theories of constitutional interpretation, the identity across time problem and the piece of paper problem are the same problem.
Put another way, a theory that runs into trouble on one of them will run into similar trouble on the other. They both take aim at the connection between "here" and "there." "There" may be two hundred years ago, or it may be the abstract Platonic realm in which words and documents live, but in each case, you'll have to do some heavy theoretical lifting if you want to connect "there" to the here-and-now in which people live, think, talk with each other, do complicated and messy and not very well thought-through things with their lives, and generally muck about being, well, people.
Remember those rhetorical questions? You can unleash your answers now. I don't really know what my answers are, but I think they have something to do with those complicated social interactions. Legitimate democracy flows from these interactions. More than that, even: wherever human society is present in sufficient complexity, something worth calling "democracy" is there too, even if it's in disguise and keeping an eye out for trouble. These complicated understandings, from which something even occasionally emerges that might even be a consensus -- precisely because they touch on everything that people think and do, precisely because they exist and perpetuate themselves even beyond the extent of the participation of any one individual -- have the potential to stretch out, both into the past and into abstraction, and to do things there.
Maybe the gaps I've identified aren't really bridgeable. But I can speak about them as gaps, I can at least put in words the idea of bridging them, even if that underlying idea is incoherent. Incoherent or not, I can speak about it, and so can you, and so can anyone else who wants to play the constitutional interpretation game. The ideas may be perfectly incoherent the whole time, but they may still work, because the arguments we have over the interpretation of incoherent ideas have enough in common with the arguments we have over the Constitution itself, over laws and values and ethics, over what our government ought to do and what it will do.
The argument is the source of meaning, to which both the Constitution and the arguments over the Constitution's meaning refer back. People are machines for finding patterns, even where no patterns exist; political theory and constitutional interpretation are just two putative patterns for the same conversation about people and how they treat each other.
Even a stopped clock is right twice a day.
I heard him out, then explained that I could go into the library at school and read the paper for free there.
Even on weekends, he asked?
Even on weekends, I said.
And that was that.
In 1934 he [William Langer, Governor of North Dakota -JG] was indicted for conspiring to interfere with the enforcement of federal law by illegally soliciting political contributions from federal employees, and suit was filed in the State Supreme Court to remove him from office. While that suit was pending, he called the State Legislature into special session. When it became clear that the court would order his ouster, he signed a Declaration of Independence, invoked martial law, and called out the National Guard. Nonetheless, when his own officers refused to recognize him as the legal head of state, he left office in July 1934. As with Adam Clayton Powell, however, the people of the State still wanted him. In 1937 they re-elected him Governor and, in 1940, they sent him to the United States Senate.
The quotation is from Justice Douglas's concurring opinion in Powell v. McCormack. The House of Representatives, on misappropriation of funds grounds, had refused to seat Powell, although it was undisputed that he was the legitimate winner of his district's 1966 election to the 90th Congress.
Powell sued, although the Supreme Court didn't hear his case until 1969. In the meantime, Powell had won the 1968 Congressional election and taken his seat in the 91st Congress. Langer shows up in the Powell opinion because the Senate debated for over a year whether to seat him, ultimately deciding that it lacked the power to exclude him.
He offered me no monthly fee and five cents a minute.
I told him that my current plan has no monthly fee and costs me three cents a minute.
After the briefest of pauses, he said that he couldn't beat that rate, thanked me for my time, and rang off.
Thirty seconds, start to finish.
Too bad that the other end of the power cord, the end that plugs into the rice cooker itself, is perfectly symmetric.
I don't get much competition for the chairs; I appear to be the only person here who has grasped the idea that you can put a laptop in your lap. The tables and desks are all at ADA and fire code compliant heights, a good four or five inches above the ergonomically advisable height for an average typist. So, while most of the others in here are at tables further down the room, I'm happily settled in an easy chair right near the center of the reference books.
Okay, so the whole reading room is lined with reference books. But the indices are in the center. Makes sense, really. The indices are way stations on the journey from one serially-numbered volume to another; the place you wind up going every time your citation-check needs to run forward, rather than backward. Thus, people on citation chases tend to follow predictable paths that lead them past my throne -- another reason I choose to sit there. Things are just more sociable, even if everyone passing by has a distracted look about the corner of the eye.
The procedure right now is pretty standard. Professors are assigning the same introduction-to-research assignments to entire classes. The result is that a fair number of people will come through, looking for the same particular data, pulling the same particular books from the shelves, going through the same mental gyrations as they figure out how the numbering schemes and citation indices work. On a given day, a steady trickle of folks will pass my chair, each coming from the same direction and going to the same shelf. When they arrive, they pull down the same book, open to the same page, and give the same grunt of satisfaction as they scribble down some numbers. Then they move off, headed for their next checkpoint.
Orienteering, plain and simple.
First, there was Dance Freak, a rhythm game in the Para Para Dancer tradition. Each player stands in front of a pair of motion sensors; at appropriate times, cued by the usual little discs rising up the screen, one sticks out a hand to "break" one of the invisible beams. Dance Freak is distinguished from other such games (at least in my experience) by having sensors both above and below: if the disc is blue, you put your hand over the sensors, but if the disc is red, you put your hand underneath.
I guess you could string together fancy display routines, the way good DDR players can, but a few structural features of Dance Freak would argue against its potential to hit the big time. First, standing there moving just your hands feels more akin to boxing than anything musical. And second, the sensors are set insanely low: to get my hands under them, I had to bring my elbows to my sides. Any dance move you can do with your elbows touching your waist is lame. I suspect that this game was built for Korean youth who haven't hit their growth spurt.
And then there was Landing High Japan. You get to be the pilot, bringing passenger jets in for landing (the game includes takeoff segments, too, but the takeoffs are pretty easy). Sounds like fun, yeah?
Well, there's at least one problem. Even the "easy" game setting still requires you to work the rudder pedals in addition to the stick. For fascinating physical reasons, the relationship between a plane's bank (controlled by the stick), its yaw (controlled by the rudder), and its rate of turn (what you need to affect to fly where you want to) can be quite non-intuitive.
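For the curious, the non-intuitive part can be made concrete. In a coordinated turn (the thing the rudder is there to help you achieve), the rate of turn depends on bank angle and airspeed, not on the rudder directly. Here's a minimal sketch of that textbook relationship; the figures are rough illustrative values, not anything taken from the game:

```python
import math

def turn_rate_deg_per_s(bank_deg: float, airspeed_m_s: float) -> float:
    """Rate of turn in a coordinated turn: omega = g * tan(bank) / v."""
    g = 9.81  # gravitational acceleration, m/s^2
    return math.degrees(g * math.tan(math.radians(bank_deg)) / airspeed_m_s)

# A big jet on approach at roughly 75 m/s, banked 25 degrees,
# turns at only about 3.5 degrees per second -- nearly half a
# minute to change heading by 90 degrees.
print(round(turn_rate_deg_per_s(25.0, 75.0), 2))
```

The point being: the faster you fly, the more sluggishly a given bank turns you, and the rudder's job is just to keep the turn coordinated, which is exactly the piece of intuition an arcade player doesn't have.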
Pilots spend a long time learning to rewire their intuition. In addition to careful instruction in the physics of flight and how to execute smooth turns (instruction utterly lacking from Landing High Japan), they practice their "non-coordinated flight" (i.e., with an independent rudder) at high altitudes, well away from the complex pressures of landing. Only once they know how the plane will respond are they at the controls anywhere near the ground.
That is to say, any fool can bring the simulated 747 in for a straight-in landing (and this fool did), but if you're required to bank and turn as part of the approach, your 747 is likely to make the acquaintance of the ground (as mine did). I didn't see anyone else even remotely interested in playing Landing High Japan: I could easily imagine the machine eating quarter after quarter as a would-be player grappled with that merciless learning curve.
Then again, in a gaming culture just a little more obsessive than the one at a suburban American mall, such could be the recipe for an enormous hit.
Running men out of town on a rail is at least as much an American tradition as declaring unalienable rights.
-- Garry Wills
Apparently, they don't get this particular combination very often.
Such an opposition, I think, is neither obvious nor fundamental; saying that it is amounts to nothing more than a rhetorical gimmick to shift our attention away from the magician's sleeve at precisely the moment our civil liberties vanish, perhaps never to be seen again.
In Article III of the Articles of Confederation -- version 1.0 of the U.S. Constitution -- one finds the lovely phrase "security of their liberties." By the second version, feature-creep turned this nugget into the more familiar (and inspiring) "secure the Blessings of Liberty to ourselves and our Posterity."
"Security of their liberties" may not ring as proudly as its successor, but its ambiguity gives it a beauty all its own. On the one hand, there is the surface meaning: that liberties stand in need of protection -- indeed, that protecting liberty is the true point of "security." But that ambiguous "of" goes further; the phrase can also be read to suggest that liberties confer security.
Both ideas, of course, are all over the place in the political discourse of the Revolution and the early American republic. And good ideas never go out of style, I should hope.
The scene that has bothered me since I first saw the movie, so many years ago, is the one that kills off all the good guys save only the two above-the-line A-list stars, who, of course, must live to save the day. A supposedly elite team of S.E.A.L.s has managed to get itself surrounded by bad guys with big guns in a room with almost no cover. They are asked to surrender, but their commander refuses, shouting back that "[he] cannot give that order." The predictable firefight predictably ensues, with the predictable outcome: the shootout is a shutout.
In the Simpson-Bruckheimer parallel universe, this is supposed to go down as an act of courageous sacrifice, stalwart military men standing true until the very end. Me, I think the situation calls for a posthumous court-martial. Commanders are responsible for the safety of their soldiers, including the obligation not to put them in harm's way if no good can come of it. The only military goal served by not ordering a surrender in this particular situation was forcing the enemy to use up some of their ammunition.
I would like to think that the real Navy doesn't teach its commanders to throw away lives on obscure points of honor. To quote no less a battle-crazed madman than George S. Patton:
Now I want you to remember that no bastard ever won a war by dying for his country. He won it by making the other poor dumb bastard die for his country.
But, of course, The Rock isn't a movie about the real Navy. It's a movie about a boyish fantasy world in which testosterone reigns supreme, cooler heads never prevail, and no real man ever passes up a chance at a pissing contest. In every Bruckheimer movie supposedly about the great and crushing responsibilities borne by America's armed forces -- from Top Gun to Crimson Tide -- we find self-righteous alleged heroes willing to subordinate this allegedly sacred mission to their own need to be the top dog on the scrap heap.
The military plays along with these war games; they rent the fancy toys and fact-check the authentic-sounding military lingo. Good for business, or so they say; you should see what it does for recruiting. But the true price of these circuses is a particularly galling form of cultural amnesia: a public that thinks military life ought to be like this: all macho, all the time. On which belief, I think, it's not unreasonable to pin some of this country's current war mania.
We've been force-fed images of men in uniform acting as jousting bull moose would, to the point at which images contrary to this overall picture simply fail to register. Saving Private Ryan was a profoundly anti-war movie, but four short years later, who remembers this inconvenient detail about the movie that set the modern standard for intensity of on-screen carnage?