Saturday, December 31, 2005

It's time for ICBW's annual prediction review and update. First, last year's predictions....
  • There will be a major effort to rejuvenate the Middle East peace process, following the upcoming Palestinian Authority elections. The effort will come to absolutely nothing, and attempted Palestinian terrorist attacks and Israeli military incursions into Palestinian towns will be roughly as frequent at the end of 2005 as they are today.
  • I'll score myself "correct" on this one. Although I obviously didn't expect the Gaza withdrawal, it doesn't appear to have had much effect.
  • The coming election in Iraq will completely change the characterization--though not the actual character--of the American involvement there. A new Iraqi government, dominated by the Shiite slate, will assume power, and invite the US military to remain to continue its reconstitution and training of the Iraqi army and security forces, and to help protect the country from "insurgent groups", who will increasingly be labeled as foreign-supported and even foreign-staffed. Syria will be the main target of this rhetoric, because it's much weaker than Iran. But no action beyond diplomatic and economic sanctions will be taken, partly because the US is loath to expand its military burden, and partly because maintaining the Syrian fig leaf will be too useful for both the US and the new Iraqi government as an excuse to support the continued presence of US troops.
  • Mostly right, but the "Syrian fig leaf" was largely unnecessary, in retrospect, as blaming the local Al Qaeda forces was enough of an excuse for the Americans to remain.
  • Messrs. Chavez, Khamenei and Kim will remain in power yet another year, barring natural or accidental death. Likewise, Kofi Annan will survive the oil-for-food imbroglio and remain UN Secretary General.
  • An easy gimme, as it is every year.
  • The US stock markets will decline overall in 2005. The real estate market will (finally) peak and decline. Inflation will be kept under control by lower oil prices and rising interest rates, but the dollar will continue to fall--though somewhat less precipitously--and economic growth will slow accordingly as interest rates, the drop in the markets, and the ripple effects of the slowdown in China all take their toll.
  • This one wasn't exactly spot on, I'm afraid. The main stock market indices closed virtually unchanged from the beginning of the year. The real estate market has indeed shown signs of peaking late this past year--almost exactly at the moment when I bought into it, in fact (friends can confirm that I privately predicted that outcome). Likewise, oil prices peaked in the third quarter and have declined since, easing inflation fears. However, both markets unquestionably ended 2005 above their final 2004 levels. Inflation was, indeed, kept under control, but the US dollar staged a stunning rally throughout the year, and economic growth remained strong both in the US and China.
  • An event will occur in 2005 that will blossom into a huge, crippling scandal for US president Bush--though possibly not until 2006.
  • I don't think the various Iraq- or surveillance-related scandals qualify--overzealous conduct of the War on Terror is what the public expects from the Bush administration, and he's therefore unlikely to be blamed for it. A corruption scandal, on the other hand--possibly related to the Abramoff affair--is a possibility in the coming year.
  • A scandal will also damage the career of television psychologist "Dr. Phil" McGraw.
  • Right on schedule....

    Now for my nothing-much-changes predictions for 2006....

  • The focus in Iraq will turn to the corruption and ineffectuality of the newly elected government. The US will reduce its troop presence in the country, claiming as its justification the improving order-keeping capability of the Iraqi armed forces and police. But the insurgency will continue, public safety will be roughly as poor as it is today, the economy will weaken, and dissatisfaction with the squabbling, dealmaking and embezzlement of their elected politicians will sour the Iraqi public on democracy. Iraq will likely be in for a few more years of the same before eventually finding a Saddam-lite/Vladimir Putin-equivalent to de-democratize and restore order.

  • There will be neither a military strike at Iran's nuclear facilities (recent reports notwithstanding), nor a test of an Iranian nuclear weapon. By the end of 2006, things will be as they are today--Iran apparently barrelling towards nuclear capability, and the West hemming and hawing over what to do about it.

  • Some kind of joint Hamas-Fatah government will be established in the territories controlled by the Palestinian Authority. It will be about as effective as the current government--that is to say, not at all--at battling the chaos that reigns there. A small-but-steady stream of rockets and mortars will rain down on the areas of Israel bordering on Gaza. Israel will respond with regular incursions. Terrorists will continue to attempt to infiltrate into Israel, and will occasionally succeed. Further Israeli withdrawals, if they occur, will have as little effect as the Gaza withdrawal.

  • President Bush's popularity will continue to strengthen during the first part of the year, but will nose-dive thereafter in response to the scandal referred to in my aforementioned prediction for 2005. The Republicans will lose ground, though not control, in both houses of Congress in the November elections.

  • Interest rates will rise a little, then decline as economic growth slows. The US dollar will resume its descent after its 2005 rally. The housing market will continue to cool. Oil prices will be down slightly, and inflation will remain under control. The major stock market indices will end the year down modestly from their current levels.

  • The Conservatives will form the next Canadian government.

  • The usual suspects--Chavez, Khamenei and Kim--will remain in power, barring natural or accidental death. Kofi Annan will remain UN Secretary-General until his term expires at the end of 2006. And despite signs of trouble, Syrian president Bashar al-Assad will also remain in power through the coming year.

  • The controversial surprise popular culture hit of the year will be a fictional work (film, television broadcast, play, novel, song, or perhaps some other form) that dares to vividly depict Islamist terrorists of Middle Eastern origin as evil villains.

  • As always, these predictions were arrived at by an untrained unprofessional with a closed mind. Don't try this at home, if you value your reputation and dignity.

    Thursday, November 10, 2005

    Helpful words from the Imams of Paris, and Frank Sinatra
    Recent efforts by Imams to stop the rioting in France remind me of a story I once heard a comedian tell about how Frank Sinatra saved his life.

    The comedian was walking down an alley in Las Vegas when suddenly two goons jumped him and started beating him horribly. Just when he was sure he was going to die he heard Frank Sinatra walk up and say, "That's enough".
    "And that's how Frank Sinatra saved my life."

    The "Black leadership" is usually similarly helpful whenever the Black street gangs (AKA youths) riot in the United States. After doing their best to prevent riots by announcing the time and place the riots will occur unless their demands are met, the leaders then "appeal for calm" as they negotiate with the government. Their power is increased as much by their ability to stop riots as by their ability to start them. We know what side they are really on by what they don't say: "There is no excuse for violence; if it occurs, the police must protect innocent people by using as much force, including deadly force, as necessary."

    The (immediate) result of these French riots will be assertions that there is no connection with Islam, while at the same time the government negotiates with the Imams.

    Saturday, November 05, 2005

    Why are they rioting?
    I'm sure we've all read this many times before:
    "I've always loved visiting that part of the city. The people are so nice and friendly, and I feel warm and safe whenever I go there. Now I hear that those people are rioting. They must have very legitimate and severe grievances if they're willing to engage in so much violence."

    Actually, I've never read that. Incredibly, I have often read:
    "I've always dreaded visiting that part of the city. Vicious gangs control the streets, and even the police are afraid to go there. Now I hear that those people are rioting. They must have very legitimate and severe grievances if they're willing to engage in so much violence."

    Tuesday, November 01, 2005

    As far as I can tell, this New York Times article is completely serious. Personally, I'm not yet convinced that the earth's millions of square miles of desolate, uninhabited frozen tundra are in imminent danger of being replaced by lush, pleasant temperate zones. But who wants to take that chance?

    Thursday, September 22, 2005

    Is the de facto opening of the border between Egypt and the Gaza Strip a danger to Israel? In the short term, of course, it will allow an influx of weaponry and even terrorists into Gaza, with predictable consequences. Similarly, the rise of Hamas as a political force in Gaza seems to bode ill for the prospects of peace in the region.

    But in the long term, the real problem is more basic: a horribly overcrowded, economically hopeless, implacably hostile and violent enclave adjacent to Israel. And there really is only one plausible solution to it: the voluntary emigration of large numbers of its residents.

    As long as the Egypt-Gaza border was closed, and a nominal PA government held out the hope of relative calm in Gaza, emigration was fairly difficult (hindered by Arab countries not eager to absorb Palestinian refugees), and the inflow of international aid, combined with Israeli restraint, made life there just bearable enough to keep residents from fleeing in droves. However, the ascendancy of Hamas, with the resulting escalation in violence--which will make life distinctly more unpleasant for Gaza residents--and the accompanying reduction in international aid and support, together with the open border, may be enough to get the needed exodus started.

    Now, lest I be misunderstood, I'm not advocating ridding Gaza of its Arabs, or even of its Palestinians. Rather, my point is that Gaza's terrorist infrastructure (effectively tolerated by Israel) and international aid spigot are keeping a population there that otherwise would long ago have begun decamping for more promising locales. The trapped residents, however, end up with no opportunities beyond terrorism, because that is what the people who control the money and the weapons want. As long as Israel lacks either the will or the means to destroy Gaza's terrorist infrastructure, its only hope is that the flow of international money eventually gets cut off, the terrorism ends up damaging Gaza itself more than Israel, and an open border then allows residents who have lost their reasons to stay, to leave.

    Thursday, September 01, 2005

    I visited New Orleans once, fourteen years ago. Apart from my standard tourist experiences of New Orleans' "SeedyLand" theme-park-of-tawdriness atmosphere, my most vivid memory is of my departing flight from New Orleans airport. I've never in my life had a more chaotic flying experience. Passengers were shooed without boarding passes onto the already-late plane, where harried flight attendants looked around for an empty seat to park them in. My (assigned) seat happened to be occupied by a gentleman who smugly informed me that since his assigned seat had been taken by someone else, I'd just have to find another one. (Eventually, one of the harried flight attendants investigated, and discovered that the woman occupying the gentleman's seat was in fact on the wrong flight.) As we took off, the head flight attendant announced over the PA system that she was terribly sorry about the confusion, that never before in her career had she experienced anything remotely like this, and that she hoped we wouldn't take this experience as representative of her work, her crew or her airline. She then informed us that our flight had been randomly selected by the airline for a passenger survey, and that questionnaires would be passed around regarding the quality of the service we had received. Although I was appalled by the whole spectacle--and indicated as much on the survey--at least some of my fellow passengers seemed to think that it wasn't all that atypical of flights leaving New Orleans.

    The shock and outrage over New Orleans' post-Katrina woes reminds me of that experience--and not just because of the chaos. What's just as striking to me is the unique scrutiny to which the local, regional and national disaster response infrastructure is suddenly being subjected. Thirteen years ago, when Hurricane Andrew devastated South Florida to the tune of 25 billion dollars, a quarter of a million people were left homeless, and over a million stranded without power--a quarter-million of them for over a week--as looters ran rampant and government personnel at all levels struggled to maintain order and care for the victims. But I don't remember a national outpouring of fury at the authorities' slow and imperfect response to that disaster. (In fact, compared with the police failures during the LA riots earlier that year, the response to Hurricane Andrew was a model of smooth efficiency.) Rather, the nation's attention focused on the (largely private, charitable) relief effort, as millions in donations were raised to help the victims recover.

    If I had to identify a single turning point in America's standards of disaster response, I would guess that the date was September 11th, 2001. 9/11 was a unique emergency, in many ways. It occurred in wealthy lower Manhattan, amidst high-rent office buildings, in a city which had already pioneered the restoration of order and safety in urban America. It was a foreign attack--that is, a powerful unifier of the citizenry--rather than an impersonal act of nature. And the terrible toll it exacted on emergency response personnel--at least some of which, we should remember, might well have been avoidable--no doubt heavily dampened criticisms of the emergency response effort in its aftermath. The result has been the popularization of an idealized image of disaster response that has greatly heightened national expectations for the government's competence and efficiency in such situations.

    Now, there's nothing inherently wrong with high expectations, and if the result is that governments across America become much more prepared, rehearsed, and capable of overseeing a speedy, effective response to any foreseeable disaster, then there's every reason to rejoice. But that process is only beginning, and while I can understand the indignation that many feel at the government's haphazard response to Hurricane Katrina, I'm more puzzled by their apparent surprise.

    Friday, August 26, 2005

    Welcome, Slate readers! For your enjoyment, here are some links to a selection of my previous rants on the subject of the judiciary's naked usurpation of democracy...

    Wednesday, August 24, 2005

    Gideon Rose, editor of the prestigious Foreign Affairs magazine, demonstrates the brain rot at the heart of the field of international relations in his latest New York Times op-ed. Rose contrasts the "overenthusiastic idealists of one variety or another" who "have gotten themselves and the country into trouble abroad" with the "prudent successors brought in to clean up the mess" by applying "realism, American foreign policy's perennial hangover cure."

    Rose defends his analysis with a truly bizarre retelling of the history of American postwar international relations, administration by administration. He begins with the "idealistic" Truman administration, whose most famous, most spectacular foreign policy success--the Marshall Plan, and its accompanying reconstruction of Europe--he omits completely, so that he can concentrate on blaming it for the Korean War (which was in fact also tremendously successful, until the Chinese intervention, which might well have been avoidable). Similarly, Rose credits the "realist" Eisenhower administration with ending the Korean War early on--and then ignores the entire seven subsequent years of mixed successes and failures, from the Iran and Nicaragua operations through the Suez fiasco to the Lebanon intervention and the U2 incident. He sums up the "idealistic" Kennedy and Johnson administrations by citing only the Vietnam War escalation, leaving out the Bay of Pigs invasion, the Berlin crisis, the Cuban missile crisis and the Dominican Republic intervention, among other events.

    The "realist" Nixon/Ford/Kissinger administrations are evaluated in slightly more detail: the end of the Vietnam War (inexplicably judged a success, despite the ignominious abandonment of Saigon), detente with the Soviet Union (also judged a success, for reasons that aren't entirely clear), and the opening to China (admittedly a more unequivocal success). Left out are five years of war in Southeast Asia, the Yom Kippur War and subsequent Middle Eastern peace negotiations, and numerous other smaller-scale involvements. Rose then proceeds to leave the Carter administration entirely unexamined, except to dismiss it as an "idealistic" failure. Likewise for the Reagan administration (although he admits it "got lucky", with that whole collapse of Communism thing).

    The "realist" first Bush administration is credited, strangely enough, with the fall of the Soviet Union and the reunification of Germany--the latter being an event over which the administration actually had little influence, and was quite divided internally. It is also credited with "the reversal of the occupation of Kuwait" (note the clever wording--leaving Saddam Hussein in power is not mentioned). The "idealistic" Clinton administration is then faulted for its readiness to confront Serbia and China, though, according to Rose, it was ultimately forced to reduce its ambitions to "marking time while dithering" (by bombing Belgrade into submission?). Again, the major focus of that president's foreign policy--the Middle East peace process--is omitted, as is Clinton's failure (understandable, I think, given the times) to deal with the looming threat of Islamist terrorism.

    Finally, Rose arrives at the current administration, which he writes off as "merely one more failed idealistic attempt to escape the difficult trade-offs and unpleasant compromises that international politics inevitably demand". Yet he concedes that the Afghan campaign was a success, while labeling the one in Iraq a "costly and bungled occupation" (without mentioning the relatively inexpensive and well-executed invasion).

    The underlying problem with Rose's analysis is that his division of American foreign policy into "realist" and "idealist" camps is ridiculously simplistic. Idealists, for example, differ widely in their goals: "diplomatic idealists", to coin a phrase, seek to strengthen global institutions, whereas "democratizing idealists" seek global democratization within nation-states. Non-idealists include political nationalists, who wish to expand America's raw political power abroad; economic nationalists, who consider American economic benefit their primary goal; and realists, who seek a global "equilibrium" in which no one country is dominant. Finally, members of any of these groups can differ over means to their particular ends: hawks prefer the aggressive use of military force; diplomatists prefer to use diplomacy; materialists prefer economic means; and isolationists eschew the use of any means as a waste of effort and resources.

    The position that Rose adopts, judging by his evaluation of past postwar administrations, boils down to a rejection of democratizing idealism as an end and hawkishness as a means. But it's not a consistent rejection--he applauds the democratization of the East Bloc, for instance, as well as America's military actions in the first Gulf War and the invasion of Afghanistan. Rather, he implicitly criticizes, above all else, protracted American involvement in military combat--particularly "preventive war"--and moral fussiness about doing diplomatic business with brutal but powerful foreign governments.

    But the former is simply a preference for quick victory--such as the Afghan campaign and the first Gulf War, of both of which Rose apparently approves. In other words, his guiding philosophy is, "wars are costly and painful, if you don't win them quickly and easily"--hardly a shocking conclusion. Likewise, his fondness for negotiating with dictators apparently met its limit with Mullah Omar of Afghanistan--a fellow who turned out to be, in hindsight, pretty easy to knock off. The lesson appears to be, "shmooze with despots--unless you don't have to"--again, not exactly a penetrating insight.

    In fact, even the above highly simplified capsule history of postwar American foreign policy provides some genuinely useful lessons, if one is prepared to look objectively for patterns:
  • Know whom, exactly, you're fighting against. The US has gotten in the most trouble--Korea, Vietnam, Iraq--when failing to understand and deal with the full breadth of the enemy. Nothing ruins an easy victory like a neighbor's intervention on the other side.
  • The stability of non-democratic regimes is notoriously difficult to assess. The Soviet Union is the classic example, but plenty of other resilient-looking dictatorships--Spain, Portugal, Nicaragua, Mexico, Argentina, South Korea, Taiwan--crumbled remarkably swiftly, while others that have seemed shaky for years--Zimbabwe, Burma/Myanmar, Jordan, Saudi Arabia, and many others--have nevertheless limped along as if they were always rock-solid. Democratic countries, on the other hand, seem to decay in a very predictable way, viz., Russia, Venezuela.
  • Negotiating with dictators doesn't necessarily mean being nice to them. Ronald Reagan negotiated with the Soviet Union while calling it an "evil empire". Nixon was no less blunt--nor any less willing to deal. On the other hand, the leaders who improved the atmospherics, hoping for genuinely improved relations--Carter with Brezhnev, Clinton with Arafat--generally ended up getting shafted.


    Now, these are all rather trite, simple observations, and I'm sure that a serious international relations scholar, studying postwar American foreign policy in more detail, could arrive at more sophisticated conclusions. Why, then, do they instead so often end up sounding pretty much exactly like Gideon Rose?

    Tuesday, August 09, 2005

    You may never in your life hear any music more bizarre and disturbing than this.

    Sunday, July 17, 2005

    As a result of certain recent events, I happen to have been strategically placed to witness the United Kingdom's reaction to the recent bombings of the London transport system. A few observations:

  • The reputation of the BBC notwithstanding, I found the political bias of British television news to be comparable to that of American network news--and hence, quite possibly more in line with its intended audience's views, given the generally more left-leaning British body politic. Of course, the bombing itself may have shifted the tenor of the reporting I saw--it was for the most part resolutely unsparing in its portrayal of the act and its perpetrators as evil and horrible, and only occasionally drifted into soft-on-Islamist-extremism hand-wringing.

  • It has often been noted that British journalists are much less deferential to their interview subjects than their American counterparts. And it's true that their questions are often delivered in an antagonistic, skeptical tone, casting doubt on each premise that the interviewee has been summoned to defend. In fact, though, the confrontational displays I saw were all just a cleverly disguised version of the American softball style: the interviewer mimicked a caricature of the interviewee's opponents, threw out a few lame challenges in that voice, and let the interviewee demolish them. This format naturally pleases everyone involved: the interviewer ends up looking tough, the interviewee looks clever and articulate, and the gullible audience imagines it's just seen a lively debate.

  • An illustration of the difference between American and European attitudes towards authority: I saw a discussion of civil liberties in the wake of 7/7 (as they call the bombing there) in which an outspoken civil liberties advocate suggested that perhaps the proposed budget for a national ID card system might be better spent on more police. How many American civil liberties activists have ever advocated such a thing, even right after 9/11?

  • One of the oddest aspects of the coverage was the enormous attention devoted to the supposedly terrifying possibility, later confirmed, that the bombings were the work of "suicide bombers". At first, I thought the prospect highly unlikely. After all, the use of suicide bombers makes a certain amount of sense when attacking targets that are well-defended against more conventional tactics--say, when the only way to breach the security around a Marine base is to drive a truck into it and detonate it immediately, or when a highly alert population will recognize and flee any package that's not in anyone's immediate possession. The London train and bus bombings, on the other hand, hit soft targets that would have been equally vulnerable to timed or remotely-triggered package bombs. I can personally vouch for the fact that many crowded trains in the UK have luggage racks where large bags sit far away from their owners--possibly miles away, for all anybody can tell. Of course, all that may soon change. But I suspect that any terrorist campaign that results in the British being as vigilant and preoccupied with security as Israelis already are, would be judged a resounding victory for the terrorists.

    The suicide bombing tactic, on the other hand, has many disadvantages. For one, extra effort has to be made to cultivate and indoctrinate operatives until they can be relied upon to "push the button" when the time comes. They must also be trained to appear completely normal, to avoid suspicion, for some substantial time interval immediately prior to blowing themselves to smithereens. Moreover, soon after their training and indoctrination have been completed, they are quite emphatically eliminated in a manner that renders them categorically unavailable for use in future operations. In the process, their bodies become rich sources of information for investigators--indeed, several of the London bombers were apparently identified by documents they still carried in their pockets when they attacked. Obviously, the sooner the perpetrators are identified, the sooner their activities can be traced, and their colleagues and mentors investigated.

    Presumably, the reason for opting for suicide bombers in this case was their predicted psychological impact, and judging by the recent reporting, this strategy has been quite effective so far. However, I expect that its shock value will wear off fairly quickly, as it becomes clear that impressionable fanatics willing to die at their leader's request are simply not all that hard to find. After all, so far, more citizens of Western countries have killed themselves because an odd-looking elderly man told them they would thereby meet an alien spaceship associated with a comet, than have done so because someone told them they would thereby enjoy 72 virgins in paradise.
    Tuesday, July 05, 2005

    In honor of my honeymoon, this blog will now observe two weeks of silence.

    Friday, July 01, 2005

    Speaking of the real estate bubble, The Volokh Conspiracy's Todd Zywicki has an, er, interesting take on "exotic mortgages" such as "interest only" mortgages, where the purchaser spends a few years only paying interest on the mortgage, before actually beginning to pay down the principal and acquire equity in the home. The problem with paying down the principal on your mortgage, he writes, is that it "would increase the covariance of your human wealth with your home equity"--that is, it causes you to invest in the same locale where you earn your income, increasing your vulnerability to a local economic downturn. (A bad local economy might cost you both your job and a chunk of the property value of your home.) Zywicki's solution: "we want to encourage individuals to minimize their principal payments on their houses and instead to diversify that money into financial assets that will diversity [sic] away from the risk associated with human capital."

    Well, there's a perfectly reasonable, long-accepted way for households to "minimize their principal payments on their houses and instead to diversify that money into financial assets that will diversity [sic] away from the risk associated with human capital." It's called, "renting". Holders of interest-only mortgages, on the other hand, pay a premium--usually a hefty one--over the market rent for the same property. That premium in effect purchases an option to buy the house at a later date at a fixed price. (After all, until the mortgage-holders start paying the principal, they don't actually own any equity in the house.) This option is purely speculative: it amounts to a bet that the same house will be more expensive to buy once the principal payments start. If house prices stay constant, then the difference between the interest paid and the cost to rent an equivalent house during the "interest only" period will have been completely wasted.
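
    To make the arithmetic concrete, here is a rough back-of-the-envelope sketch in Python. Every figure in it--the house price, the interest rate, the market rent, the length of the interest-only period--is a made-up assumption for illustration, not a number taken from Zywicki's post or from actual market data:

    # Rough comparison of an interest-only mortgage vs. renting the same house
    # during the interest-only period. All figures are hypothetical assumptions.
    house_price  = 300_000   # assumed purchase price
    annual_rate  = 0.06      # assumed mortgage interest rate
    monthly_rent = 1_200     # assumed market rent for an equivalent house
    io_years     = 5         # assumed length of the interest-only period

    monthly_interest = house_price * annual_rate / 12   # interest-only payment
    monthly_premium  = monthly_interest - monthly_rent  # extra cost vs. renting

    # If prices stay flat, this premium buys no equity at all--it is the price
    # of a purely speculative bet on future appreciation.
    total_premium = monthly_premium * 12 * io_years

    print(f"Monthly interest-only payment: ${monthly_interest:,.0f}")
    print(f"Monthly premium over renting:  ${monthly_premium:,.0f}")
    print(f"Premium paid over {io_years} years:    ${total_premium:,.0f}")

    With those made-up numbers, the premium comes to $18,000 over five years--money that, if house prices stay flat, buys nothing but the right to start paying full price later.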

    It's true that a mortgage is always an investment on margin, and hence at least partly speculative. But option plays such as interest-only mortgages are purely speculative, and one wonders what ordinary householders are doing engaging in them--or what supposedly economics-savvy law professors like Zywicki are doing endorsing them.

    It's amusing to see conservatives taking a brief breather from fulminating against the Imperial Judiciary (a favorite hobby of mine, too, as readers well know) to lambaste the Supreme Court's recent failure of Imperial will in the Kelo decision. Apparently, when it comes to the municipal government requiring you to sell your house to a developer, it's just fine, say conservatives, for the Supreme Court to run roughshod over elected, democratically accountable representatives of the public. I've long predicted that control of the nominations process would give conservatives Strange New Respect for judicial tyranny. It's always fun to be proven right.

    But Marty Lederman and Nicole Garnett of SCOTUSBlog raise an interesting question: why has this particular ruling--which was completely in line with precedent, and hardly unexpected--aroused such righteous ire from conservatives?

    To me, the answer is obvious: real estate is the cultural-financial obsession of the moment, the way stocks were eight years ago. For the Supreme Court to rule at this time that the government can take away your house--that is, not just your life savings, but your hot investment, your ticket to riches, your chance at an early, cushy retirement--is threatening in a way that it wouldn't have been, say, ten years ago. After all, the law still requires that "just compensation" be paid to the owner of confiscated property. In normal times, that would no doubt be enough to ease any worries. But today, when everyone assumes that the real estate price elevator only goes in one direction--that is, up--the current market value of a home hardly seems like "just compensation". Why, who knows how soon the price of that confiscated home will be double, triple, quadruple what it is now?

    Sunday, June 26, 2005

    Bloggers are quite understandably having lots of fun ridiculing this unbelievably asinine New York Times op-ed column, in which a Muslim Arab-American Harvard student contrasts her chance encounter with a polite, gentlemanly Al Gore with the "everyday hostility" she claims to endure as a headscarf-wearer in Cambridge, Massachusetts. Sure, it's fun to laugh at the author's sophomoric superficiality, spiced with a pinch of breathless celebrity-fawning, as she smugly pronounces herself "frustrated and angry", and charts her declining "faith in the United States", while watching the news headlines on the televisions at the gym where she works out. But the bloggers' fully justified mockery misses by far the bigger irony in the piece.

    The author's "alienation" from "what is supposed to be [her] country", she writes, is partly a product of the "stares" she receives when she wears her headscarf. Now, I'm sure it's no fun for her to be stared at. But why, then, would she also object to "America's conflict with the Muslim world"? After all, she describes herself as "fresh-faced and comfortably trendy", and her encounter with the former vice president takes place in a gym where she goes "just about every morning"--alone. Doesn't she realize what her fate would be as a single woman visiting a male-attended gym, alone, in "comfortably trendy" clothing--with or without a headscarf--in just about any Arab country? Stares, I daresay, would be the least of her worries.

    Luckily for her, she lives in America, along with "10 million Arab and Muslim-Americans, many of whom are becoming increasingly withdrawn and reclusive". If she thinks she's entitled to blame her government's foreign policy for making her withdrawn and reclusive, perhaps she should try living for a while in her ancestral homeland, where being withdrawn and reclusive is, for women, a form of self-preservation, not a petulant political reaction.

    Saturday, June 25, 2005

    Where Have All the Frogs and Toads Gone?
    According to this article:
    Frogs and toads are becoming extinct all over the world. It's the same magnitude event as the extinction of the dinosaurs.
    What is the reason?
    Toads and frogs are dying out under pressure from the expansion of agriculture, forestry, pollution, disease and climate change, NatureServe said.
    In other words, they haven't got a clue.

    Tuesday, June 21, 2005

    Canada's Vaunted Health Care System
    There has been some confused discussion lately about the issue of private health care in Canada. Canadians are the most confused about this, and they have been ever since most private medical care was made illegal about 20 years ago. In fact, most of them (based on a survey I've done consisting of frequent chats) don't even know that most private medical care is illegal, and the reason they don't know this is because the language used to discuss the issue is so bizarre and obfuscatory.

    Instead of announcing that they were outlawing most private medical care, the Canadian government(s) merely announced that they were eliminating "extra billing". "Extra billing" was the practice of a doctor billing the government health plan for a service, and at the same time billing the patient an additional (usually small) amount. If a doctor charges a patient and does not charge the government then this is completely private medicine and should not be called "extra billing". Nonetheless, under the guise of eliminating extra billing, virtually all private medicine was outlawed in Canada. (Fine Print: private medicine that was not claimed to be covered by the government, such as dental work and cosmetic surgery, remained legal. Also, very recently, some private MRI clinics have been permitted in some provinces.) After being used as a subterfuge for outlawing private medicine, the phrase "extra billing" was never (to my knowledge) used again. Instead, whenever the issue arose, it was described as being about "single-tier versus two-tier health care". "Single-tier" was the good thing, the status quo, where every Canadian (except the rich, the powerful, the well-connected, the ...) had access to the same level of health care. No one discussed why two-tier (which should, of course, be called continuous-tier) is okay when it comes to housing, vacations, etc. Outside of Canada, the term "single payer health care" was used to describe the Canadian system, virtually guaranteeing that almost no one would understand the situation.

    This article in the New York Times discusses a recent Quebec court decision that kinda says that the government doesn't have the right to outlaw private medical care. Or maybe it says that the government doesn't have the right to outlaw private medical insurance (implying that private health care is already legal, just not private medical insurance). Or maybe it says that it's okay to outlaw private medicine as long as the government provides good public medical care. Actually, I have no idea exactly what the court decided, and the author of the article clearly couldn't care less.

    But he does care to tell us that Canada's health care system is "vaunted" and "is broadly identified with the Canadian national character" and that this decision is a "blow to Canada's health system". The fact that "Canada is the only industrialized country that outlaws privately financed purchases of core medical services" is presented as a positive fact about its national character, and no explanation is given about why the existence of private medical care in England, France, Germany, Sweden, ... has not been a blow to those countries' health care systems.

    The article tells us that this vaunted system has long waiting lists for "diagnostic tests and elective surgery", but it omits the fact that patients often wait over two months for cancer treatment. Or the fact that my friend who was unable to move because of sciatica was told that he had to wait over a month -- and risk paralysis -- before he could see the appropriate specialist. (He received faster treatment because of the connections of one of his friends.)

    And what -- except for the horror of two tiers -- is the reason for outlawing private medical care? After all, one would think that for any given level of public expenditure, allowing private care would improve the level of medical care for everyone. The only reason given in the article is that "a two-tier system will draw doctors away from the public system, which already has a shortage of doctors ...". This is not the way things work with housing or mail delivery, but I suppose it's possible that the quality and quantity of doctors are fixed and independent of demand. Except that two of the three main Canadian parties -- the Liberals and the NDP -- actually claimed that there were too many doctors! And the NDP government of Ontario actually took positive measures to reduce the number of doctors:
    By reducing the number of first-year medical students this fall, the University of Toronto takes a leap toward improving the health-care system.
    The fact is that Canada will have as good a health care system as the government is willing to fund, and allowing private health care will only make it better.

    I feel that this horrible "too few doctors" argument holds the clue as to why private health care was made illegal and why the public system became so bad. In fact, according to my completely unscientific study, the public health system started to go into sharp decline right around the time that private care was outlawed. The government(s) wanted to reduce public health care expenditures by reducing the quality of health care offered, and my theory is that they felt this would be better accepted by the people if the people had no basis for comparison to see just how bad things were becoming. Of course people knew what was available (for many) in the United States, but this idea still worked for many years and allowed the state of Canadian health care to deteriorate badly.

    The argument that became "don't allow private medical care, or else people won't support our crappy public system" started out as "don't allow private medical care, so that people will allow our system to become crappy".

    Saturday, June 18, 2005

    Mark Kleiman is concerned about what he calls the "TGIF problem": that people don't enjoy their jobs enough. "What does make me unhappy," he writes, "is that, in what is by some measures the richest nation in the history of the planet, most people don't really enjoy the activity that occupies about a third of their waking hours."

    It's widely considered an ideal, of course, to be able to do for a living what one loves to do anyway. But let's face it--we can't all be prostitutes.

    That may sound like a flip comment--okay, it is a flip comment--but I would argue that there's also an important truth behind it. The usual explanation for why most people don't hold jobs that involve doing what they love is that there's nothing that they love to do that they could actually get paid to do. But that's usually not true. In fact, almost every hobby has a corresponding job, and many of those jobs in fact employ millions of people. Lots of hobbyists, for example, enjoy gardening, or woodwork, or auto mechanics, or cooking--or, for that matter, sex.

    But doing these things as a hobby is very, very different from doing them for a living. A hobbyist pleases him- or herself, makes his or her own hours, and works on the projects--and even the aspects of a given project--that he or she enjoys working on, skimping on the parts that are less enjoyable. A professional, on the other hand, must please employers or customers, by working on giving them what they want, when they want it, the way they want it. It's hardly surprising, then, that professionals often view their work as little better than an exhausting succession of unpleasant tasks.

    Indeed, doing what one loves, but for a living instead of for fun, often drains all the joy out of doing it--my flip comment above being an obvious example. And to me, that's much worse than having to do a tolerable job one would not normally perform voluntarily. After all, in the latter case one is still free to enjoy one's unspoiled passions fully in one's spare time.

    Kleiman points to professors--many of whom prefer to keep working through retirement age, as "emeriti"--as exemplars of the ideal of doing one's favorite activity for a living. But academia is something of a special case. There really aren't that many full research professors, as a fraction of the population. They are carefully selected by a rigorous process that requires them to demonstrate their willingness to obsess about their work for years on end. And they are then given tenure, which allows them considerable freedom to tailor their work in just about any way that pleases them.

    Society can afford one or two sparsely-populated vocations like that. But a world of tenured doctors or tenured middle managers or tenured electricians would look a lot like--well, the world of tenured public school teachers. And thank goodness most of North America functions better than its public schools.

    Of course, there are people who are blessed with a love of a particular activity so intense that they even enjoy pursuing it professionally, despite all the drawbacks of doing so. Such people are very lucky--as are people with a particular talent so immense that they can exploit it on their own terms, as if it were a hobby, and still make a good living. But for the rest of us, the sensible course of action is to find a real job, and save our cherished pastimes for after hours.

    So I say to the myriad editors who are no doubt ready to offer me untold riches, if only I'll agree to turn my blogging hobby into a career in professional opinion journalism: save your breath--this blogger's not for sale.

    Friday, June 10, 2005

    The 1961 newspaper article by Peter Benenson that led to the founding of Amnesty International is a fascinating historical document. It harkens back to the era when "freedom of speech" was a rallying cry on the left, rather than the right--presumably because at the time, the left considered itself an insurgent political movement rebelling against an established order, not as the established order itself. But Benenson's vision of a worldwide campaign for the release of "prisoners of conscience"--citizens jailed and mistreated merely for voicing political dissent--also appealed to the right wing of his day, as a way to call attention to political repression behind the Iron Curtain. No doubt this bipartisan appeal was one of the chief reasons for AI's remarkably long-lived international prestige. It also helped stave off the "mission creep" that can send altruistic organizations off on foolish tangents, or diffuse their efforts until their purpose loses all coherence.

    The end of the Cold War, though, took away the external pressure on Amnesty to maintain its bipartisan discipline. The result can be seen in AI's 2005 Secretary-General's message, where Amnesty's original goal of defending dissidents against political repression barely rates a mention. Instead, we read about "economic and social rights", "HIV/AIDS, illiteracy, poverty, child and maternal mortality, and development aid", and "violence against millions of women...including genital mutilation, rape, beatings by partners, and killings in the name of honour". The two countries that come in for the most criticism are Sudan and the US--Sudan for mass murder in Darfur, and the US for denying due process in some cases to "'suspected terrorists'". And most famously, the Secretary-General asserts that the US detention facility at Guantanamo Bay for suspected Al Qaida and Taliban members deemed strategically important "has become the gulag of our times".

    Plenty of commentators have given AI grief over this comparison, attacking it as a gross error of scale and a symptom of leftist bias. Meanwhile, AI's defenders on the left have treated the analogy as a minor exaggeration distracting from the real issue of American brutality. But few--the Washington Post's Anne Applebaum and Reason's Cathy Young, both accomplished Sovietologists, being notable exceptions--have called attention to by far the most objectionable aspect of the Secretary-General's odious comparison: that she lumped the Soviet Gulag, whose main function was to punish Soviet citizens suspected of insufficient loyalty to the Soviet regime, together with Guantanamo, whose detainees, whatever else one might think of their treatment, are most certainly not being held in an effort to silence domestic political opponents. And no other commentator that I've seen has recognized this failure to distinguish between these cases as a betrayal of AI's own founding ideals. Indeed, in the world of activist organizations--led, of course, by the UN--the entire concept of freedom from political repression has been so completely subsumed under the ridiculously broad rubric of "human rights" as to have disappeared from view, just as it disappeared from the AI Secretary-General's message.

    Consider, for example, Amnesty's appeals for June 2005. They include a German-born Turk arrested in Pakistan and held at Guantanamo; a Palestinian under "administrative detention" in Israel; and a woman accused of complicity in a series of bombings in Uzbekistan. Let us put aside, for the moment, the odd fact that all three appeals involve Muslims imprisoned by the US or one of its allies on suspicion of participation in terrorist activities. The more striking fact is that two of the three are being held by countries with a democratic government and abundant political freedom. A glance at previous months shows a similar pattern: of the fourteen appeals for the four months prior to June, exactly one involved imprisonment of a political dissident in a dictatorship, and two more involved claims of religious persecution in a dictatorship. (Oddly enough, both involved central Asian former Soviet republics.) Three more involved general brutality by repressive dictatorships; the rest--fully half--involved democratic countries with free speech and an active popular press.

    Now, the point here is not to assert that democracies are somehow morally infallible. The point is that governmental misdeeds in a free, democratic country are subject to public discussion, judgment and action--that is, they are political matters. There is no need for a worldwide appeal on behalf of the victims of injustice in such countries--local journalists and "human rights" activists are generally happy to do the job of publicizing such cases themselves. Amnesty International was established in the first place on the premise that only external pressure, such as from an international campaign, can coax a despotic regime to refrain from brutality towards its domestic political opponents. Evidently, the members of AI itself have forgotten the original premise behind their organization's existence.

    In a kind of final repudiation of its original mission, the head of AI's American branch recently called on all 190 signatories to the Geneva Conventions--free democracies and repressive tyrannies alike--to arrest and prosecute senior US government officials, such as Defense Secretary Donald Rumsfeld, for their role in alleged incidents of torture in US detention facilities. That is, the AI that was founded to rally the world's democracies against repressive governments that imprison political opponents, now rallies repressive governments to imprison the freely elected political leaders of the world's preeminent democracy. Peter Benenson's vision may live on, but the organization he founded has, for all intents and purposes, destroyed itself.

    Tuesday, June 07, 2005

    Something is Very Wrong With 100% of the Faculty at MIT
    Many people are writing about all the "diversity" nonsense at Harvard -- notably Heather MacDonald -- but almost no one is complaining about the sins of her sister institution just down the river.

    In a long article in the MIT Technology Review with the frightening title "Diversity Pledge", we learn that the MIT faculty has unanimously pledged that "within a decade, MIT is to double the percentage of minorities on its faculty and triple the percentage of minority graduate students". The article is mind-numbingly awful and you should read as much of it as possible. John Rosenberg, the only person I know of who has written about it, points out that it is hard to reconcile this Pledge with the fact that its author, Prof. Rafael Bras, insists that people be "treated fairly within his department regardless of race or gender".

    But let's back up and try to understand the Pledge a bit better. The reader may well be confused, since (probably) the MIT faculty and student body consists mainly of minorities. Now many people who are not scientists use words like "minority" to mean whatever they want them to mean, without even thinking about it. MIT-trained scientists, on the other hand, do it completely consciously. The article tells us that
    MIT defines underrepresented minorities as African Americans, Native Americans, and Hispanics -- populations with disproportionately few members working in science, technology, engineering, and math. (Any subsequent mentions in this article of "minorities" at MIT refer to people from these groups.)
    So the real reason that minorities are underrepresented at MIT is simply that they are defined that way. Nonetheless, most of the article is a totally incoherent attempt to offer explanations for, and solutions to, this problem.

    Apparently part of the problem is MIT's fault, as evidenced by the harrowing tale of Prof. Bras. Bras experienced racism in Boston/Cambridge when he came there as a student, and
    MIT, Bras says, has been no different in this regard than the rest of the country. Which is why, when he became a professor of civil and environmental engineering and, later, head of the department, he made diversity his 'personal agenda item' ... In 2003, Bras became chair of the MIT faculty, and his scope was now Institute-wide.
    Perhaps this is the "unconscious, invisible bias" that Nancy Hopkins (more about her below) complains about. And there are other, equally blood-curdling examples of what happens to minorities at MIT:
    Faculty and grad students interviewed for this article say their experiences have been mostly good. But discrimination still exists. "We are a large community here, and there have been incidents of blatant racism. There have been insensitive remarks," says Bras, who adds, however, that he personally has found MIT to be very supportive. Hernandez tears up when he talks about the "warm, welcoming community" he discovered at the Institute. Assistant dean of grad students Jones, who is African American, says the isolation and self-consciousness he felt as a graduate student were what all students -- minority or not -- feel at times. That said, he notes that minorities do not have the networks in place to help them deal with those feelings.

    "A nationwide shortage of minorities in science and engineering" is mentioned briefly, but otherwise is not addressed. The article states that for potential graduate students, money is also a problem, since minorities tend to be poor. No solution is offered to this, and thankfully so, since if MIT started paying higher stipends to poor graduate students then the wrong kind of poor person just might wind up coming, yielding the wrong kind of Diversity. The fact that "In some minority communities, an advanced degree in science or engineering carries distinctly less status than other professional options" is also part of the white man's burden at MIT. In addition, "many minorities think getting into MIT is unachievable", although no explanation is given why so many minorities-that-we-don't-call-minorities don't feel that way.

    But it is clear that all these problems must be overcome, and the Pledge must be successful. Why? A semi-coherent argument is made that more racial diversity is important for urban studies. This is contrasted with the fact that "there is no 'black' physics", but the physicists and everybody else appear to feel otherwise. "Look at all this incredible talent we're missing" and "different perspectives enhance creativity" are typical comments. Also, "many faculty agree that putting more effort into diversifying the university only increases the quality of its students and faculty"; presumably all the rest of the faculty support the effort even though they don't believe this. And then there is that pressure coming from above.
    Federal funding agencies, such as the National Institutes of Health, now require grant seekers to show that they are actively working to improve diversity before their funding requests will be certified. Although NIH doesn't set quotas, it wants to see evidence that grant recipients are succeeding in their outreach efforts. "The sabers are rattling," says Isaac Colbert, MIT's dean for graduate students. Funding agencies have to meet their own diversity goals in hiring, Colbert says, and since they "get their employees from us, they need people to fill the pipeline."
    Thank God for all those non-quotas.

    So how should MIT go about achieving its Pledge? Of course, no wailing about missing minorities would be complete without comparing them to and confusing them with Women, and here the infamous Nancy Hopkins enters the picture. Professor Hopkins (discussed here) is the main author of a bogus report accusing MIT of discriminating against women. "That report rallied women faculty to the cause and helped improve the campus cultural climate. Since then the population of women faculty members has grown by roughly 25 percent". No mention is made of why it didn't rally 100% of men faculty to the cause. Nor is it pointed out that the case of women is very different from the case of minorities since, as noted above, minorities are not being discriminated against; in fact, it appears that 100% of faculty wish to discriminate in favor of them.

    Oops! Not quite, for nowhere in the article is it stated that MIT will have lower standards for minorities. That would be as unthinkable as quotas. Instead, MIT will use lots of different kinds of "outreach" programs, similar to those that succeeded in increasing the number of women. (Presumably these were more successful than merely removing discrimination against women.) For women, this worked as follows:
    The department created a central search committee to coordinate all searches and staffed it with faculty members particularly committed to diversity. The committee was aggressive in seeking female applicants. Members called colleagues at peer universities and asked them to recommend recent graduates or otherwise tracked down attractive candidates and invited them to apply. The committee also expanded the scope of its search to include related disciplines where there were larger populations of women. If a talented female candidate didn't fit the criteria for one job, a committee member might recommend her for another or even to another department.

    Special funding available to departments through what is called a "bridge slot" also helped the School of Engineering attract more women. A decade ago, the MIT provost's office decided it would fund faculty positions for five years if they went to senior women. After that, funding would have to come out of the annual budgets of the new hires' departments.
    Standards aren't lowered, just changed. And men aren't discriminated against; there's just no budget left for them.

    Here are some examples of outreach programs for minorities.
    One of the programs is a 10-week summer research program for minority college sophomores and juniors; according to Jones, 17 percent of the students who have participated in it have ended up at MIT. ... And at an MIT open-house weekend last April, the department hired an organization that specializes in giving tours of Boston that highlight the city's racial and ethnic diversity. The department's efforts have paid off: 80 percent of minority applicants accepted last year decided to study at MIT.
    The first of these programs seems to me to be illegal, even by the weird standards of the Supreme Court. And what about the 83% that did not end up at MIT? My guess is that most of them were rejected, but it would be nice to know. Also, is 80% higher than the standard fraction of accepted minority applicants who come to MIT? And what became of the other 20%? I suppose it's possible that those who were not saved by MIT wound up in prison, but I suspect they are more likely to be found at Harvard, Stanford, etc.

    Of course, none of this can work without internal reorganization and internal pressure at MIT.
    Regarding faculty searches, all departments must share information about faculty and graduate student candidates with [provost] Brown. He in turn must report annually to the MIT faculty, the Faculty Policy Committee, and the Council on Faculty Diversity about the progress schools and departments have made. ... Bras and the faculty chair-elect (Bras steps down this summer) have been visiting every department "so the issue won't get lost," he says. ... After the resolution passed, the office hired Chris Jones, ... a new assistant dean who will work both internally and externally to recruit minority students to MIT. ... Internally, the Institute needs to make minority students feel more welcome, says dean for grad students Colbert. ... "I'm talking about bringing out the human element." One way to do that is to find what Colbert calls a "faculty champion" in each department, someone who will work with the Graduate Student Office to reach out to potential students.
    At least MIT isn't imposing quotas on departments. And best of all, there is no mention in the article of "affirmative action".

    Sunday, June 05, 2005

    Most Harmful Books?
    This list of the most harmful books of the 19th and 20th centuries has gotten a lot of attention and a lot of criticism (Darwin is harmful?), but I have two severe criticisms of my own: one inclusion and one exclusion.

    The wrong inclusion is Hitler's Mein Kampf. What harm did this book do? Almost nobody bought it when it first came out. It became popular later because Hitler was popular, but there is little reason to believe that many people actually read it, or that it had any influence whatsoever. On the other hand, if enough people had read it, Hitler's plans might have been taken more seriously. On a related note, I think the world would be a better place if more people read what this man has to say.

    The book that I feel is wrongly excluded from the list is Erich Maria Remarque's "All Quiet on the Western Front", published in 1929. A typical view of this book, given by an Amazon reviewer, is: "This is the greatest war novel ever because Remarque's book is anti-war."

    Anti-war. What does it mean? It doesn't mean "war is hell", a sentiment no one disagrees with. "Anti-war" means that in any war, both sides are equally wrong, and either side would be better off making whatever unilateral concessions are necessary to end the war, or to make sure that it doesn't start in the first place. Unlike Mein Kampf, Remarque's book was very widely read, and its lessons were very well learned in Europe (except for Germany, where Hitler burnt it). War Must Never Happen Again. Or at least it must be delayed as long as possible. And made as severe as possible. And millions of people must die.

    Of course, there were anti-war influences besides Quiet, and I'm not even sure how Remarque meant his book to be interpreted. But to the extent that any one book can kill 50 million people, this book did.

    Saturday, May 28, 2005

    Oxblog's David Adesnik discusses the interesting question of whether the "Star Wars" saga--and the recently-released Episode III, in particular--is a political allegory, and if so, what lesson can be drawn from it. Apparently, George Lucas himself sees in it echoes of the US military invasion of Iraq, and an answer (of sorts) to the question, "how does a democracy turn itself into a dictatorship?". Others draw the same analogy, but from the opposite side, arguing that the Jedi that Lucas so obviously admires are in fact the real villains of the story.

    Well, now that I've finally seen the film, I can say with some confidence that a few of the lines Lucas slipped into the film are intended to express his own rather conventional Hollywood political take on current events. The Jedi and the Senate are said to stand for "democracy" and the Republic, but the Senate is manipulated and eventually controlled by an evil chancellor/emperor, who conjures up a war for political reasons, and uses it to arrogate dictatorial powers to himself. At one point, a prominent female character even gives voice to some vaguely antiwar sentiments that sound more like the naive mumblings of a Hollywood starlet talking about Iraq than the words of a Senator of the Galactic Republic battling a rebel army of androids. The intention is clear, if somewhat jarring, given the tone of the rest of the film.

    (Then again, it's hard to call anything "jarring" in a movie with dialogue this stilted, or with plot and characterization this incoherent. For example, Yoda briefly descends at one point into Hollywood-guru Buddhism, urging Anakin to "let go of everything you fear to lose"--not long before slashing people dead left and right with his light saber, apparently out of fear of losing the Republic.)

    The political digressions notwithstanding, though, the overall story is completely unrelated to anything so modern and complicated as democracy. In fact, it represents much more conventional legendary fare: the age-old struggle between church and state. The Republic is best considered as a traditional tribal/national federation--think Iroquois Federation, Holy Roman Empire, ancient Israel under the prophets--in which clans, tribes or nations sharing a common religious allegiance agree to manage their conflicts within the framework of their shared traditions. The Jedi are a religious warrior class--think Crusaders, Jihadis, Maccabees, Samurai--whose code includes protection of the existing political order. The emperor is a warrior king--think Nebuchadnezzar, Alexander, Genghis Khan, Napoleon--who seeks personally to exercise absolute power over as much territory as possible, and views both religion and traditional tribal politics as a threat to his absolute sovereignty. This type of struggle among religious obligation, tribal loyalties and military ambition probably dates back to the struggle between prehistoric warrior-chieftains and medicine men, runs all the way through the Bible, and infuses numerous famous legends, including, for example, those of Robin Hood and King Arthur.

    Of course, none of the parties in this eternal struggle has much to do with modern democracy. All, in fact, depend on undemocratic, or rather pre-democratic, notions of government--theocracy, tribal authoritarianism, or military tyranny. As I've mentioned before, democracy is a highly counterintuitive invention that proves useful, and therefore durable, once it catches on, but actually seems quite odd and implausible before that point. It therefore makes terrible material for a supposedly timeless heroic legend. I suspect that whatever his politics, George Lucas instinctively knew how foolish it would have been to try to build his space-opera mythos around it.

    Thursday, May 19, 2005

    Mark Kleiman points approvingly to public policy professor Michael O'Hare's rather bizarre proposed solution to the problem of music copyright protection: simply have the government pay musicians to create music. The idea is for music makers to allow anyone to record, play, trade, distribute and otherwise use their music--in return for a government subsidy, which would presumably be in proportion to their music's "popularity".

    Let us put aside for a moment the enormous temptations to corruption and politicization that would beset any government effort to compensate artists based on their popularity. What purpose is such a scheme even meant to serve? What is the social value of compensating musicians for their popularity, anyway?

    Traditionally, the function of intellectual property is to spur creativity. Patents, for instance, create an incentive for inventors to come up with useful inventions, by granting them a limited-term monopoly on their proceeds. By the same token, compensation for the creation of music would presumably have the goal of spurring the creation of new "good" music. How important is that?

    One way to answer that question is to look at forms of music where intellectual property is less prominent a factor. Instrumental jazz, for example, is largely a performance art form, with many accomplished players obtaining little compensation from recordings. Those who earn a living from their music do so largely through performance fees--as they presumably would under the proposed government compensation scheme, since their music has little mass appeal.

    Now, one could certainly say that instrumental jazz musicians don't work overly hard at creating wildly popular music. After all, their product is appreciated only by a niche audience. But they work very hard indeed to meet their own, and their audience's, definition of quality. And by that standard, they certainly succeed. Few fans in cities with any significant jazz audience find themselves bereft of good jazz musicians to listen to. And the top musicians tour constantly all over the country, bringing their music to clubs even in relatively obscure locales.

    Why, then, would we expect musicians who produce popular music not to do the same? After all, their music isn't any harder to create than jazz. Their concert revenues would almost certainly be larger than those of jazz musicians. And they would have an additional incentive--great fame and widespread adulation--that jazz musicians can never realistically hope for.

    Of course, most of the money earned today as a result of intellectual property rights to music goes into music marketing anyway, not music creation. And perhaps music marketers would work less hard under a regime in which music is not granted intellectual property rights. Is that really such a bad thing?

    Monday, May 16, 2005

    There's no question that Newsweek's now-retracted Koran-in-the-toilet story was a journalistic fiasco. But numerous commentators have made two assertions with which I must sharply disagree:

  • Newsweek's mistake was so egregious because its article accused America of a heinous act; and

  • Newsweek is responsible for the rioting, and resulting deaths, that followed its publication of the false story.

    Let's deal with the second point first. Even if the riots really were provoked by nothing more than the Newsweek article, Newsweek can hardly be held responsible for the violent acts of others who happened to have read mistaken reports in its pages. But in fact, there's every reason to believe that a murderous mob stirred up in response to a Newsweek article would have been happy to have been stirred up by just about any convenient pretext. Indeed, there's ample precedent for Islamist riots responding to completely false reports generated by untrustworthy sources. Often, these riots are carefully planned and prepared for reasons that have nothing to do with the ostensible provocation, and there are plausible claims that this one, too, falls into that category. Under the circumstances, Newsweek's role in causing these riots was most likely pretty minor.

    As for the supposed horror of flushing a Koran down the toilet--well, all I can say is: if it had been a Bible flushed down the toilet, the Supreme Court would have stepped in by now to prevent the US government from objecting to it. Sure, US interrogators flushing a Koran down a toilet might offend some devout Muslims. But then, interrogating suspected terrorists no doubt offends some devout Muslims, as well. What matters is not whether violent Muslims in Afghanistan object to American interrogation techniques, but rather whether mistreating a Koran is within the bounds of American standards of interrogation--which, after all, include some fairly harsh treatment of the interrogated prisoners themselves. And while it's hard for me to imagine Koran-flushing actually being a useful technique, neither do I see a compelling reason for excluding it on moral grounds, rioting Afghans notwithstanding.

    Thursday, May 12, 2005

    There's something about procedural rules that brings out the hypocrite in just about everyone. Power Line's Scott Johnson catches distinguished Minnesota politician Walter Mondale succumbing to the temptation, with the support of his local newspaper. Mondale recently published an op-ed defending the Senate filibuster, now under attack by Republicans frustrated over their inability to confirm Bush administration judicial appointments. The newspaper, the Minneapolis Star-Tribune, now concurs with Mondale. A little over a decade ago, though, the paper condemned a 1993 Republican filibuster of a Clinton administration spending bill. And less than a year later, it endorsed a campaign to end the filibuster altogether. Mondale, in fact, was one of the leaders of the successful 1975 Democratic move to reduce the number of senators needed to break a filibuster from 67 to 60.

    Of course, those were different times. In 1993 and 1994, for example, a Democratic president confronted a rambunctious Republican minority who were as enthusiastic about exercising their minority prerogatives as the Democrats were about limiting them. Not only were Republican senators filibustering Clinton spending initiatives, but conservatives were enthusiastically embracing the general idea that certain government decisions--in particular, tax increases--should be subject to a minority veto. The Heritage Foundation endorsed a Constitutional amendment to enforce the principle in 1996, and the Republican-controlled Senate voted in support of it in 1998. Numerous states--mostly Republican-controlled--have enacted some form of it.

    One can, to be sure, make specific arguments for limiting the supermajority requirement to tax increases, or judicial appointments, or even poet laureate nominations. But in practice, any faction that sees itself as secure in its dominance tends to oppose supermajority requirements altogether, while factions that see their grip slipping, or not yet firm, will see supermajority requirements as an important equalizer against the powers-that-be.

    In truth, there's no magic to the number 50%--or 60%, or 67%, for that matter. As long as the number remains fixed, a responsive political class will generate the required majorities to meet the insistent demands of voters. The real problem occurs when changes in the rules outpace the political system's ability to react. With respect to control of the judiciary, that point was reached decades ago, when the American judicial custom of arbitrarily overruling the democratically accountable branches of government expanded from an occasional hobby into a full-time profession. The resulting disruption of the political equilibrium--reflected in the Bork and Thomas nomination fights, the Republican stalling tactics of the 1990s, and the recent filibusters--is still far from being resolved.

    Monday, May 09, 2005

    Economist/blogger Brad de Long has stirred up something of a controversy by lambasting an essay by German author Gunther Grass published on the occasion of the sixtieth anniversary of the defeat of Nazi Germany. Most commentators have understandably balked at de Long's characterization of Grass as "crypto-Nazi scum" (which de Long appears to have retracted). But the heated partisanship of the conflict (Grass is a lifelong leftist; de Long is of the center-left) has obscured some of the, shall we say, oddities of Grass' essay.

    To begin with, consider this passage:
    In the cold war that quickly followed, German states that had existed since 1949 consistently fell to one or other power bloc, whereupon the governments of both national entities sought to present themselves as model pupils of their respective dominating powers. Forty years later, during the glasnost period, it was in fact the Soviet Union that broke up the Democratic Republic, which had by that point become a burden. The Federal Republic's almost unconditional subservience to the United States was broken for the first time when the Social Democratic-Green ruling coalition decided to make use of the freedom given to us in sovereign terms 60 years ago, by refusing to allow German soldiers to participate in the Iraq war.
    It's true that lots of European intellectuals like to think of themselves as rebels against American hegemony. Still, Grass is writing on the occasion of the anniversary of the fall of the Nazi regime. Is that really the right time to bemoan the subsequent sixty years as a period of German subservience to foreign powers? What alternative fate, exactly, did Grass consider appropriate for Germany in 1945?

    Then there is this passage:
    Fifteen years after signing the treaty on unification, we can no longer conceal that despite the financial achievements, German unity has essentially been a failure. Petty calculation prevented the government of the time from submitting to the citizens of both states a new constitution relevant to the endeavors of Germany as a whole. It is therefore hardly surprising that people in the former East Germany should regard themselves as second-class Germans.
    Again, although one can scarcely fault Grass for worrying about geographic inequalities in modern Germany, is the anniversary of the fall of the Nazis really the right moment to present such problems as a failure of "German unity"?

    Grass continues:
    Now, I believe that our freely elected members of Parliament are no longer free to decide. The customary party pressures are not particularly present in Germany; it is, rather, the ring of lobbyists with their multifarious interests that constricts and influences the Federal Parliament and its democratically elected members, placing them under pressure and forcing them into disharmony, even when framing and deciding the content of laws. Consequently, Parliament is no longer sovereign in its decisions. It is steered by the banks and multinational corporations - which are not subject to any democratic control.
    Once again, it's common on the left to worry about corporate influence on the democratic process. But to use the anniversary of the end of German Nazism to echo old claims that the democratically elected German government is nothing but a collection of puppets of "banks and multinational corporations"? (At least Grass doesn't attribute any particular religion to the international capitalists who have supposedly hijacked German democracy. But still....)

    No, not every German nationalist socialist is a National Socialist. And Grass hasn't yet made the jump from the German far left to the de facto German far right (although he wouldn't be the first German artist to do so). But one might have expected a writer like Grass, who has made a career out of exploring the echoes of Nazism in postwar German culture, to be a bit more careful about keeping them out of his own pronouncements.

    Thursday, April 28, 2005

    If you find this blog nearly unreadable, there's now scientific (or at least scientific-sounding) support for your opinion. You can obtain it here.

    Wednesday, April 20, 2005

    Personally, I had neither any interest nor any opinion regarding the recent process that resulted in the selection of a new Pope. Nevertheless, I have to admit that when I heard the choice, I did have an emotional reaction, of sorts, albeit a rather crassly parochial one: Cardinal Ratzinger's elevation, I immediately thought, is obviously "good for the Jews".

    Just about any other candidate, elected Pope at this particular moment, would have plenty of reasons to make no end of trouble for Jews: a perceived need to appease virulently anti-Israel and anti-Semitic sentiment in Muslim countries, in the name of protecting vulnerable Catholic communities there; a desire to participate in Third World international politics--which, these days, teems with anti-Zionism and anti-Semitism--in order to properly represent the huge population of Third World Catholics; or simple recognition of the many eternally sore points of theological and political friction between Catholics and Jews. However, Ratzinger has one very compelling reason not to make trouble: a desire to avoid seeming to the world to be a product of his unattractive past.

    Of course, that doesn't mean that he's likely to be much of a help to the world's Jews. But then, not too many sensible Jews ever look to a pope for help. And it's no small comfort that this one--unlike so many of his predecessors--might actually try to refrain from doing much harm.

    Saturday, April 16, 2005

    "Bankruptcy reform" is a lot like tax rates--it's an opportunity for plenty of partisan ranting on behalf of bedrock principles, but it's really about marginal adjustments and practical outcomes, not hard-and-fast absolutes.

    On the right, Todd Zywicki argues (and argues and argues and argues....) that the new bankruptcy reform bill will help reduce bankruptcy fraud and abuse, thus lowering interest rates for honest borrowers and protecting individual, small-scale and non-profit creditors from bankruptcy-abusers. On the left, Paul Krugman and Mark Kleiman essentially follow the "cui bono?" path, and conclude that the bankruptcy reform bill is all a plot by consumer creditors to increase their profits by winning the right to squeeze their helpless, impoverished debtors even harder than before. What's missing from both of these arguments is a clear picture of what bankruptcy is for, and why one might want to tighten or loosen its rules.

    Bankruptcy is simply a standardization of the act of defaulting on debts. When a debtor defaults on a debt, then the creditor can go to court to recover as much as possible of the debt from the debtor's remaining assets. When there are multiple creditors, though, deciding whose repayment gets which priority up to what amount becomes quite complicated. Bankruptcy is a way of resolving this complexity--in effect, the debtor's current assets are divided up among the creditors according to certain rules, and the debtor's debts are thereby ruled discharged.
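
    To make the division rule concrete, here is a minimal sketch of the pro-rata idea, assuming a single class of unsecured creditors. Real bankruptcy law layers priority classes, exemptions and secured claims on top of this, and the names and figures below are purely illustrative:

        # A minimal, illustrative sketch: one class of unsecured creditors,
        # no exemptions, no priority rules.
        def divide_assets(assets: float, claims: dict[str, float]) -> dict[str, float]:
            """Split the debtor's remaining assets among creditors in proportion to
            the size of each claim, never paying anyone more than they are owed."""
            total = sum(claims.values())
            if total <= assets:
                return dict(claims)  # enough assets to make every creditor whole
            return {name: assets * owed / total for name, owed in claims.items()}

        # Hypothetical example: $30,000 in assets against $100,000 in claims.
        print(divide_assets(30_000, {"bank": 60_000, "card issuer": 30_000, "landlord": 10_000}))
        # Each creditor recovers 30 cents on the dollar; the remaining debt is discharged.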

    Of course, the devil's in the details. When can a debtor declare bankruptcy? Which of the debtor's assets are the creditors then allowed to divvy up? Which debts and obligations are thereby discharged? The answers to these questions can be more debtor-friendly--say, giving the debtor maximum flexibility in choosing when and how often to declare bankruptcy, requiring that only certain specific assets be seized, and specifying that all obligations are thereby fully discharged. Or they can be more creditor-friendly--say, severely limiting the debtor's option to declare bankruptcy, requiring that all present and future assets be prospectively seized, and only allowing a few debts to be thereby discharged. Where the laws stand on these questions thus determines a balance between debtors' and creditors' interests, which can be shifted in either direction at any time, for political or economic reasons. The latest "bankruptcy reform" bill, for instance, would shift the balance slightly further towards the creditors' interests in certain ways.

    The bill is a response to a recent significant rise in the rate of bankruptcies. The bill's supporters argue that "abuse" of the law is increasing, as bankruptcy becomes less of a cultural stigma, and that the resulting hesitancy on the part of lenders may reduce the availability of credit to "honest" borrowers. The bill's opponents respond that the rise in bankruptcy is a result of increased "sub-prime" lending--that is, lending to borrowers who were higher bankruptcy risks in the first place--and that creditors are simply trying to avoid having to pay the price for their reckless lending practices.

    Rather than attempt to assign blame for the rise in bankruptcies, it would be worthwhile to ask whether they're a problem in the first place. In fact, the rise in bankruptcies is a result of increased lending to risky borrowers--but the lenders weren't simply being foolish or reckless. Rather, their behavior is a perfectly sensible response to the financial revolution of the '80s and '90s--the same one that helped trigger today's housing and mortgage boom.

    In the last couple of decades, it has become legally and technically possible to "repackage" debt more flexibly than ever before. For example, mortgages were once issued by individual institutions, who stood to lose substantial amounts of money if more of their mortgages defaulted than they had expected--say, as a result of a local economic downturn. Today, however, mortgages can be "bundled" into "mortgage-backed securities"--bonds whose value is based on the combined future mortgage payments of many different borrowers. These bonds can then be sold off to multiple investors, spreading the risk of any one institution's mortgage portfolio over perhaps hundreds or thousands of institutions. As a result, any individual institution's risk is greatly reduced.
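
    A toy simulation makes the diversification point concrete. The parameters here are invented for illustration (a 5% chance of default per loan, independent across loans), but they show why an institution holding small slices of many pools faces far less year-to-year variation in losses than one holding a single local portfolio outright:

        # Toy illustration of risk-spreading through bundling (made-up parameters).
        import numpy as np

        rng = np.random.default_rng(0)
        N_LOANS, P_DEFAULT, YEARS = 1_000, 0.05, 10_000

        # Institution A holds one whole pool of 1,000 local mortgages.
        # Institution B holds a 1/100th slice of 100 independent pools
        # (100,000 loans in all), so its loss rate tracks the larger combined pool.
        a = rng.binomial(N_LOANS, P_DEFAULT, YEARS) / N_LOANS
        b = rng.binomial(N_LOANS * 100, P_DEFAULT, YEARS) / (N_LOANS * 100)

        print("A, 99th-percentile default rate:", round(float(np.percentile(a, 99)), 3))  # roughly 0.067
        print("B, 99th-percentile default rate:", round(float(np.percentile(b, 99)), 3))  # roughly 0.052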

    Of course, since financial institutions are in the business of taking on controlled amounts of risk in exchange for the chance of a profit, their response has been not to reduce the risk of their portfolios, but rather to seek higher profits by jacking their risk back up to its previous level. The obvious way to do this is to lend to higher-risk borrowers, at higher interest rates, and then reduce their exposure to its previous level using the repackaging trick.

    Something very similar has happened in the consumer lending business. Those "sub-prime" lenders--credit card companies that sign up hordes of questionable credit risks--are engaged in exactly the same game as the mortgage issuers: they repackage their customers' future credit card payments as bonds, then sell them off to multiple buyers, spreading out the risk to the point where any one buyer's exposure is bearable even at high default rates--that is, at high bankruptcy rates. And since these higher-risk loans also carry higher returns--poor credit risks are always charged higher interest rates--these bearable-risk, high-return bonds are good business for everyone. That's why those millions of "pre-approved" credit card applications keep flowing through the mail, even as the bankruptcy rate increases.
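
    Some back-of-envelope arithmetic--with hypothetical rates, not actual industry figures--shows why this can be good business even at default rates that sound alarming:

        # Hypothetical rates, for illustration only.
        def expected_yield(interest_rate: float, default_rate: float, recovery: float = 0.0) -> float:
            """Approximate one-year return per dollar lent across a large pooled portfolio:
            interest collected from borrowers who pay, minus principal lost to those who don't."""
            return (1 - default_rate) * interest_rate - default_rate * (1 - recovery)

        print(f"prime portfolio:     {expected_yield(0.10, 0.02):.1%}")  # about 7.8%
        print(f"sub-prime portfolio: {expected_yield(0.28, 0.12):.1%}")  # about 12.6%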

    Under these circumstances, the argument for bankruptcy reform--that it's necessary to stem the rising tide of bankruptcies to keep credit within reach of higher-risk customers--is clearly nonsense. The increased availability of credit to less credit-worthy customers is driving the increase in bankruptcies, not being threatened by it. It's true that making bankruptcy more difficult and onerous for debtors would make credit even easier for high-risk borrowers to obtain. (So would the return of debtors' prison, for that matter: lenders would be confident of the willingness of even high-risk debtors to do everything possible in order to repay their debts and stay out of jail.) But the increase in the bankruptcy rate, far from being a harbinger of decreased credit availability, is actually a symptom of increased credit availability. Credit availability may or may not be at the "ideal" level today, but if you think we need more of it, then you should already be happy with the direction it's been going.

    The day bankruptcy rates drop, on the other hand--because creditors are too afraid to lend to all but the least risky borrowers--we might want to consider tightening bankruptcy laws, in order to boost lenders' confidence that they'll be repaid. That day may yet come--say, after some future economic downturn triggers a sharp rise in bankruptcies, panicking creditors into tightening their credit standards. (Indeed, I suspect a greater-than-expected jump in interest rates could well create such an outcome very soon.)

    However, that day is certainly not today. Creditors are hardly spooked by the current rising tide of bankruptcies--on the contrary, they fully expected it, have factored it carefully into their calculations, and are loving every minute of it.

    Tuesday, April 12, 2005

    Volokh co-conspirators Orin Kerr and Jim Lindgren have come out in favor of a proposal to limit the terms of US Supreme Court justices to 18 years. Co-conspirator Randy Barnett is also somewhat sympathetic.

    My response: why not 4-year terms for Supreme Court justices, commencing at each presidential inauguration? To paraphrase Shaw, we've determined what they are--now it's just the duration we're bargaining over.

    The justifications for 18-year terms--that they might make the Supreme Court more "modest" and responsive to public opinion, and presidents less inclined to appoint young, inexperienced justices, in the hope of influencing the Court for 50 years--effectively concede the point that the justices' role has long ceased to be anything even resembling neutral, dispassionate application of the Constitution and federal statutes. It is apparently now widely acknowledged that candidates are nominated by politicians for the sole purpose of enshrining particular political viewpoints--even constituencies' interests--in Constitutional and statute interpretation. Given, then, that the justices' role is a de facto political one, what's the argument for not making them every bit as accountable as any other political actor?

    Of course, once they're political actors, it's hard to see what benefits they provide that aren't already covered by the other democratically accountable branches of government. Then again, perhaps if they had simply stuck to being judges in the first place, and hadn't succumbed to the temptation to abuse their powers for nakedly political ends, then they might not seem so utterly superfluous now.

    Sunday, April 10, 2005

    Just for fun, here are some music trivia questions:

    1. What song, written in 1968 by film composer Piero Umiliani for a soft-core documentary on sex in Sweden, went on to become a worldwide hit, recorded by numerous artists--including the Muppets?

    (You can learn the answer using this search query. Or you can listen to a sample of the original recording here.)

    2. What 1970 soft pop hit by Roger Nichols and Paul Williams was originally written for a television commercial for the Crocker Bank?

    (You can learn the answer using this search query.)

    3. What well-known "folk song" was actually written in 1940 for the musical "Esterke", by the legendary New York Yiddish theater composer Shalom Secunda (who also wrote the Andrews Sisters' hit "Bei Mir Bist Du Shein"), with lyrics by Aaron Zeitlin that allegorically mock the doomed Jews of Europe for failing to escape to America?

    (You can learn the answer using this search query.)