Sunday, September 29, 2002

While American opinion-makers focus on Iraq, a radical change in the structure of US politics is quietly taking place in the Senate. That body's power to "advise and consent" regarding nominees to the federal bench has long been used as a tool with which to exert influence on the long-term political direction of the judiciary. But in the past, to admit to explicit political criteria for evaluating judges was something of a taboo. That changed in 1987, with the hearings on the Supreme Court nomination of Judge Robert Bork, during which senators openly proclaimed Bork's judicial views to be "outside the mainstream", i.e., politically unacceptable. The rejection of Judge Bork on political grounds caused a mild backlash against blatantly partisan evaluation of judges, and for a while tactics such as personal scandal-mongering (culminating in the Anita Hill travesty) and procedural foot-dragging were deployed as public cover for partisan conflicts over nominees. But with the latest Senate rejections of Bush appointees Charles Pickering and Priscilla Owen, and contentious hearings over recent appointee Miguel Estrada, senators now quite openly admit that their votes on nominees are political judgments based on the nominees' likely rulings on key public policy issues.

Now, given the massive aggregation of power by the US federal judiciary over the past fifty years, it is not surprising that the elected branches of government would attempt to increase their influence over the exercise of that power, using the only practical tool available--the judicial nomination process. (I predict a sudden upsurge of interest in judicial impeachments as well, if one party or ideological tendency should ever manage to reach a two-thirds majority in the Senate.) What the senators may not have considered, however, is that their politicization of the confirmation process could easily backfire: a judge vetted for political acceptability and endorsed by the Senate may subsequently consider him- or herself--and be considered by others--to have earned a mandate for even broader exercises of political power than today's judges normally consider legitimate. The result would be the worst of all possible worlds: government by appointed tribunes with the trappings of political legitimacy to justify their diktats, no democratic accountability, and no checks on their absolute power to "interpret" the law (not to mention the constitution) in ways that explicitly overrule the decisions of the elected branches of government.

Stephen Carter, in his 1994 book, "The Confirmation Mess", suggested that, given that Supreme Court justices inevitably shape public policy at least as much as any elected official, they should perhaps be elected officials themselves. The idea seemed ludicrous to me when I first read it; after all, the US government already contains two elected branches quite capable of expressing the popular will through legislation and executive action. If the third branch is making government too undemocratic, surely it would be far simpler just to weaken its power to thwart the other two, by returning its mandate to its original scope--applying the law as written by others. Of course, I was rather naive back then, and certain that Americans would never permit their democratically elected leaders to yield their command meekly to an appointed quasi-junta of their own making.

By now, I know better.

Monday, September 23, 2002

A recent incident in the world of journalism has shed some embarrassing light on its workings. Christopher Newton, an AP reporter, was fired last Monday for "quoting" fabricated "experts" in his stories--more than a dozen, over a period of years. He's hardly the first, of course; numerous cases, such as those of Stephen Glass and Patricia Smith, have been publicized over the last few years. But in the past, such journalists have typically been exposed after inventing sham encounters with anonymous or semi-anonymous "average folks"; Glass, in fact, was caught as soon as he dared invent an actual company, rather than a mere random individual. Newton, on the other hand, spent years occasionally presenting fictitious characters as official academics at real universities or fake "institutes"; yet he was only nabbed this month. What happened?

The answer lies in an important shift that has taken place in the field of journalism over the last few decades. With the advent of modern communications technology, basic news has essentially become a commodity, generated by poorly-paid stringers, supplied cheaply in bulk by wire services, and packaged and distributed at little cost (or profit) by media companies. The job of the elite professional journalist has thus changed drastically; it now consists of creating audience-grabbing (not informative or accurate) content that can hold the increasingly peripatetic attention of some large cohort of customers, and keep them coming back for more.

This form of journalism is much more synthetic, in all senses of the word, than the traditional variety; rather than gather information, organize it, and pass it along, the contemporary journalist develops a "story", and then goes in search of the necessary facts and quotations with which to construct a factual basis for it. In a story dealing with public opinion, for example, ordinary people must be found (or invented) to provide the quotations that support the story's claim. Similarly, for stories about more substantial topics, experts are needed, to lend credibility to the journalist's thesis.

And the experts have stepped up, in droves, to fill that need. In exchange for the fame that helps them procure money and status, entire organizations and faculties of academics and self-styled pundits have arisen to supply story-writers with the raw material they're looking for: brisk, easily-packaged quotations from plausible-sounding experts, reliably expressing some standard category of view that a "reporter" might like to drop into a story to make it look as though its content was gathered rather than composed.

That's why Christopher Newton's perfidy remained secret for so long: his invented experts, expressing unsurprising opinions that concisely and effectively buttressed his assertions, were disturbingly indistinguishable from the real thing. Far from betraying the very foundations of his craft, he was in fact merely skipping a redundant step in the process of creating modern journalistic copy.

In doing so, of course, he exposed the essential fraudulence of his colleagues' purportedly more "honest" work. And for that sin, he had to be destroyed.

Friday, September 13, 2002

In Slate, three distinguished foreign affairs scholars from the Brookings Institution have just announced that they have achieved consensus on a "counterintuitive conclusion" regarding the correct American policy towards Iraq. The US, say the trio of wise men, "should present Saddam with a serious, final ultimatum for toughened up inspections and real disarmament and go to war if he refuses it or subsequently fails to cooperate with the inspectors."

Now, I'm no foreign policy guru from a prestigious think tank; I'm just an ordinary guy commenting on the news. So I'm sure these three mavens could immediately point out where my thinking is woefully deficient. But I have a nagging worry in the back of my mind about a scenario that might conceivably make their prescription somewhat less than ideal. It goes something like this: Hussein first agrees to the terms of the ultimatum. The inspectors get organized and begin their work. They find some research facilities, some production equipment, some weaponry, and set about destroying them.

The US military can't stay on an Iraqi-war footing forever, though, so the massive military mobilization that has been proceeding for the past year or so eventually begins to reverse itself, and domestic and international political discussion ultimately turns to other matters. At that point, Hussein begins, little by little, to interfere with the inspection process. He thus puts the US in a bind: it was ready to go to war to establish the inspection regime in the first place, but would it be ready to remobilize and attack over, say, a short delay in allowing inspectors to visit a particular site? Then a slightly longer delay than the previous one?

Once this process has started, it can continue until, eventually, arms inspectors have been completely banned from Iraq. At that point, the world community would likely be clamoring more for an end to sanctions against Iraq than for a return of inspections, let alone a war to oust Hussein. And the US president--quite possibly one not named Bush--would be under strong domestic pressure to avoid a major confrontation, and content himself with a token response to Iraq's disobedience. Meanwhile, the Iraqi non-conventional weapons development program would be free to resume at full speed.

Again, I claim no expertise, and I'm sure the Brookings experts would have no trouble explaining to me why this outcome of their proposed course of action is in fact completely implausible, and could never, ever happen.

Sunday, September 08, 2002

In the Wall Street Journal, James Bowman reminisces about the old Oxford-Cambridge entrance examinations, which used to include (until some time in the 1980s) a three-hour essay-writing session requiring the applicant to select from a list of topics such as, "is popular culture a contradiction in terms?", or "is there any point considering what might have happened?", or "why should promises be kept?". According to Bowman, the aim was to "tempt candidates by wording questions as invitations merely to regurgitate their prejudices--and then reward those who didn't, who could look at a question from both sides, with all the argumentative logic and imaginative sympathy that such an exercise may require."

Ah, if only we could institute a similar admission exam for bloggers....

Posting to his extremely thoughtful, interesting new blog, UCLA professor Mark Kleiman formulates, to my surprise, what he refers to as the "one-free-bite rule" of international relations: while "preventive" warfare to topple a dangerous-looking regime is unjust, a government (he has Iraq in mind) that has already engaged in sufficiently aggressive behavior (invading a neighbor, for instance) forfeits its moral immunity from preventive attack. I mention my surprise because I have been advocating for several years my own slightly different, more "realist" (more cynical, some might say) version of the "one-free-bite rule". As I see it, a run-of-the-mill nasty dictatorship can pretty much be expected to engage in "useful hostilities" with at least one of its neighbors, so as to distract the masses, justify its own iron grip on power, and keep its military too preoccupied to hang around the capital planning coups d'etat. But a regime that attacks a second neighbor--and particularly one that scores successive victories in each--can be assumed to have ambitions beyond its station, and thus to require removal for the sake of regional stability.

To use my original example, a Slobodan Milosevic "merely" ravaging Bosnia could be (and probably was, by many) considered an ordinary thug exploiting nationalism to shore up his internal support, and hence only offending the world's conscience (however appallingly). By following up with the strong-arming of Kosovo, on the other hand, he made himself look more like a dangerous megalomaniac who would keep using his army until stopped by vigorous application of NATO military force.

Likewise, Saddam Hussein was tacitly indulged while he conducted a staggeringly brutal eight-year war of aggression against Iran; by subsequently attacking Kuwait, however, he demonstrated a sufficiently ravenous appetite for armed aggression that he became a clear threat to regional stability, whose removal had become (pace the US administration of the time--and several of its retired alumni today) a necessity. The later discovery of his huge unconventional weapons programs only reinforced the case for eliminating him--a case which was already compelling in 1990, and continues to persuade to this day (subject to the concerns I discussed previously, of course).

Thursday, September 05, 2002

Two recent articles in Ha'aretz inadvertently illustrate my previous point about the tremendous change in Israel brought about by Operation Defensive Shield (also noticed by Israel Harel, who--incorrectly, as I will demonstrate--emphasizes its effect on Palestinians, not on Israelis). One, by Danny Rubinstein on the state of affairs in the occupied territories, alludes cryptically at the end to a "new deterioration" on the "security front", due to the heightened suffering of the Palestinians under de facto Israeli reoccupation. Ze'ev Schiff, a widely respected military analyst, likewise alludes vaguely to predicted "acts of revenge" if the army is not more careful about preventing inadvertent Palestinian civilian casualties in its anti-terror operations.

It's only in the context of the past two years' political discourse in Israel that these articles' veiled implications can be properly understood. During the first eighteen months of the current conflict, the argument that harsh Israeli anti-terror measures would only provoke harsher terrorist attacks in retaliation was a staple of the left's campaign to reopen political negotiations. The argument disappeared completely from view in the aftermath of Operation Defensive Shield in March; the brutal campaign of bombings immediately before the operation, and the drastic decline in attacks immediately following it, provided ample empirical proof of the argument's speciousness. These two new articles by Rubinstein and Schiff mark the first signs I have seen of its soft, hesitant return to speakability.

These two journalists would do well to bear in mind the most recent poll of Palestinian opinion in the territories, in which a majority (52 percent, to be exact) of those surveyed expressed support for terrorist attacks against civilians inside Israel. While seemingly high, this number is in fact identical to the figure from July 2000, before the current armed conflict--that is, before any Israeli anti-terror measures in PA areas--even began. It also represents a modest decline from the 58 percent support rate measured in December 2001--after the start of hostilities, but before the full-scale re-entry of the IDF into the PA-controlled portions of the territories during Operation Defensive Shield.

In other words, there has never been a shortage of enthusiasm among Palestinians for anti-Israel terrorism--irrespective of Israel's anti-terror military actions--and any decline in terror attacks can thus be credited to the army's vigorous counterterrorist activity, not to any imagined conciliatory sentiment on the part of the legion of supporters of Hamas, Islamic Jihad, or the Al Aqsa Martyrs' Brigades. The very idea is as ludicrous now as it was two months ago, when Rubinstein and Schiff would not have dared embarrass themselves by suggesting it.

Tuesday, September 03, 2002

It should be an optimistic story, really: in a remote, arid, poverty-stricken frontier region where the traditional livestock-herding lifestyle is no longer economically viable, modern agricultural techniques make it possible for the natives to sell their land to farmers for productive use and move on to literally greener pastures elsewhere, or to the more prosperous cities for a taste of modern life. The American journalist sees it differently, though; he bemoans the declining population (!) in a place "so rich in warmth, community spirit, and old-fashioned friendliness"--despite the openly professed eagerness of the locals to leave their godforsaken backwater and begin anew in a more hospitable environment. He decries the favoring of "crop farmers" over "livestock owners". (When, I wonder, did a Times writer last take the side of the meat industry against vegetarian farming?) And he applauds schemes to keep the old ways alive--such as a recently-opened resort where tourists on the lookout for fresh exotica can experience the picturesque local practices first-hand.

Not that it's the slightest bit unusual or shocking, of course, to find a journalist waxing condescendingly sentimental about the charms of a quaint, out-of-the-way culture and habitat that he personally would find utterly intolerable to live in. This time, however, the writer is New York Times columnist Nick Kristof, and the vanishing tribe consists of...ranchers in rural Nebraska.

Now, given the absence of language or citizenship barriers and the power of modern communications technology, there's absolutely nothing--not even the need to find a suitable new job--that keeps Mr. Kristof himself from ditching Manhattan and relocating to this threatened paradise, if he considers it so worth saving. My guess, though, is that he would never in a million years consider living amongst a bunch of small-town cowpoke yahoos desperate to escape their miserably desolate patch of prairie emptiness. No, he'd surely much rather plead tearfully for the preservation of their precious heritage from the comfortable bustle of his big-city newsroom.