Sunday, September 29, 2002

While American opinion-makers focus on Iraq, a radical change in the structure of US politics is quietly taking place in the Senate. That body's power to "advise and consent" regarding nominees to the federal bench has long been used as a tool with which to exert influence on the long-term political direction of the judiciary. But in the past, to admit to explicit political criteria for evaluating judges was something of a taboo. That changed in 1987, with the hearings on the Supreme Court nomination of Judge Robert Bork, during which senators openly proclaimed Bork's judicial views to be "outside the mainstream", i.e., politically unacceptable. The rejection of Judge Bork on political grounds caused a subsequent mild backlash against blatantly partisan evaluation of judges, and for a time tactics such as personal scandal-mongering (culminating in the Anita Hill travesty) and procedural foot-dragging were deployed as public cover for partisan conflicts over nominees. But with the latest Senate rejections of Bush appointees Charles Pickering and Priscilla Owen, and contentious hearings over recent appointee Miguel Estrada, senators are now quite openly admitting that their votes on nominees are political judgments based on the nominees' likely rulings on key public policy issues.

Now, given the massive aggregation of power by the US federal judiciary over the past fifty years, it is not surprising that the elected branches of government would attempt to increase their influence over the exercise of that power, by using the only practical tool available--the judicial nomination process. (I predict, in the future, a sudden upsurge in interest in judicial impeachments, as well, if one party or ideological tendency should manage to reach a two-thirds majority in the Senate.) What the Senators may not have considered, however, is that their politicization of the confirmation process could easily backfire: a judge vetted for political acceptability and endorsed by the Senate may subsequently consider him- or herself--and be considered by others--to have earned a mandate for even broader exercises of political power than today's judges normally consider legitimate. The result would be the worst of all possible worlds: government by appointed tribunes with the trappings of political legitimacy to justify their diktats, no democratic accountability, and no checks on their absolute power to "interpret" the law (not to mention the constitution) in ways that explicitly overrule the decisions of the elected branches of government.

Stephen Carter, in his 1994 book, "The Confirmation Mess", suggested that, given that Supreme Court justices are inevitably going to shape public policy at least as much as any elected official, they should perhaps be elected officials themselves. The idea seemed ludicrous to me when I first read it; after all, the US government already contains two elected branches quite capable of expressing the popular will through legislation and executive action. If the third branch is making government too undemocratic, surely it would be far simpler just to weaken its power to thwart the other two branches, by returning its mandate to its original scope--applying the law as written by others. Of course, I was rather naive back then, and certain that Americans would never permit their democratically elected leaders to yield their command meekly to an appointed quasi-junta of their own making.

By now, I know better.

Monday, September 23, 2002

A recent incident in the world of journalism has shed some embarrassing light on its workings. Christopher Newton, an AP reporter, was fired last Monday for "quoting" fabricated "experts" in his stories--more than a dozen, over a period of years. He's hardly the first, of course; numerous cases, such as those of Stephen Glass and Patricia Smith, have been publicized over the last few years. But in the past, such journalists have typically been exposed after inventing sham encounters with anonymous or semi-anonymous "average folks"; Glass, in fact, was caught as soon as he dared invent an actual company, rather than a mere random individual. Newton, on the other hand, spent years occasionally presenting fictitious characters as official academics at real universities or fake "institutes"; yet he was only nabbed this month. What happened?

The answer lies in an important shift that has taken place in the field of journalism over the last few decades. With the advent of modern communications technology, basic news has essentially become a commodity, generated by poorly-paid stringers, supplied cheaply in bulk by wire services, and packaged and distributed at little cost (or profit) by media companies. The job of the elite professional journalist has thus changed drastically; it now consists of creating audience-grabbing (not informative or accurate) content that can hold the increasingly peripatetic attention of some large cohort of customers, and keep them coming back for more.

This form of journalism is much more synthetic, in all senses of the word, than the traditional variety; rather than gather information, organize it, and pass it along, the contemporary journalist develops a "story", and then goes in search of the necessary facts and quotations with which to construct a factual basis for it. In a story dealing with public opinion, for example, ordinary people must be found (or invented) to provide the quotations that support the story's claim. Similarly, for stories about more substantial topics, experts are needed, to lend credibility to the journalist's thesis.

And the experts have stepped up, in droves, to fill that need. In exchange for the fame that helps them procure money and status, entire organizations and faculties of academics and self-styled pundits have arisen to supply story-writers with the raw material they're looking for: brisk, easily-packaged quotations from plausible-sounding experts, reliably expressing some standard category of view that a "reporter" might like to drop into a story to make it look as though its content was gathered rather than composed.

That's why Christopher Newton's perfidy remained secret for so long: his invented experts, expressing unsurprising opinions that concisely and effectively buttressed his assertions, were disturbingly indistinguishable from the real thing. Far from betraying the very foundations of his craft, he was in fact merely skipping a redundant step in the process of creating modern journalistic copy.

In doing so, of course, he exposed the essential fraudulence of his colleagues' purportedly more "honest" work. And for that sin, he had to be destroyed.

Friday, September 13, 2002

In Slate, three distinguished foreign affairs scholars from the Brookings Institution have just announced that they have achieved consensus on a "counterintuitive conclusion" regarding the correct American policy towards Iraq. The US, say the trio of wise men, "should present Saddam with a serious, final ultimatum for toughened up inspections and real disarmament and go to war if he refuses it or subsequently fails to cooperate with the inspectors."

Now, I'm no foreign policy guru from a prestigious think-tank; I'm just an ordinary guy commenting on the news. So I'm sure these three mavens could immediately point out where my thinking is woefully deficient. But I have this nagging worry in the back of my mind about a possible scenario that might conceivably make their prescription somewhat less than ideal. The scenario goes something like this: Hussein first agrees to the terms of the ultimatum. The inspectors get organized, and begin to do their work. They find some research facilities, some production equipment, some weaponry, and set about destroying it.

The US military can't stay on an Iraqi-war footing forever, though, so the massive military mobilization that has been proceeding for the past year or so eventually begins to reverse itself, and domestic and international political discussion ultimately turns to other matters. At that point, Hussein begins, little by little, to interfere with the inspection process. He thus puts the US in a bind; it was ready to go to war to establish the inspection regime in the first place, but would it be ready to remobilize and attack over, say, a short delay in allowing inspectors to visit a particular site? Then a slightly longer delay than the previous one?

Once this process has started, it can continue until, eventually, arms inspectors have been completely banned from Iraq. At that point, the world community would likely be clamoring more for an end to sanctions against Iraq than for a return of inspections, let alone a war to oust Hussein. And the US president--quite possibly one not named Bush--would be under strong domestic pressure to avoid a major confrontation, and content himself with a token response to Iraq's disobedience. Meanwhile, the Iraqi non-conventional weapons development program would be free to resume at full speed.

Again, I claim no expertise, and I'm sure the Brookings experts would have no trouble explaining to me why this outcome of their proposed course of action is in fact completely implausible, and could never, ever happen.

Sunday, September 08, 2002

In the Wall Street Journal, James Bowman reminisces about the old Oxford-Cambridge entrance examinations, which used to include (until some time in the 1980's) a three-hour essay-writing session requiring the applicant to select from a list of topics such as, "is popular culture a contradiction in terms?", or "is there any point considering what might have happened?", or "why should promises be kept?". According to Bowman, the aim was to "tempt candidates by wording questions as invitations merely to regurgitate their prejudices--and then reward those who didn't, who could look at a question from both sides, with all the argumentative logic and imaginative sympathy that such an exercise may require."

Ah, if only we could institute a similar admission exam for bloggers....

Posting to his extremely thoughtful, interesting new blog, UCLA professor Mark Kleiman formulates, to my surprise, what he refers to as the "one-free-bite rule" of international relations: while "preventive" warfare to topple a dangerous-looking regime is unjust, a government (he has Iraq in mind) that has already engaged in sufficiently aggressive behavior (invading a neighbor, for instance) forfeits its moral immunity from preventive attack. I mention my surprise because I have been advocating for several years my own slightly different, more "realist" (more cynical, some might say) version of the "one-free-bite rule". As I see it, a run-of-the-mill nasty dictatorship can pretty much be expected to engage in "useful hostilities" with at least one of its neighbors, so as to distract the masses, justify its own iron grip on power, and keep its military too preoccupied to hang around the capital planning coups d'etat. But a regime that attacks a second neighbor--and particularly one that scores successive victories in each--can be assumed to have ambitions beyond its station, and thus to require removal for the sake of regional stability.

To use my original example, a Slobodan Milosevic "merely" ravaging Bosnia could be (and probably was, by many) considered to be an ordinary thug exploiting nationalism to shore up his internal support, and hence only offending the world's conscience (however appallingly). By following up with the strongarming of Kosovo, on the other hand, he made himself look more like a dangerous megalomaniac who would keep using his army until stopped by vigorous application of NATO military force.

Likewise, Saddam Hussein was tacitly indulged while he conducted a staggeringly brutal eight-year war of aggression against Iran; by subsequently attacking Kuwait, however, he demonstrated a sufficiently ravenous appetite for armed aggression that he became a clear threat to regional stability, whose removal had become (pace the US administration of the time--and several of its retired alumni today) a necessity. The later discovery of his huge unconventional weapons programs only reinforced the case for eliminating him--a case which was already compelling in 1990, and continues to persuade to this day (subject to the concerns I discussed previously, of course).

Thursday, September 05, 2002

Two recent articles in Ha'aretz inadvertently illustrate my previous point about the tremendous change in Israel brought about by Operation Defensive Shield (also noticed by Israel Harel, who--incorrectly, as I will demonstrate--emphasizes its effect on Palestinians, not on Israelis). One, by Danny Rubinstein on the state of affairs in the occupied territories, alludes cryptically at the end to a "new deterioration" on the "security front", due to the heightened suffering of the Palestinians under de facto Israeli reoccupation. Ze'ev Schiff, a widely respected military analyst, likewise alludes vaguely to predicted "acts of revenge" if the army is not more careful about preventing inadvertent Palestinian civilian casualties in its anti-terror operations.

It's only in the context of the past two years' political discourse in Israel that these articles' veiled implications can be properly understood. During the first eighteen months of the current conflict, the argument that harsh Israeli anti-terror measures would only provoke harsher terrorist attacks in retaliation was a staple of the left's campaign to reopen political negotiations. The argument disappeared completely from view in the aftermath of Operation Defensive Shield in March; the brutal campaign of bombings immediately before the operation, and the drastic decline in attacks immediately following, provided ample empirical proof of its speciousness. These two new articles by Rubinstein and Schiff mark the first signs I have seen of the argument's soft, hesitant return to speakability.

These two journalists would do well to bear in mind the most recent poll of Palestinian opinion in the territories, in which a majority (52 percent, to be exact) of those surveyed expressed support for terrorist attacks against civilians inside Israel. While seemingly high, this number is in fact identical to the figure from July 2000, before the current armed conflict--that is, before any Israeli anti-terror measures in PA areas--even began. It also represents a modest decline from the 58 percent support rate measured in December 2001--after the start of hostilities, but before the full-scale re-entrance of the IDF into the PA-controlled portions of the territories during Operation Defensive Shield.

In other words, there has never been a shortage of enthusiasm among Palestinians for anti-Israel terrorism--irrespective of Israel's anti-terror military actions--and any decline in terror attacks can thus be credited to the army's vigorous counterterrorist activity, not to any imagined conciliatory sentiment on the part of the legion of supporters of Hamas, Islamic Jihad, or the Al Aqsa Martyrs' Brigades. The very idea is as ludicrous now as it was two months ago, when Rubinstein and Schiff would not have dared embarrass themselves by suggesting it.

Tuesday, September 03, 2002

It should be an optimistic story, really: in a remote, arid, poverty-stricken frontier region where the traditional livestock-herding lifestyle is no longer economically viable, modern agricultural techniques make it possible for the natives to sell their land to farmers for productive use and move on to literally greener pastures elsewhere, or to the more prosperous cities for a taste of modern life. The American journalist sees it differently, though; he bemoans the declining population (!) in a place "so rich in warmth, community spirit, and old-fashioned friendliness"--despite the openly professed eagerness of the locals to leave their godforsaken backwater and begin anew in a more hospitable environment. He decries the favoring of "crop farmers" over "livestock owners". (When, I wonder, did a Times writer last take the side of the meat industry against vegetarian farming?) And he applauds schemes to keep the old ways alive--such as a recently-opened resort where tourists on the lookout for fresh exotica can experience the picturesque local practices first-hand.

Not that it's the slightest bit unusual or shocking, of course, to find a journalist waxing condescendingly sentimental about the charms of a quaint, out-of-the-way culture and habitat that he personally would find utterly intolerable to live in. This time, however, the writer is New York Times columnist Nick Kristof, and the vanishing tribe consists of....ranchers in rural Nebraska.

Now, given the absence of language or citizenship barriers and the power of modern communications technology, there's absolutely nothing--not even the need to find a suitable new job--that keeps Mr. Kristof himself from ditching Manhattan and relocating to this threatened paradise, if he considers it so worth saving. My guess, though, is that he would never in a million years consider living amongst a bunch of small-town cowpoke yahoos desperate to escape their miserably desolate patch of prairie emptiness. No, he'd surely much rather plead tearfully for the preservation of their precious heritage from the comfortable bustle of his big-city newsroom.

Saturday, August 31, 2002

Most commentators doubt that the recent settlement of Major League Baseball's labor dispute will ultimately solve baseball's competitiveness problems; the consensus is that "small-market" teams with relatively meager fan bases and lower revenue prospects will continue to have great difficulty competing for superstars against their "large-market" counterparts with bulging pocketbooks to spend on stratospheric player contracts. But for some reason, none of these analysts seems able to think beyond a few minor variations on the set of gimmicks already contained in the new agreement--that is, revenue sharing and limitations on overspending by wealthy teams. This narrow-mindedness betrays a fundamentally shallow view of the problems besetting the game.

Contrary to popular belief, large-market sports teams are not always dominating powerhouses that make mincemeat of their smaller, poorer opponents. After all, a large market is also often a reliably lucrative one--that is, one that can generate a steady stream of revenue regardless of how badly its team stinks up the joint. (Think, for example, of the Chicago Cubs.) Moreover, even small-market teams can play well below their potential when their owners decide to put parsimony before performance (as any number of small-market teams demonstrate consistently, year after year).

The problem, in short, is not revenue disparity among teams, but rather the lack of a connection between team revenue and team performance. When an owner can make decisions that are in his or her obvious financial interest but bad for the team's winning potential, fans are understandably upset--just as they would be if, say, an individual player took a personal payoff to throw a game. What is needed, therefore, is not a system that artificially evens out team incomes--thus further insulating owners from the effects of their undermining of the quality of their own teams--but rather a system that financially rewards good, skillful, competitive team management, and punishes its opposite.

There's an obvious choice for such a system: pool the revenues of all the clubs, and redistribute them based strictly on team performance. That's how most sports work, after all--whatever you spend, your income depends (almost) exclusively on the "prize money" awarded you for your competitive successes. In the case of baseball, each team's revenues for the year would simply depend on its winning percentage, according to some predetermined formula. Labor strife would likely disappear as a result, since the "value" of a player's services would be directly measurable (or at least estimable) in terms of that player's effect on team performance (and hence on team earnings). Mediocre owners and general managers would also no longer be able to use the economic limitations of their markets as an excuse; they would instead be directly comparable as team-builders, on the basis of their ability to invest shrewdly in "undervalued" players and gain maximum return on their farm prospects. And a dominating team, however wealthy, could always be toppled by a shrewd investor able to build a better team--and thus earn greater returns--on the same invested capital.
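The redistribution scheme sketched above is easy to make concrete. Here is a minimal illustrative sketch in Python, with the important caveat that the pool size, the team names, and the simple linear weighting by winning percentage are all my own hypothetical assumptions; the post deliberately leaves the "predetermined formula" unspecified, and a real league might well prefer a progressive, capped, or floor-protected payout curve:

```python
# Illustrative sketch: pool all club revenues, then pay each club a share
# of the pool proportional to its winning percentage. The linear weighting
# is an assumption for demonstration purposes only.

def redistribute(revenues, win_pcts):
    """Return each team's payout from the pooled revenues.

    `revenues` and `win_pcts` are dicts keyed by team name; payouts are
    weighted by each team's share of the summed winning percentages.
    """
    pool = sum(revenues.values())
    total = sum(win_pcts.values())
    return {team: pool * pct / total for team, pct in win_pcts.items()}

# Hypothetical example: a rich .400 team ends up subsidizing a poor .600
# team, because income now tracks performance rather than market size.
payouts = redistribute(
    revenues={"Rich": 300.0, "Poor": 100.0},   # millions, invented figures
    win_pcts={"Rich": 0.400, "Poor": 0.600},
)
# payouts: Rich is roughly 160.0, Poor roughly 240.0
```

The key property, as the post argues, is that an owner's only route to more revenue is a better winning percentage, so every roster decision has a directly measurable financial consequence.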

This is a radical proposal, of course--probably too radical to be considered seriously by any of the major sports leagues. We can therefore predict with confidence that the problem teams of professional sports--not profligate winners like baseball's Yankees or Diamondbacks, or football's Dolphins or Forty-Niners, but rather fan-ripoff scams like baseball's Cubs or football's Bengals--will continue to eat away at the integrity and popularity of their respective leagues from within for many years to come.

I've just returned from about a week in Israel, most of which I spent listening to a variety of speakers--politicians, journalists, academics and others--lecturing and taking questions about current conditions there. (Believe it or not, that's my idea of the perfect vacation.) I would summarize my main conclusions as follows:

  • There is emerging an overwhelming centrist consensus in Israeli politics. The right has for some time now (with the exception of a tiny fringe) given up on the idea of controlling the occupied territories ("Judea, Samaria and Gaza") indefinitely, let alone annexing them. Meanwhile, the last two years of violence have driven the left (again, with the exception of a tiny fringe) to abandon as suicidally unworkable any kind of unilateral withdrawal from those same territories. The consensus, therefore, is that Israel will be happy to accept a Palestinian state--but only within the framework of a sincerely reciprocal, practically enforceable peace agreement that puts an end to the conflict once and for all and offers a solid expectation that organized Palestinian violence against Israel will genuinely, permanently cease. Sadly, that prospect seems distant at the moment.


  • Regarding the current two-year-old campaign of terror, the solid majority view is that while it is likely to continue at a low level for quite some time, Israel is now, for all intents and purposes, and barring a major new development, the undeclared winning side in the conflict. Two years ago a divided Israel, desperate for peace, was unilaterally offering an all-but-sovereign Palestinian Authority full, unfettered statehood on virtually all the territories, ceding control over much of Jerusalem, and even considering allowing for a limited "return" of Palestinians to Israel proper. Today, the Palestinian economy and polity are in utter disarray, Israeli troops run unimpeded through the West Bank's largest cities, Yasser Arafat is an international pariah, and nobody (outside the fantasyland of international diplomacy) is expecting Palestinian Independence Day to arrive anytime soon. By adhering doggedly to their twin pillars of terrorism and maximalism, the Palestinians have once again, in Israeli eyes, missed an historic opportunity, and are now paying the terrible price.

    (I would add as an aside here--and this is not part of the reigning consensus--that the turning point in the conflict was "Operation Defensive Shield", in which Israeli troops invaded the areas controlled by the Palestinian Authority and began dismantling the "terrorist infrastructure", thus proving the absurdity of the left's mantra that "there is no military solution" to the problem of Palestinian terrorism. It is widely believed, in Israel and elsewhere, that Israeli prime minister Ariel Sharon acted with wise and shrewd restraint during his year or so in office preceding the military operation, waiting until support among the Israeli public and in the White House for military action reached a level that made it politically feasible. In my view, Sharon simply waited a year too long, and the ultimate success of the operation in vastly reducing the rate of terrorist attacks actually generated support for the military approach, rather than relying on it. Had Sharon acted sooner, the consensus I described might well have been built right away, and hundreds of lives might have been saved in the interim.)


  • Although terrorism is obviously still a top priority, the current lull in the violence, and the emergence of the aforementioned two-pronged consensus, have shifted the attention of the Israeli populace somewhat towards other matters--most notably, the rapidly imploding Israeli economy. The collapse of the high-tech bubble has coincided with the sudden absence of war-shy foreign tourists, and the consequences of these roughly simultaneous shocks have been devastating, with businesses closing en masse and laying off workers left and right. The economic crisis, many argue, presents both a danger and an opportunity: while the increasingly straitened circumstances may exacerbate internal political and social tensions, they may also provide the impetus to resolve simmering conflicts over resource allocation and force needed reforms in areas such as welfare, sectoral subsidy and military burden-sharing. Meanwhile, the long-term problems--the demographic time bomb of Israel's Arab population, and the inevitable, devastating water shortage--are likely to go unaddressed for some time to come.


I expect I will elaborate further on these ideas in future postings, but for now I will let this summary suffice.

Thursday, August 22, 2002

I'll be taking a roughly 10-day break from blogging. Details when I resume.....

Sunday, August 18, 2002

Dahlia Lithwick's "Letter to a Young Law Student" in Slate is interesting less for the advice she gives--the usual "don't take it so seriously, stop and smell the roses, you'll look back on this years from now and be as cloyingly sentimental about it all as I'm being right now" blather--than for her characterization of the typical modern law student. According to Lithwick, the vast majority of them "applied to law school simply because [they] took the LSATs", and "took the LSATs simply because the MCATs were too hard". They "graduated college with the generalized sense that they ought to be doing good works on this planet but were uncertain how to go about it", and "went to law school hoping that the experience would be stimulating and/or mind-expanding; a liberal-arts grad school for political people."

Well, we saw what a couple of generations of that sort of aimless, unserious student did for liberal arts undergraduate education: it now consists mostly of ignorant, lazy, illiterate slackers pausing between parties to collect their gift A's in a collection of joke courses in vacuous subspecialties, earning worthless degrees that signify with high probability that the bearer possesses no discernible job, life or intellectual skills. Certainly there aren't many liberal arts graduates these days who feel the need to admonish this year's freshmen, as Lithwick does the incoming "one-L" class, to "ignore your grades" and "have a life".

How long, then, will the law schools be able to continue to absorb a stream of applicants who are uninterested in law, hard work or career advancement, before the pressure on them to cater to student demand for "stimulating and/or mind-expanding" experiences forces them to eviscerate their own curricula and turn themselves into summer camps for spoiled post-undergraduate "permafrosh"? Can the law schools possibly have the spine to stand their ground and force all those future proto-Lithwicks to find their three-year vacations elsewhere, should they rise up one day soon and, like their undergraduate counterparts, start demanding "relevant" courses, "fair" grading, and all the rest?

Steven Levy's Newsweek article on blogging tells the depressing story of 18-year-old Zack, "an insecure kid who clowned around in high school and felt that no one really liked him." So he started a blog, and now he has "28 readers a day".

Why is this story depressing, you ask? Well, I haven't been getting anywhere near 28 hits a day--even when you include the occasional random Google search for strings like (just to give a few examples from my hit counter's referrer log) "marijuana infrared scanning", "mossad interrogation", or "paternal filicide".

Regular ICBW readers (I know you're out there--both of you!) are encouraged to let me know by email whether they think I should follow Zack's lead, cut down on the public affairs commentary, and use this blog more to "express [my] feelings and share [my] songs, poems and artwork". Hey, the kid must be doing something right....

Saturday, August 17, 2002

Richard Cohen has said it. Hendrik Hertzberg, too. Cokie and Steve Roberts have said it together. E.J. Dionne has (more or less) said it. Michael Kinsley was among the first to say it. Thomas Friedman and Zbigniew Brzezinski have now both said it on the same day. Foreign policy guru Senator Richard Lugar has said it. The conventional wisdom is unequivocal: the Bush administration has yet to make the case for invading Iraq and deposing Saddam Hussein.

Taken literally, this assertion is complete nonsense. The administration has laid out a clear and compelling case for the use of military force to get rid of Saddam Hussein. His regime is one of the most brutal in a generally brutal region; he has a long history of invading neighbors, building weapons of mass destruction (including biological, chemical and even nuclear weapons) and using them (in the case of chemical weapons, at least) against both internal and external enemies. He is a longtime supporter of international terrorism, including, most likely, Al Qaida. His agents have attempted to assassinate a former US president, and may even have been in direct contact with the September 11th hijackers. Numerous regimes with far less damning records than that have been targeted--quite sensibly and justly--for violent overthrow by past US governments.

On the other hand, the case against toppling Saddam Hussein has been heard only very softly, if at all, in the (mainstream) public square over the last few months. And that's a bit surprising, because there is a fairly strong case to be made. Hussein still has a formidable army that is capable of inflicting serious casualties on an invading force. And he has that arsenal of non-conventional weapons; faced with the prospect of elimination, he may well decide he has nothing to lose and unleash them. The death toll of an invasion could thus be extremely high--perhaps catastrophically so.

There is also a plausible alternative: regimes that are too powerful to destroy militarily are usually handled with a "containment" strategy. When well-executed, such a strategy can control and limit the regime's threat to the rest of the world, at relatively little risk and cost, until such time as conditions for its collapse or removal are more favorable. If "red lines" regarding its external behavior are explicitly drawn and forcefully backed up with harsh military responses short of full-scale war, they can sharply limit the hostile regime's capacity to make trouble. To this day, for example, just such a set of rules has protected the Kurdish sanctuary in northern Iraq from Hussein's troops; a broader version of this arrangement might conceivably succeed in containing him fully until old age, internal dissension or a well-placed rival takes its toll.

It's interesting to consider, then, why so many pundits, instead of stridently presenting this anti-invasion case, have satisfied themselves with carping about the supposed lack of solid pro-invasion arguments coming from the administration. One explanation is obvious: the US military's recent string of three spectacularly easy, relatively painless victories (in the Gulf War, Serbia/Kosovo and Afghanistan) has (probably unfairly) discredited the "body bags" argument. After three instances of worst-case-scenario believers crying wolf, it's much harder to claim--even when it's justified--that this time, the danger is real.

But the "body bags" argument's political problems go deeper than just its recent history of being refuted by events. Many of those past solemn cautions about the threat of huge casualties in Kuwait, Serbia and Afghanistan were in fact disingenuously inflated doomsday predictions coming from reflexive pacifists who wouldn't have supported those military actions even if they had been guaranteed casualty-free. The "quagmire" warning has thus become entangled, in public debate, with a broad geopolitical ideology that includes perennial strong distaste for (American) military action--regardless of the cost--and a preference for multilateral diplomatic "solutions" that amount to inaction and even active appeasement in the face of aggression. A perfect expression of this ideology can be found, surprisingly enough, in Brent Scowcroft's recent WSJ op-ed, which is amazingly retro in tone; like a 1980's peacenik or 1990's paleoliberal, he prattles on about the importance of "international cooperation", a potential "explosion of outrage" in the Arab world over America's placing the Iraq problem ahead of the Arab-Israeli conflict, and the need for "pressing the United Nations Security Council to insist on an effective no-notice inspection regime for Iraq".

Since September 11th, that approach has been relegated (in America, at least) to a pacifist fringe; Americans are more likely these days to consider it a cause of than a solution to the resurgent terrorist threat they've been confronting of late. Politicians and pundits are therefore terrified of even appearing to embrace such a discredited worldview--by, for example, voicing more moderate qualms about military action that have been tainted by association with it. (No doubt that's why, as Frank Rich has noted, public doubt about the wisdom of an invasion of Iraq has come primarily from Republicans like Scowcroft, who are presumably more insulated--by virtue of their party's hawkish recent past--than Democrats against the charge of one-world Euroweenie woolly-headedness.) That's a shame, because the tough-minded, containment-based case against massive military action is as legitimate, and just about as credible, as the case for invasion itself, and therefore ought to be made and discussed more seriously--this time, at least.

Tuesday, August 13, 2002

It was bound to happen sooner or later, of course; the parade of steadily shriller and shriller denunciations of "corporate malfeasance" in the press has clearly been heading in this direction. But who'd have expected a distinguished economist like Paul Krugman, of all people, to break the taboo by being the first (that I've seen, at least) actually to blame a company's executives directly for the terrible crime of permitting its stock price to rise too high, and then fall too far?

The company in question is Cisco. Says Krugman, "its market capitalization has fallen by more than $400 billion. Nobody from Cisco management — ranked No. 13 in Fortune's 'greedy bunch' — has been arrested." And to think that right here, in America, a stock's price can plummet, and the government just lets it happen!

Not that its current, reduced price is the problem, of course; at its height, admits Krugman, "[s]ome analysts flatly called Cisco a pyramid scheme." In fact, that's Krugman's real complaint: there was an "illusion of profitability" that "sustained the stock price", an illusion that the company conjured up by "using its own overvalued stock as currency — paying its employees with stock options, acquiring other companies by issuing more stock. Thanks to loopholes in the accounting rules — loopholes defended with intense lobbying — these transactions allowed executives to progressively dilute the stake of their original shareholders, without ever declaring this dilution as a business cost."

Now to some people, issuing more stock when it's grossly overvalued and using it to lure employees and buy other companies might seem like an astute way to try to preserve shareholders' long-term value in the midst of a bubble. What were Cisco executives supposed to do, after all, while investors were bidding up its stock price to ridiculous levels--use corporate funds to buy back its own shares at the inflated price? No, if customers are willing to overpay for your product, then you're betraying your shareholders by charging any less than the market will bear. And likewise, if there's a huge market demand for your stock at an excessive price, then creating more of it and selling it to new investors is the only way to fulfill your duty to your current shareholders (and to the new ones, for that matter, once their stupidity in coming aboard is behind them).

Of course, the bubble that carried Cisco stock to such absurd heights also enriched its executives (and other employees), through their stock options. And that really cheeses Krugman off: "the Cisco story...demonstrates just how much self-enrichment corporate insiders can get away with while staying within the letter of the law." What self-enrichment? Cisco's CEO, John Chambers, "was among the world's best-paid executives, receiving $157 million in 2000." Got that? In 2000. When the stock price was high, Chambers' fellow shareholders were all making a fortune along with him, and none of them was uttering a peep of complaint about Chambers' windfall. In fact, either Chambers kept most of his holdings and options too long--in which case he also lost a colossal bundle in the ensuing stock slide--or he liquidated most of his holdings and exercised most of his options at their peak value, in which case anyone who didn't follow his example (which would have been publicized, by law, as an insider transaction) was a fool.

And that's the real point. Nowhere--nowhere--in his column does Krugman so much as hint that Cisco executives falsified or concealed any information from anyone. Rather the felony of which they stand accused is that of failing to stop their stock from skyrocketing, attempting to maximize the company's long-term gain from its own high stock price, and then watching helplessly as that price fell back to earth. To Krugman (and to most of his fellow fulminators about "corporate malfeasance"), the entire stock bubble itself was clearly the fault of "unscrupulous" corporate CEOs, crooked accountants, lax regulators--anybody but the idiot investors who cluelessly poured their money into horribly overpriced stocks during the bubble years. Investors, that is, like a great many of Paul Krugman's (and his fellow fire-and-brimstone journalists') naive, confused, irate, savings-depleted readers, now searching for a scapegoat--and being shamelessly pandered to by a once-respected economist.

Monday, August 12, 2002

It's far less contagious than influenza, causes no symptoms in 80 percent of the people it infects, and is a serious threat only to the elderly and immune-impaired (putting it on roughly the same footing as certain fungal infections). So why all the hysteria about the West Nile virus?

My theory: It's all in the name. My imaginary uncle the Beverly Hills doctor ("G.P. to the stars") would almost surely have said (had he actually existed) that to get noticed, a disease first needs a catchy moniker--preferably one with a hint of dangerous-sounding exoticism. I'd say "West Nile" stacks up pretty well on that score, conjuring up images of destitute Egyptian peasants bathing in a highly septic river, or perhaps of adventurous tourists O.D.-ing on Imodium after buying cut-rate local "bottled" water.

Of course, had it been called "West Congo" instead, there'd be panic in the streets by now, even if the illness only involved the tiniest bit of bleeding out the eyeballs in the vast majority of cases. On the other hand, "Westchester" virus could easily have passed into endemic status without anyone even noticing. And "West Hollywood" virus would have everyone lining up around the block to get infected.

Just my two cents, as Larry King would say....

Friday, August 09, 2002

I don't normally link to articles just (or even mostly) to rave about them. But David Brooks' gently satirical sketch of America's "sprinkler cities" and their inhabitants, the species known as "Patio Man", is worth singling out. Its deft prose and dead-on social perspicacity make it the equal of any Tom Wolfe piece, but without any of the potent venom that can make Wolfe so discomfiting to read.

Brooks' subjects are the brand new, carefully planned "exurbs" that have sprung up well beyond the perimeters of the suburbs surrounding America's big cities. Populated predominantly by middle-class families, retirees and post-industrial high-tech and service industries, these communities represent a new and distinct demographic that has yet to be courted and analyzed by political operatives and pundits. In fact, they may well be the main host population for the blogging phenomenon--a neglected political cohort with the means to express itself on its own initiative, using modern technology.

What are Patio Man's politics? Brooks characterizes them as solidly Republican, culturally traditional, focused on entrepreneurial and educational achievement and fanatical about social and community harmony. But they are not the economically subordinate "red state" working-class right I have been claiming represents one half of the bipolar modern American polity. Rather, they seem to be a third, "middle" tier, politically allied with their economic inferiors, but (according to Brooks) fleeing the gradual penetration and downscaling of the suburbs by unruly immigrants and blue-collar workers as much as they're fleeing the snobbery of the new suburban liberal overclass. Patio Man is thus clearly a pure product of the nineties boom, achieving his intermediately successful economic status as a result of that decade's explosive spawning of mobile modern high-tech and service-industry start-ups.

It thus remains to be seen whether Patio Man can survive the current economic downturn. Mortgaged to the hilt and employed in a "new economy" enterprise of dubious durability, he can ill afford higher interest rates, falling home prices or weakening consumer and corporate spending--let alone the impending combined onset of all three. It may be, then, that sprinkler cities will become the ghost towns of the early twenty-first century, as their residents abandon their foreclosed or negative-equity homes and shuttered employers in favor of lower-paying jobs with greater stability, leaving the mega-malls to die of customer neglect, the perfectly-kempt lawns to wither unsprinkled, and the dreams of a prosperous white-collar conservatism shattered.

Friday, August 02, 2002

The latest in cosmetic surgery, apparently, involves navel display--converting "outie" belly buttons to "innies".

But what if you want a star on your belly?

Times have certainly changed. A couple of years ago, when the market was skyrocketing and investors seemingly couldn't go wrong, it was often argued that stock-picking was for chumps--after all, why throw away a percent or two of savings growth a year on managers' fees, or gamble everything on an insufficiently broad portfolio, when an index fund virtually guarantees near-double-digit returns over the long term? Well, now that the market's tanking, Daniel Gross is singing a different tune in Slate, criticizing the S&P 500 index for, of all things, being lousy at picking stocks--adding companies to the index just as their share prices peak, and then keeping them on the roster as they collapse.

The charge is, of course, utterly absurd; the S&P 500 is supposed to track the large-cap market, not beat it, and if investors are pumping up dubious tech stocks to absurd valuations, then it's not S&P's place to second-guess their judgment. Moreover, the index investing dogma of the nineties asserted adamantly that such fluctuations shouldn't matter anyway to the long-term buy-and-hold investor, who could look forward to excellent returns by explicitly not timing the market, buying instead at a constant rate ("dollar-cost averaging") year in and year out, and counting on the market's historically reliable high return rate to do its work.

The flaw in this strategy--as we have now all been so painfully reminded--lies in its assumption that stocks' consistently higher return rates are a blessing that simply drops out of the sky into the laps of delighted passive investors. In fact, their true source is the vigilantly skeptical discipline of history's (overwhelmingly active, portfolio-managing) investors, who have made a practice of avoiding stocks whose high prices precluded a sufficient return on their invested purchase price. By refusing to overpay, those finicky investors kept stock prices (relative to earnings) down, and thus returns (per dollar invested) high.

Passive (or uninformed) investors, on the other hand, show no such discipline; they robotically pour their money into index (or, for that matter, actively managed all-stock) funds, regardless of current share prices or prospective corporate earnings. And as their numbers increase, they can exert a significant influence on those share prices, lifting them to the point where the potential earnings (again, per dollar invested) of the companies they represent pale in comparison with the earning power of, say, Treasury bills.
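The price-versus-earnings point above can be made concrete with a quick sketch. A stock's "earnings yield"--earnings per dollar of share price--is just the inverse of its P/E ratio, and can be compared directly with a bond yield. (The 5% Treasury-bill rate and the P/E ratios below are illustrative assumptions, not figures from Gross's article.)

```python
# Earnings yield (earnings per dollar invested) at various
# price-to-earnings ratios, compared with an assumed 5% T-bill rate.
T_BILL_RATE = 0.05  # illustrative assumption

def earnings_yield(pe_ratio):
    """Earnings per dollar of share price: the inverse of P/E."""
    return 1.0 / pe_ratio

for pe in (15, 30, 60):
    ey = earnings_yield(pe)
    verdict = "beats" if ey > T_BILL_RATE else "trails"
    print(f"P/E {pe}: earnings yield {ey:.1%} -- {verdict} a 5% T-bill")
```

At bubble-era P/E ratios of 30 or 60, the per-dollar earning power of stocks falls well below that of a riskless T-bill--which is exactly the condition under which sharp-eyed investors head for the exits.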

Eventually, of course, sharp-eyed investors begin to notice that they're being fleeced and head for the exits, bursting the bubble and stranding the remaining shareholders with stock worth a tiny fraction of its grotesquely overinflated purchase price. That's exactly what happened to most of the millions of American investors who collectively lost trillions of dollars in the market over the last several years. By comparison, the effects of badly-timed additions to or deletions from the S&P 500 index are a piddling, deck-chairs-on-the-Titanic irrelevancy.

Wednesday, July 31, 2002

You might have been wondering, during the debate over government subsidies to Amtrak, how the money might otherwise have been spent. Well, here are some relevant figures: Amtrak was granted a $520 million capital subsidy in 2001, plus another $105 million in "safety and operations" funding (another $130 million-odd was spent on sundry other passenger rail projects, including "research and development"). The previous year, Amtrak supplied about 5.5 billion passenger miles of rail transportation; total passenger rail subsidies thus amounted to about 13.5 cents per passenger mile.

In 2000 (the most recent year for which figures appear to be available), Greyhound bus lines took in a total of about $1 billion in revenue, while providing nearly 8 billion passenger miles of bus transportation, for a total of about 13 cents per passenger mile (and that's assuming zero profit and zero revenue from non-passenger sources such as its package-shipping service). Meanwhile, Continental Airlines brought in 13.44 cents of revenue per passenger mile; United's figure was 13.25 cents, and American's was 14.05 cents. (Since 2000, airlines' revenues per passenger mile have declined, while Amtrak's subsidy per passenger mile has increased. Greyhound's parent company, Laidlaw, has been granted bankruptcy protection, so Greyhound's recent revenue figures are hard to come by.)
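The back-of-the-envelope arithmetic behind these figures can be checked with a quick sketch. (The 7.7-billion figure for Greyhound's mileage is an assumed reading of "nearly 8 billion"; all other numbers come from the paragraphs above.)

```python
# Subsidy and revenue per passenger mile, from the figures quoted above.
# Dollar amounts in millions; mileage in millions of passenger miles.
amtrak_subsidy = 520 + 105 + 130   # capital + safety/ops + other rail projects
amtrak_miles = 5500                # ~5.5 billion passenger miles

greyhound_revenue = 1000           # ~$1 billion total revenue
greyhound_miles = 7700             # "nearly 8 billion" passenger miles (assumed)

subsidy_per_mile = 100 * amtrak_subsidy / amtrak_miles        # in cents
greyhound_per_mile = 100 * greyhound_revenue / greyhound_miles

print(f"Rail subsidy:      {subsidy_per_mile:.1f} cents per passenger mile")
print(f"Greyhound revenue: {greyhound_per_mile:.1f} cents per passenger mile")
```

The two figures come out within a cent of each other, which is the whole point of the comparison: the rail subsidy alone roughly matches what bus passengers pay in full fare.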

Remember, these are revenue figures--the amount these companies charged for the transportation they provided. (And all were profitable during the year in question.) In other words, the government's 2001 subsidy of passenger rail would have been approximately sufficient to pay for all of Amtrak's passengers the previous year to take a bus or plane instead--absolutely free.

On the other hand, it's not entirely certain that Amtrak's passengers would have taken this deal, even if they could somehow have been identified and offered the switch. After all, Amtrak is hardly the price-sensitive traveler's first choice; its fares on the most popular routes are typically slightly above the best available airfare (as a bit of experimentation with Amtrak's and Expedia's Websites will show), let alone the cost of a bus ticket. More likely, rail passengers are opting for comfort and convenience--particularly compared to Greyhound, whose highly, uh, "price-conscious" clientele may be exactly what Amtrak's passengers are looking to avoid. Viewed in this light, the massive federal subsidy provided to Amtrak's affluent, luxury-seeking customers is, pace the "egalitarian" mass transit ideologues, an appalling example of expensive, wasteful government favoritism towards the well-to-do.

Sunday, July 28, 2002

Bloggers the world over, led by Instapundit Glenn Reynolds, are hooting in celebration of John Leo's proclamation, in an entire column on blogging in the oh-so-mainstream US News and World Report, that "[t]he main arena for media criticism ... will be the Internet." Now, I'm as enthused about the blogging phenomenon as anyone (okay, as anyone I know), but in understanding its growth and influence, it's important to separate form from content.

A stroll through the "blogosphere", as it's called, reveals that a very large fraction of blogs are nothing more than the appallingly tedious, self-centered ramblings of appallingly tedious, self-centered people (whether "I Could Be Wrong" falls into this category is of course a question for you, the reader, to decide). Many more are simply Web versions of the moderated electronic bulletin board, a medium that has been used for decades now by technical specialists and hobbyist-enthusiasts to disseminate information and host discussions targeted at a particular narrow audience.

The relatively small collection of political and media-critic blogs that have risen to such spectacular popularity and prominence lately are simply new-technology equivalents of the low-budget "alternative" publications that sprang up in the political ferment of the late '60s and early '70s. Like bloggers, "alternative" journalists used the cheapest publication medium at hand to give expression to a nascent political faction that just happened to catch on and mount a serious challenge to one of the dominant political coalitions of the day. Back then, it was a radical-left takeover by educated upscale youth of the previously working-class, mildly populist institutions of the liberal-Democratic coalition; today, educated, upscale youth on the libertarian right are hoping to conquer the distinctly working-class, mildly populist institutions of the conservative-Republican coalition.

If they succeed, then blogs will be the Rolling Stone, Village Voice and urban "alternative weeklies" of the early 21st century--an established, commercially successful media vehicle catering to a particular political demographic. And if they fail, then blogging will likely simply return to its obscure specialist-and-vanity roots, until the next political upheaval--or the next technology--takes over. Either way, it will be the content it carries, not the blog form itself, that will determine its fate.