Perhaps the saddest postscript to the appalling murder of journalist Daniel Pearl was his father's anger at the Israeli press for revealing his son's Israeli citizenship, thus possibly discouraging Pakistani cooperation with the investigation into his death. Back in 1948, when the state of Israel was founded, Jews the world over dared to hope that after 2000 stateless years, they would at last be freed from their constant fear of being singled out as targets of rage and violence. With Israel as both a symbolic and physical refuge from attack, they thought, they might finally be able to display their identity openly, safely and with pride. Instead, an Israeli professor in America is now desperately trying to cover up his family's roots, fearing the repercussions of the new anti-Semitism, which, far from succumbing to Israel's existence, has merely grown to encompass it.
Senseless hatred is an eternal, ubiquitous human sin, and as Americans have recently learned, it targets the strong and the weak alike. Nobody understands its persistence better than its longest-suffering victim: the very nation that first conceived, long ago, the beautiful dream that it might one day disappear forever.
Sunday, February 24, 2002
Monday, February 18, 2002
In a longwinded, snide diatribe against Bill Moyers, The Weekly Standard's Stephen F. Hayes both misses and demonstrates the damning case against his target. Hayes, like a good investigative reporter, "follows the money", and demonstrates that--well, Moyers makes a lot of it, and runs a foundation that gives a lot of it out. Hayes never really explains why anybody should complain about that (although he does note the irony of Moyers screaming about obscene profiteers from the horror of September 11th, while himself refusing to specify just how much he's made off his television series and related materials on the subject). Moyers' reply also takes it as a given that Hayes' characterization is (unfairly) damning, without explaining what about it, exactly, is so objectionable.
But one small incident in Hayes' article is much more telling than all his casting of aspersions about Moyers' tycoonhood: asked about his 2001 PBS documentary on the chemical industry in which no voices from said industry were heard for the first ninety minutes of the broadcast, Moyers defended his decision, saying, "This is an investigative documentary, not a debate....We wanted to make sure of our reporting and make sure we had our facts laid down and then we wanted the industry to have the chance to respond to our reporting."
Now, I have no fundamental objections to open, earnest advocacy in journalism; some of journalism's great masterpieces, after all, have taken that form. And of course I also heartily applaud the efforts of those journalists who strive to make their reporting a dispassionately objective account of the facts--or, better still, to present the various sides of a conflict with careful neutrality. The hybrid known as "investigative journalism", however, falls into neither of these categories. (The term itself is suspiciously redundant, since the vast majority of journalistic work obviously includes at least some component of investigation.) In practice, it actually combines the worst of both worlds: open advocacy unscrupulously disguised as dispassionate reporting.
The form is founded on a simple underlying narrative: an intrepid, crusading journalist "uncovers" the alleged evil activities of some respected person or group. The reportorial details are actually secondary to the moral and dramatic thrust of the story, in which a previously unblemished pillar of society is gradually revealed to be a perpetrator of monstrous malfeasance. The supposed zenith of investigative journalism was the Watergate scandal; since then, most journalists believe, it has suffered a steady decline that reached its nadir with the Monica Lewinsky scandal. But the two stories (and that's what they were, first and foremost--stories) both stuck resolutely to the investigative script. Indeed, the political survival of President Clinton stemmed directly from his success at peddling an alternative "investigative" story--the alleged perfidy of Special Prosecutor Kenneth Starr.
It is this Manichaean worldview--not the source or the management of his funds--that is the greatest flaw in Bill Moyers' journalistic output, whether his topic is the chemical industry, campaign finance reform, or anything else. And Stephen Hayes would have been well-advised to address it head-on, either objectively or as an open partisan, instead of launching his own little attempt at muckraking. Sadly, it seems that straying from the "investigative" formula is too much to ask of a journalist these days. Luckily for readers of this blog, I'm not a journalist.
Friday, February 15, 2002
I have a wonderful idea for a television show: a small group of participants would be gathered together at some exotic location for a series of contrived physical competitions. The contests themselves would be a mere pretext, of course; the core of the show would involve a voting system by which the participants expressed approval or disapproval for each other. By forming alliances and politicking vigorously, they could conspire to eliminate weak or unpopular members of the group--regardless of their showing in the competitions--until the person able to establish the strongest coalition would ultimately be crowned champion.
I know, there's already a show just like that. But mine would be televised more often than once every four years, and the competitive events wouldn't be limited to figure skating.
Thursday, February 14, 2002
The leadership of the American Bar Association has apparently just passed a resolution demanding that captured members of Al Qaida be treated like common criminals, and the Wall Street Journal's editorialists are angry at them for it. More specifically, the ABA has demanded that the standards of justice applied to bin Laden's operatives be those of domestic criminal courts, not military tribunals. What they are implicitly admitting, of course--along with the Journal and pretty much every other conscious being in the known universe--is that common criminals in America receive far more lenient treatment than, say, those accused under military jurisdiction (the latter including not only POWs but also friendly soldiers).
There was a time, mind you, when criminals were treated like--well, criminals, and soldiers (of any nation) were actually accorded, in some respects, a higher degree of dignity. Today, though, military tribunals are notorious chiefly for their habit of actually convicting and punishing wrongdoers, whereas the civilian criminal justice system is equally notorious for allowing scofflaws free rein on the oddest pretexts. What should be disturbing to Americans is not that Al Qaida terrorists might end up being tried in one or the other venue, but rather that one of those court systems--the one ordinary citizens count on to protect them from domestic thugs and hoodlums--is understood to be quite capable of offering the terrorists a relatively cushy deal. Sure makes one sleep well at night....
Wednesday, February 13, 2002
Mickey Kaus points out an interesting aspect of a loophole proposed by House supporters of campaign finance reform. The new "millionaire opponent" loophole allows candidates to triple their expenditures, among other concessions, if they are facing an opponent spending at least $350,000 from his or her personal wealth on the campaign. The rule is ostensibly intended to reduce the advantage of individual rich candidates--who may spend as much of their own money as they please--over those who must meet their expenses by collecting donations. In fact, as Kaus points out, it could also make it possible for a wealthy backer to contribute $350,000 to a campaign while simultaneously increasing the supported candidate's spending power--by posing as an "independent" opposing candidate rather than as a mere supporter, and spending the money on "campaign ads" that actually favor the other candidate.
This loophole in the loophole illustrates yet again the basic problem with "demand-side" campaign finance reform: there's nothing really wrong with spending lots of money expressing political opinions like, "I think everyone should vote for me". That's why the "millionaire opponent" problem can't be solved by simply outlawing millionaire opponents; a law forbidding an individual to spend his or her own money to express such an opinion would be a blatantly stifling restriction on exactly the kind of free speech (indeed, arguably the only kind) that is vital to democracy. (For the same reason, many observers of campaign finance reform, including Kaus--not to mention myself--have long pointed to the "independent expenditure" loophole as a reform-killer. Who, after all, is going to prevent a rich person--or rich organization--from independently deciding to buy ads saying, "I think candidate X is dead right on the following issues", or "X's opponent is all wrong on the following issues"?)
The solution, of course, is to forget about the "demand side" altogether, and concentrate on the "supply side", since that's where the corruption supposedly occurs, anyway--rich people or organizations buying political favors with huge campaign contributions. (And a lot of it probably does occur--not with respect to high-profile issues, where attentive public opinion forces politicians to pander to voters over donors, but rather when obscure but crucial laws are being decided--details of the tax code, for instance.) Suppose, for example, that the government simply reimbursed each candidate's campaign expenses up to some very large limit--enough to run a serious modern, television-oriented campaign--provided the candidate achieved some threshold percentage of the popular vote. He or she could still legally accept private donations into a war chest, of course, but it would no longer be necessary, and soon enough a candidate would make a big issue out of not accepting any. (That can't happen today, because everybody is forced to go begging to wealthy donors--whether individuals or organizations--in order to be able to afford even a shoestring campaign, and saying, "my sugar-daddies are less venal than yours" will never be nearly as convincing as "you're being bribed and I'm not".) Even better, the FCC could require television networks, as a condition of license renewal, to reimburse paid airtime (or simply donate it, as they must in many other countries) according to a similar formula.
Unfortunately, Americans would apparently rather let their politicians sell themselves to the highest bidder, or even muzzle them outright, than pay the relative pittance it would cost to make political campaigns free and honest. Go figure.
Monday, February 11, 2002
The case of Sami al-Arian is one of those massive collisions of grandiose principles that inevitably ends with logic and consistency lying mangled in the middle of the road. Al-Arian, a professor of computer science at the University of South Florida for the past decade and a half, had, according to investigative journalist Steven Emerson, a rather unconventional hobby: he was a local organizer and fundraiser for the Palestinian terrorist group Islamic Jihad. Until recently, his employers never bothered to interfere with his extracurricular activities. But after FOX commentator Bill O'Reilly brought them to light in the course of a television interview with him in late September of 2001, all hell broke loose on campus, and USF president Judy Genshaft decided to fire him for straying "outside the scope of his employment" in a way that was disruptive to the smooth functioning and security of the university.
This action in turn prompted howls of outrage, not only from the left (all the way from The Progressive, crying "McCarthyism", to the New York Times, defending "[f]ree speech and academic freedom"), but also on the right, with Ronald Radosh arguing that Genshaft's stand, if emulated, "would in fact lead to the end of the free university in the United States", and Daniel Pipes dismissing her arguments as "poor excuses". Both of the latter gentlemen, on the other hand, argued, along with David Tell of the Weekly Standard, that al-Arian should have been fired anyway, and perhaps worse, for providing active help to a terrorist group.
All of these commentators, with the exception of Tell, explicitly invoked either the First Amendment or the principle of academic freedom in arguing against the university's claimed grounds for firing him. Moreover, in keeping with their reading of these freedoms, none of them attached the slightest importance to whether any of al-Arian's words or deeds ever bled into his university activities, or whether they were all kept entirely separate. In fact, al-Arian apparently played some role in arranging for one of Islamic Jihad's front organizations to become affiliated with USF, and also tried to get a member of that organization, one Ramadan Abdullah Shallah, hired as a professor of Middle East studies there. (Shallah eventually moved to Syria, where he became secretary-general of Islamic Jihad--though probably without tenure.)
But putting aside, for the moment, the specific case at hand: what if, say, a USF janitor were discovered to be spending his spare time as an activist and public advocate for (or simply a member of) the Ku Klux Klan, or a club specializing in violent pornography, or some other legal-but-widely-reviled organization? Conversely, what if a USF professor were to wangle a USF affiliation, and its attendant privileges, for a "think tank" that was actually secretly an arm of, say, the Republican Party, and to attempt to sneak a local Republican Party chair into a professorship in the department of political science? Wouldn't the university's academic integrity and stature be much more threatened, and its administration thus much more justified in taking action, in the latter case than in the former?
The shocking truth is that until this past fall, so miserably did America fail to take seriously the terrorist threat that al-Arian's extensive efforts on behalf of Islamic Jihad were in all likelihood perfectly legal. And post-9/11 sensibilities notwithstanding, it is not the proper role of any employer to decide on the legality (retroactive or otherwise) of its employees' extra-vocational activities--for reasons of basic privacy that have nothing to do with either the First Amendment or academic freedom.
On the other hand, for a professor to attempt to co-opt the university's name, funds or other resources on behalf of a partisan political (let alone terrorist) organization is--or at least ought to be--a firing offense. That it is in fact an offense ubiquitously committed, often applauded and almost never punished, is merely a searing indictment of the modern university, and of the defenders of "academic freedom" who have so thoroughly corrupted it. Tenure and its accompanying protection are supposed to be granted as part of a noble bargain, in which the academic seeker-of-truth renounces all loyalties and attachments in return for the respect and protection accorded the dedicatedly detached. Instead, tenure is generally viewed as a safe, sheltered platform from which to participate unabashedly in partisan political combat--both on campus and off. Thus, a professor who attempts to insinuate a foreign political group's influence into his university runs into opposition only when said organization is crass enough to kill random civilians. And even then, his shameful betrayal of academic principles is widely defended under the cynical banner of "academic freedom".
Some agree with president Genshaft that Dr. al-Arian should have been fired. Others believe that Genshaft should have been the one ousted. As far as I'm concerned, both sides are right--and a lot more should follow them.
Sunday, February 10, 2002
I've long argued that the four-tier structure that dominated American politics from the late sixties through the early nineties (the wealthy/blue-collar "conservative" alliance vs. the poor/white-collar "liberal" alliance) has since given way to the better-known, simplified two-level "blue-red" polarization. (During the recent economic boom, the interests of the investment-heavy white-collar middle class merged with those of the wealthy to form the ruling "blues", and the now-employed poor merged with the struggling blue-collar middle class to form the subordinate "reds".) But the old alignment still resonates powerfully for those whose political beliefs were formed under its influence. A conspicuous example is the bizarre conflation of upper-crust elite disdain for ordinary people with lower-class ignorance of middlebrow culture that seems to grip liberal critics whenever George W. Bush comes up in conversation.
The confusion may have originated with George Bush Sr., who tried (unsuccessfully) to accommodate both prongs of the old conservative alliance by alternating between his Yale-educated statesman's mantle and a pork rind-eating Texan persona. But the younger George's image suffers from no such ambiguity; he embraces his red(neck)-state populism wholeheartedly, without a trace of Yalie snobbery. Thus the double-barrelled ("upper-class/lower-class") assault aimed implicitly at the father (you can hear Bill Maher repeating it just about any night on "Politically Incorrect") is even more jarring when directed at the son.
A perfect example is this article in the New York Daily News about an upcoming Frank Bruni biography of the president. Bruni apparently characterizes Bush as "a stranger to America outside his own upper-class WASP background". What evidence does he unearth to illustrate this aristocratic detachment from the common people? Well, "Bush viewed the musical 'Cats' as modern theater at its finest", and "openly admitted that martial artist Chuck Norris was his favorite film actor". He knew nothing about HBO's "Sex and the City", and was unfamiliar with words like "vegan" and "yenta". He likes peanut butter sandwiches, Fritos and Cheez Doodles.
It's not surprising, of course, that snobbish New Yorkers would recoil in disgust at such plebeian tastes. What's absurd is the suggestion that there is even a trace of resentment of privilege mixed in with their arrogant disdain. If proof were still needed of America's completed transition from interlocking socioeconomic alliances to a straightforward two-class division--with the current president among, and representing, the lower class, regardless of his personal fortune--this article surely clinches the argument.
Saturday, February 09, 2002
The outburst of vituperation of American foreign policy that has recently emanated from a collection of European political officials is remarkable--not for its hectoring tone (I discussed that in a previous post), but for its extraordinary frankness. On-the-record language like "absolutist and simplistic", "unilateral overdrive", "more rhetoric than substance", and "hard to believe that's a thought-through policy" is not uncommon from tinpot dictators talking about each other, and occasionally heard from Western democrats talking about tinpot dictators, but one can scarcely imagine, say, anyone in the US State Department openly referring in similar terms to an industrialized democratic ally.
The frankness, moreover, is not just a matter of language; even more striking was the cold, unabashed disdain for anything that might resemble principle or idealism. EU External Relations Commissioner Chris Patten, most famous for his courageous defense of Hong Kong's nascent democratic institutions, is now standing up for "constructive engagement" towards Iran and North Korea--not because he deems it more morally justified, but because he claims that it is more likely to be effective (at what, exactly, he doesn't even bother to say). French Premier Lionel Jospin sounded a loftier note--but only slightly; although he expressed concern about excessive use of military action, his chief worry was apparently America's lack of commitment to multilateralism--a diplomatic value but hardly a fundamental moral one.
The most unrestrained commentary, though, both in its fury and in its cynicism, emanated from an unnamed EU official, who complained that "[i]t is humiliating and demeaning if we feel we have to go and get our homework marked by Dick Cheney and Condi Rice". There we have it--not even a scintilla of a pretense of high-mindedness, only the unconcealed bitter resentment of a minor functionary representing a stagnating region of scorned demi-powers. The bitterness is understandable, to be sure. And in the wake of America's spectacular success in Afghanistan--powerful testimony to both the moral and practical superiority of American muscularity over Europe's accommodationism--the traditional haughty moral excuses for criticizing America are much harder to substantiate. But if there had ever been any doubt that Europe's internationalist rhetoric about postwar American power has been motivated more by catty parochialism than by genuine principle, that doubt has now been finally, utterly erased.
Wednesday, February 06, 2002
A little over a year ago, a young Englishwoman named Claire Swire sent a naughty email to her then-current beau, a Mr. Bradley Chait, which in a moment of bravado he recklessly forwarded to a few close male friends. One of them apparently passed it on in turn, and soon pretty much the whole world (including readers of several newspapers) knew more about this couple's private life than any normal person would want to reveal. The lady was forced briefly into hiding, and the gentleman quickly became one of the most reviled men in London.
I remember thinking at the time that this was surely only the beginning of a huge flood of such events, the social rules of email having grown and developed far more slowly than its rapidly-exploding use. A similar (though far less embarrassing) event has now overtaken the conservative power couple, David Frum and Danielle Crittenden. Frum, a presidential speechwriter, is allegedly responsible for a well-known phrase spoken in the recent State of the Union address, and his proud wife decided to confide the triumph to some friends and family--one of whom, it seems, had all the discretion of Bradley Chait's friend. The result: an item by Timothy Noah in the online magazine Slate, prompting an angry scolding from Andrew Sullivan for intruding on Crittenden's privacy.
Now, it's impossible to say, really, whether Noah crossed a line in publishing this detail; after all, the lines simply haven't been drawn yet. What does one do if one comes across a compromising forwarded email? Is it the sender's fault for not choosing trustworthy recipients? Is the perfidious "first forwarder" the real villain? Or is there sin only in transferring the knowledge from the unofficial world of forwarded email to the official one of for-the-record publishing? Does a Website (or blog) count as such? Does it matter whether the original sender is a public figure? Or whether the contents involve public affairs or purely private ones?
I don't know how society will end up resolving all these questions, but I strongly suspect we'll end up mulling over many more such incidents before our collective judgment is passed. And then we can proceed to the next twist: suppose the alleged sender claims the original email was forged....
Tuesday, February 05, 2002
The Houston Astros naturally want to rename Enron Field, but the bankrupt company apparently just doesn't want to let go. Perhaps there's room for compromise, though--anybody for "Enron Memorial Stadium"? How about "The Skilling Fields"? Or "Lay Sod"? "Acc. Arthur Park"? Maybe "False Hope Diamond"? The possibilities are endless....
Slate's Anne Applebaum argues that the US should not be so quick to forsake multilateralism in the war on terror, because it needs European cooperation on the police, financial and propaganda fronts. But such cooperation is not seriously in doubt, given that it is so obviously in Europe's own self-interest. The only real question is how much over-the-top hypocritical griping the US will have to endure in the meantime.
The key transatlantic lesson of the Cold War, lest we forget, is that Europe resembles a tempestuous girlfriend who enjoys flirting with the enemy and getting under America's skin as often as possible--short of actually provoking the US to abandon her to her own devices. US unilateralism is thus, in the end, preferable for everyone; the Americans do what needs to be done, and the Europeans get to whine brazenly about the unrefined American brutishness that keeps them so safe. On the other hand, in those few cases when multilateralism turns out to be necessary, Europe will first work herself into a lather of tortured ambivalence before eventually doing what she knows she must (hosting American intermediate-range missiles, for instance, or rounding up Islamist terrorist cells). And even so, she will never quite forgive America for making her admit that that's what she wanted and needed all along.
Sunday, February 03, 2002
I don't watch fictional television, but judging by the promos I've seen lately, there seems to be a strange trend in dramatic series. It started with the tough-talking lawyer shows--"L.A. Law" and its imitators ("The Practice", "The Guardian", "Philly", "Family Law"), and continued with "JAG" (tough-talking military lawyers), "CSI" (tough-talking forensic lab technicians), "Boston Public" (tough-talking schoolteachers), "First Monday" (tough-talking Supreme Court justices), and now "The American Embassy", which I gather is a show about tough-talking State Department bureaucrats. Coming soon: "Norbert Smedley, Investigative Tax Accountant"....
Saturday, February 02, 2002
I have enormous sympathy for irate New Yorkers (first, for having to live in New York, but also for) having to deal with a huge throng of ignorant, immature, misbehaving kids marching through their streets and disrupting the sense of public security that was only recently returning to that city after Sept. 11th. But I wonder, reading a bitter New Yorker's angry rant at the WEF protesters, if she and her fellow defenders of public order notice how much they sound like their counterparts of thirty-five years ago, venting their fury at the far more dangerous and disruptive protesters of an earlier era--and how much those now-celebrated protesters actually resembled the current ones in their political confusion and irresponsible disruptiveness.
Boomers like to think, of course, that there's a clear distinction between their own youthfully idealistic fight for a better world against narrow-minded defenders of a corrupt status quo, on the one hand, and today's rowdy hoodlums disturbing the public peace and spouting radical nonsense, on the other. (And being the dominant demographic group throughout their lives, they get to decide society's--and hence history's--verdict in each case.) But the striking parallels between the eras demonstrate that the true difference turns largely on the perspective of age; society's middle-aged, middle-class solid citizens will inevitably react more negatively to the threat of social disorder from radical youths than will its rebellious, coddled teenagers.
After all, it's true of America in 1967 and 2002, with the same people in both roles. That's about as close to empirical proof of a sociological claim by controlled experiment as we'll ever get.
It's understandable that conservatives would indulge in a certain amount of gloating over the Bellesiles controversy; when a claimed debunking of an important conservative premise (that guns have been deeply embedded in American culture from the very beginning) is revealed to be fraudulent, a touch of smugness on their part may even be in order. But one taunt that rings slightly false is the one echoed most recently by Ronald Radosh: that historians were negligent in heaping accolades on the book without carefully checking its sources. After all, Bellesiles' book, Arming America, was only published in 2000, and two years is a relatively short time in the world of academic assessment. Moreover, when a researcher presents what appears to be a meticulously researched work, peers are understandably inclined to assume that the masses of accompanying documentation, much of it obscure and hard to track down, are not in fact an elaborately constructed fabrication. Finally, when suspicious investigators did eventually recognize significant discrepancies in Bellesiles' list of claimed sources, prominent historians stepped in to evaluate the charges, and several have now pronounced Bellesiles culpable. In all likelihood, his career is effectively ruined.
Compare this sequence of events with what occurred when researcher David Stoll discovered three years ago that the autobiography of Nobel Prize-winning Guatemalan activist Rigoberta Menchu was rife with outright falsehoods. In that case, not only had the blatant lies been left unexamined for a full sixteen years after the book's publication, but even when they were exposed, numerous scholars dismissed or downplayed the claimed inaccuracies as irrelevant, and defended the memoir as true in spirit if not in detail. Viewed in the context of the Menchu affair, the academic reaction to the Bellesiles case is almost enough to foster hope for the future of integrity in scholarship.
More likely, though, it is a demonstration of the difference, not between now and 1999, but between the fields of American history and Latin American studies. In America, the only people who would ever have had the slightest interest in Rigoberta Menchu's ghostwritten doctrinaire screed in the first place would have been committed leftist political activists and their academic allies, most or all of them disinclined to let a mere detail like factual inaccuracy interfere with their political agenda. Early American history, on the other hand, has a following audience of millions of teachers, students, readers, amateurs and generally interested people who actually care about what happened, and to whom credentialed historians must justify themselves, to some extent, in order to retain their reputations in the field. In such an environment, identification and punishment of scholarly misdeeds is far more likely to be swift, effective, and supported by consensus.
The story of the decline of liberal arts scholarship over the last thirty years is the story of a collection of academic disciplines bereft of purpose slowly sinking into the mire of aimless, meaningless, contentless self-absorption. Against these stand the few exceptional fields in which genuine demand from without fuels serious scholarly activity and motivates rigor and originality. L'affaire Bellesiles, far from giving the profession a black eye, actually demonstrates that American History is one of those areas still in the latter camp, and that academic disciplines, given a little bit of external oversight to keep them from yielding to temptation (and enough time to self-correct), can sometimes behave quite honorably.
Friday, February 01, 2002
"'You asked me once,' said O'Brien, 'what was in Room 101. I told you that you knew the answer already. Everyone knows it. The thing that is in Room 101 is the worst thing in the world.'....
"'The worst thing in the world,' said O'Brien, 'varies from individual to individual. It may be burial alive, or death by fire, or by drowning, or by impalement, or fifty other deaths. There are cases where it is some quite trivial thing, not even fatal.'"
-- Orwell, 1984
In the Weekly Standard, Victorino Matus makes an interesting observation about the interrogation of Pakistani terrorist suspect Hakim Abdul Murad (later convicted) in the Philippines in 1995. Apparently, the man withstood incredibly brutal physical torture, but broke completely when threatened with being turned over to....the Mossad.
Now I don't know much about the Mossad's interrogation techniques (nor, I'm sure, did Murad); in fact, I doubt that they do much interrogating at all--questioning Palestinian suspects, for example, is primarily the job of Israel's internal security service, the Shin Bet. But the idea that the Mossad is such a feared organization among Islamist terrorists raises an interesting question about the psychology of terrorism: is the irrational, violent hatred displayed by terrorists the flip-side of an irrational, paralyzing fear? If so, what is it, exactly, about Israel that Hakim Abdul Murad and his colleagues fear so much? And can it be exploited in the effort to defeat them?