Andrew Sullivan is baffled that supposedly tough soldiers seem so skittish about the idea of accepting the presence of openly gay men in their midst. "Is it because they're afraid of being raped?" he asks. "C'mon. Assuming all gay men - or even any - are potential rapists is completely loopy. (And the same people who make this bizarre argument would scoff at a woman who screamed rape if a man looked at her in a sexually interested way.)"
Oddly enough, Sullivan doesn't evince the slightest mystification over the elaborate lengths to which the military goes to protect women--tough, hardened military women, mind you--from invasions of their privacy by men. Why aren't military showers, bunks and latrines co-ed? And why all the draconian rules against "fraternization" and so on? Are these women soldiers afraid of being raped by their well-disciplined colleagues? Are they afraid of being looked at by men with lust in their hearts?
Well, yes, actually--and understandably so. One doesn't have to believe that all straight men are rapists, or that the male sexual gaze is inherently brutalizing, to understand why women (in this culture, at least) feel unsafe bathing naked around male soldiers (or groups of them). It doesn't matter if most of the time, nothing untoward happens; it only takes one major incident--or a long-enough sequence of small, subtle, incremental steps--for all assumptions of safety to break down completely. (If you're a weakling to be bothered by glances, after all, then what about playful pats on the shoulder? Or elsewhere? Where does the line get drawn? How? And by whom?)
The instinctive anticipation of this threat of sudden breach or gradual erosion of personal safety is likely the source of that general feeling of discomfort that causes women to want to guard their privacy from men when forced into close quarters with them. It shouldn't be surprising, then, that men would want to take the same precautions with respect to gay men; after all, gay men may not be substantially worse than straight men in this regard, but there's no reason to believe they're any better (and, as straight men themselves are well aware, that's plenty bad enough, in the worst cases).
Of course, military training is designed to break down instinctive anticipations and general feelings of discomfort and replace them with rigorous discipline; and if it turned out one day to be militarily necessary to drill soldiers to get over their discomfort around gay comrades the way they get over, say, terror of enemy fire, then the army would no doubt do what had to be done. But each such psychic hardship imposed on soldiers exacts a toll, and for the military to wish to avoid an avoidable one, so as to be able to concentrate on the unavoidable ones, is hardly a demonstration of bigotry or cowardice. Rather, it demonstrates a recognition of, and respect for, the limits and costs of discipline, and an unwillingness to bury or deny those costs in order to indulge various strains of political dogma.
Tuesday, November 19, 2002
In Ha'aretz, Danny Rubinstein argues that Palestinians would actually prefer a right-wing victory in the upcoming elections, because Sharon "did not succeed in reducing the violence or stopping the terrorist attacks", and his continued rule "is the only way Israelis will learn how powerless the right really is - and may in turn germinate the seeds of a just settlement." He may well be correct; the problem arises when one considers just how the Palestinians define a "just settlement". After all, if the definition looked anything like the permanent settlement Ehud Barak offered--or even the unilateral withdrawal being proposed by Labor leadership candidate Amram Mitzna--then surely the Palestinians would be rooting for a Mitzna victory. If Rubinstein is right, then, the Palestinian notion of a "just settlement" must be a much more far-reaching capitulation than even Labor doves are willing to contemplate--that is, something that most Israelis (understandably) consider tantamount to acquiescing in Israel's complete destruction.
I asserted a few months ago that according to "the solid majority view" among Israelis, they are already "the undeclared winning side in the conflict" of the last two years with the Palestinians. But there is a subtle assumption buried in that view: that Israel can continue to impose and even tighten its crackdown on the occupied territories indefinitely, until its residents eventually stop seeing their suffering as a worthwhile price to pay for maintaining their low-level campaign of terrorism against Israel. This assumption may be correct; but it's also possible that the current willingness of (according to polls) a majority of Palestinians to endure hardship of the worst sort, just for the sake of persevering in their efforts to kill as many Jews as possible, will continue for years to come. In that case, Israel faces a long period of walking on an extremely slippery tightrope between, on the one hand, indulging the temptation to resort to extreme cruelty in an attempt to hasten the moment of Palestinian abandonment of terrorism, and on the other, indulging the temptation to forget--as so many Israelis did from 1993 to 2000--that conciliatory concessions to a polity that enthusiastically embraces mass murder are ultimately suicidal.
Sunday, November 17, 2002
NEW YORK (ICBW) -- In the wake of the spectacular opening-weekend success of the latest Harry Potter film, "Harry Potter and the Chamber of Secrets", authorities are bracing for the likely consequence: a spate of children injuring themselves while imitating Potter's magic feats. "We want to warn the public that magic is a dangerous business," said Albus Dumbledore, headmaster of Hogwarts School of Witchcraft and Wizardry, and a technical consultant to the filmmakers. "Untrained muggle children shouldn't even attempt to dabble in it."
After the release of the first film in the series, "Harry Potter and the Sorcerer's Stone", numerous young viewers sustained injuries uttering backfired spells and playing quidditch with dangerously underpowered homemade brooms. This time, there have already been scattered reports of flying-car accidents and careless petrifications. "The magic stunts performed in the film all involved qualified Hogwarts-trained professionals," explained Dumbledore. "But many youngsters see a group of child actors appearing to use powerful spells and potions, and figure, 'hey, I can do that.'"
In preparation for the film's opening, hospitals throughout the US have stocked up on mandrake root and phoenix tears, and Dumbledore said his staff will be available around the clock to handle emergencies. "But the best precaution," he reminds viewers, "is to stick to non-magical pursuits. It's funny, really--if our own students had a choice, most of them would neglect their magic completely and spend all day playing video games and chattering on their blasted cellphones."
Thursday, November 14, 2002
Mark Kleiman writes about the harrowing story of thimerosal, a mercury-based vaccine additive that some suspect is responsible for the epidemic of autism that appears to have broken out in California in the 1990s. Republican Congressman Dick Armey has slipped a provision into the new Homeland Security bill that, according to the New York Times, "was apparently intended to protect Eli Lilly, the pharmaceutical giant, from lawsuits over thimerosal". All very shocking--until, that is, one checks out the CDC's position on thimerosal....
Of course, I'm no toxicology expert, and if anybody can point me to actual, substantial evidence implicating thimerosal in vaccines as a cause of serious health problems, I'd be interested to hear about it. Perhaps, though, the CDC ought to be informed first.
"There is no evidence of harm caused by the minute doses of thimerosal in vaccines, except for minor effects like swelling and redness at the injection site due to sensitivity to thimerosal."

Now, I hold no brief for Dick Armey, and I don't care at all for special-favor clauses being sneaked into important legislation. But in the absence of proper tort reform (and in the presence of widespread hysteria about technology--including lifesaving technologies like vaccines), I would guess that it's more likely that this particular political move will end up saving lives (by impeding the onslaught of tort lawyers and junk-science scaremongers on the practice of universal vaccination) than that it will actually harm anyone.
Tuesday, November 12, 2002
The campaign to get universities to divest themselves of Israeli investments is heating up; both supporters and opponents are comparing it to the South Africa divestment campaign of the 1980's. And both sides are more right than they realize.
Of course, Israel is nothing like South Africa was. It's a full democracy with a universal franchise, not at all like the apartheid regime. It has recently engaged in a multi-year process of trying to set up the territories it occupies (as a result of a war provoked by legitimate casus belli) as an independent state, being stymied only by the refusal of the prospective government of that state to abandon terrorism against Israelis. Its own citizens are neither racially segregated nor otherwise politically oppressed, and have the full range of democratic freedoms, including speech and religion.
But then, South Africa wasn't a particularly obvious choice of target, either. It was hardly the worst human rights abuser of the era, even on its own continent. Opponents of the boycott routinely pointed out that South African Blacks were better off than they would have been in just about any other country in Africa, and the boycott itself caused no small amount of suffering among them. If one were to choose a political evil to target in the 1980's based on moral and humanitarian considerations, apartheid would have been a legitimate but relatively minor choice, paling by comparison with literally dozens of others.
But that's the dirty little secret of politically motivated boycotts: they are not primarily chosen on the strength of their justifications or the urgency of their goals. Rather, their adherents participate in the hope of making a political point in some other, entirely separate context. The South African boycott, in truth, was about many things--race relations in the US, Cold War geopolitics in the Third World, and anti-corporate populism, to name three--but the actual conditions of non-White South Africans were at best peripheral. Likewise, today's university divestment campaigns have many motivations--"anti-globalist" leftism, anti-Americanism, even, on the fringes, some anti-Semitism--but sincere concern for the plight of the Palestinians (for the vast majority of whom the Oslo process has been an unmitigated disaster that a boycott of Israel would likely only further exacerbate) can't be very high on the list. It is fortunate that some major academic leaders are seeing through the sophistries and rejecting the divestment movement's meretricious moral case.
Monday, November 11, 2002
The Canadian government's occasionally impolitic positions with respect to the war on terrorism, to which I've alluded previously, and which are the subject of a piece by Jonah Goldberg in the National Review, may be somewhat puzzling to Americans. I will try to explain (without excusing) them; the explanation may also provide useful insight into some other Western countries' strangely unsupportive attitude towards American efforts against terrorism.
The first thing that Americans should understand about Canadian politics is that Canadians are, by and large, a politically uncommitted bunch. Only a tiny fraction of the population belongs to a political party, and most of the rest are happy to vote for whichever party seems to be addressing the pocketbook issues of the day (or to be winning handily enough to be worth currying favor with, in the hopes of receiving a greater share of federal pork-barrel spending after the election). To the extent that there is any mass partisanship in federal politics, it is largely a matter of inter-regional conflict, with parties increasingly representing their regional power bases. Canadian foreign policy is simply not on the political radar screen, as Canadian voters understand perfectly well their country's utter insignificance in the geopolitical arena.
As a result, Canadian governments target their foreign policy largely at the small domestic constituency that actually cares about it. Naturally, this group disproportionately inhabits the academic and media worlds, where it clings, like its American and European counterparts, to a familiar breed of woolly-minded leftish anti-Americanism with the dogmatic uniformity typical of small, concentrated intellectual groups. Canadian journalists and academics are also somewhat self-selected for anti-Americanism, since the most successful among them usually have the option of enhancing their prestige and paychecks south of the border--an option many of them exercise, unless they are strongly disinclined to do so--and the less successful thus have ample cause for "sour grapes" resentment of an American cultural and intellectual pre-eminence that excludes them.
There is also a strain of anti-Americanism that runs through most segments of Canadian society, and that has little to justify it beyond common "us vs. them" home-team-rooting. It's not particularly intense or virulent, though, and it's counterbalanced by Canadians' general sense of neighborly good feeling towards folks south of the border. (A large fraction, after all, have friends or relatives in the US, visit often, and are deeply immersed in popular culture. Pernicious stereotypes about American national characteristics are hard to sustain under those conditions; one has to live among Americans for years, as I have, to develop them.) But on issues that don't really matter (and let's face it: what Canadian politicians have to say about world affairs almost never really matters), playing to anti-American peevishness rarely causes a politician lasting damage.
On matters of substance--i.e., action--though, I believe that a solid majority of Canadians invariably stand firmly with their American allies. They helped house stranded American travelers on September 11th, when American flights were grounded; their soldiers joined the US in the Afghanistan campaign; and they continue to cooperate with their neighbors on a variety of continent-wide security matters. The longest undefended border in the world will no doubt remain undefended--and friendly--for a long time to come.
Sunday, November 10, 2002
Thomas Friedman's at it again. His latest column says a lot of really silly things--such as (I'm not making this up) that the "Bush hard-liners" who hope to invade Iraq and topple Saddam Hussein don't "really want to invest in making the world a different place, or....have any imagination or inspiration to do so". (Wiser souls, he explains, appreciate the far greater world-changing power of--I'm not kidding--"diplomacy".) But his introductory paragraph repeats a canard whose absurdity will be apparent, sadly, to all too few readers. He cites, with approval, a "senior European diplomat" who complained that the Bush administration is failing to tell Israel that it "needs to find a secure way to get out of the settlements."
Now, it may well be that an Israeli-Palestinian peace agreement will one day be signed, one clause of which involves the evacuation of most or all of the Jews living in the West Bank and Gaza. (I'm skeptical--it seems unlikely that Palestinians would be willing to live at peace with a Jewish state a few miles away, but not with a Jewish neighborhood on the next hill--but I suppose it's still possible.) However, it doesn't make a lot of sense to start talking to the Israelis about dismantling settlements in the occupied territories when they're currently militarily occupying almost all the West Bank's major cities. As for the Palestinians, they conspicuously do not refer to the violence that first erupted in September 2000 as the "Settlements Intifada", were not responding at the time--even as a pretext--to a visit by Ariel Sharon to a settlement, never refer to removal of the settlements as their primary goal, and do not discriminate between civilians living in Israel proper and those in the territories when executing their terrorist attacks. Nor is there any indication that the failure of the Camp David and Taba negotiations hinged in any significant way on the settlements--an issue on which the Israelis were in fact quite flexible, to no avail.
The real importance of the settlements, though, lies in the role they play in Friedman's worldview and that of his European diplomatic friends. For them, it is crucial that they find something that Israel must be cajoled into conceding--otherwise, their negotiations-based strategy is self-evidently doomed to fecklessness, given the Palestinians' refusal to bow even to harsh Israeli military pressure, let alone to mere diplomatic pestering. But what Israeli concession can they possibly portray as a key goal for the diplomats? Military restraint is a non-starter, from Israel's point of view, since it's been repeatedly and amply proven to be of no use whatsoever in winning Palestinian reciprocity. Likewise, most of the generous long-term offers spurned at Camp David, such as full statehood and compromise on Jerusalem, have lost all their plausibility as bargaining chips in the eyes of Israelis.
Settlement-dismantling, on the other hand, is not a completely quixotic goal; it still retains a modest constituency within the Israeli body politic, mostly for various internal political reasons. Hence, if Friedman et al. can (mis-) represent settlement evacuation as the potential breakthrough step in a process of mutual compromise, then--voila!-- they can claim a vital role for diplomacy in resolving the conflict. It's a slender reed, to be sure; but it's the only one available, and without it, they would literally have no justification, however feeble, for trying to insinuate a diplomatic component into Israel's muscular (and comparatively far more effective) response to terrorism.
Monday, November 04, 2002
A few months ago, Slate somehow managed to cajole the unlucky Virginia Heffernan into subjecting herself to an episode of HBO's "Real Sex" television series and then reporting her impressions to readers. Her staggering conclusion: the program's "soft sociology provide[s] an excuse to look at soft pornography". The article must have drawn a high hit count, though, because now they've sent Emily Nussbaum to check out New York's Museum of Sex. While Ms. Nussbaum is actually fairly upbeat, describing one exhibit as "an impressive combination of titillating and educational", she can't quite disguise the museum's real target demographic, admitting that the guards "have to shoo people away" from the hardcore stuff. Odds are that most of the patrons "transformed....into zombies" by the "lurid close-up genital pistons of a Dolores Del Rio porn film" don't share Nussbaum's gender.
Why, then, did Slate twice send a woman to do a man's job? Perhaps because an honest male assessment of this ever-so-slightly-dressed-up smut would have to be brutally frank about its producers' obvious goals in presenting it--and hence, by implication, about Slate's obvious goals in reviewing it.
Sunday, November 03, 2002
Lisa Dusseault tells an amusing story about an encounter with Canadian tourists in San Francisco who, on hearing that she is an expatriate Canadian living in the US, commiserate with her plight. I understand her nonplussed reaction; I must at least partially disagree, however, with her claim that "it's not so different" living in the US, as opposed to Canada.
In one sense, she's clearly correct: many of the horror stories that Canadians tell each other (and themselves) about America are built around serious misunderstandings of their southern neighbor. I can say this with some authority, since I labored under several such illusions myself while growing up in Canada, and was only thoroughly disabused of them when I moved to the US. For example, Canadians hear about the horrific American crime rate, and assume that life in America is a daily crap-shoot for survival. In fact, the means to insulate oneself from crime--peaceful, safe suburban neighborhoods and well-protected shopping areas, workplaces and recreational districts--are readily available (and affordably accessible) to a very large fraction of Americans. Like Lisa, I have never been a crime victim in the US, and I certainly haven't had to spend a fortune to buy my safety, as Canadian myths would suggest.
Similarly, health care was once the canonical example of the contrast between terrifying American chaos and reassuring Canadian orderliness. Again, though, most Americans have employer-provided health coverage offering a level of protection comparable to the standard Canadian regime (indeed, arguably superior to it, given the tales I've heard lately about the collapsing Canadian health care system). And US politics, for all its faults, is much harder to criticize these days in light of the appalling way that Canadian officials have embarrassed themselves when commenting on American foreign policy.
And then there are the attractions of American life. For example, the US has a service ethic far superior to Canada's; being a customer of any kind in America is a real joy compared with Canada's more, uh, European approach to customer care. And, as Lisa points out, economic opportunity can also be considered a quality-of-life issue: a more enjoyable, interesting, challenging job represents a lifestyle improvement above and beyond any material standard-of-living increase it may provide.
And yet...there are real cultural differences that can make the adjustment to American life difficult for a born-and-bred Canadian, even after the misconceptions have been discounted. For example, I find American interpersonal culture to be marked by a peculiar level of unabashed self-centeredness and self-indulgence--a kind of naive thoughtlessness about the feelings and concerns of others when it interferes with one's own "pursuit of happiness". I'm not making a political statement here, or alluding to any grand philosophical principle; rather, I'm speaking of the very fabric of day-to-day American social interaction. I once had lunch in a restaurant in California with a group that included a German friend; when this friend needed to squeeze behind another diner's chair at the next table in order to leave the restaurant, the other diner, rather than shift his chair to allow my friend to leave, continued his conversation for several minutes, happily oblivious to the buttocks mere inches from the back of his head. Eventually my polite friend was forced to bring his problem explicitly to the gentleman's attention, at which point he was happy to assist by shifting his chair forward slightly. "Only in America", muttered my friend after we had left. I believe he's right; many other cultures tolerate behavior that North Americans might consider deliberately rude, but only in the US is it unsurprising that an otherwise non-hostile person would so egregiously fail, in all innocence, to take others' concerns into consideration at all. To someone raised on diffident Canadian politeness, the adjustment to this American-style solipsism can be difficult.
There are other differences, as well. America is a much more class-conscious society than Canada--one in which people are keenly aware of markers of social status. (I can't imagine a Canadian, for instance, conspicuously dropping blatant, smug references to his or her alma mater, the way many Ivy League-educated Americans seem to--sometimes literally within minutes of meeting me.) On the plus side, the level of diligence, industry and entrepreneurialism in the US far outstrips the Canadian norm. (The experience of shame at one's own laziness also requires some adjustment, as it turns out.)
None of these differences is in itself particularly taxing to deal with, of course; nor do they, taken together, justify receiving condolences from visiting fellow Canadians. But they do cause me to miss, on occasion, the country of my birth--and to experience a certain feeling of warm comfort on each return visit to the country I still think of, in a way, as home.
Friday, November 01, 2002
Mark Kleiman and Eugene Volokh are apparently both of the opinion that the Boy Scouts' policy of excluding atheists amounts to a kind of religious discrimination. (Both concede the Scouts' Constitutional right to their policy, but consider it morally wrong nonetheless.) As Volokh puts it, "[i]f the Scouts excluded Catholics -- everyone else, Jewish, Protestant, or what have you is fine, but not Catholics -- we'd rightly condemn them, even if they said 'Rejection of Catholicism is one of our core beliefs.' Likewise, I think, when they exclude atheists." (Kleiman makes the same point, right down to the choice of analogy.)
I rather doubt that the Scouts' opposition to atheism is as narrowly defined in practice as Kleiman and Volokh claim; would the Scouts accept, for example, a Satanist troop, or one that worships only a particular (living) charismatic cult leader? If the Scouts turn out to have bona fide doctrinal standards compatible with most religions but exclusive of a few, then they would be no different from a group that rejected, say, believers in performing child sacrifice rituals or murdering all heretics (except, of course, in the sense that Kleiman and Volokh probably find adherents of such ideas far more worthy of exclusion than those, including atheists, who happen to offend the Scouts' somewhat woolier religious principles).
I also wonder whether the two professors have paused to consider in just what company they have placed themselves with their choice of analogy. The most conspicuous advocates of the idea that atheism is a religious conviction--comparable to, say, Catholicism--are fundamentalist Christians attempting to inject "Creationism" into the public-school curriculum. After all, if atheism is a religion, just like literalist Christianity, then it's perfectly valid to claim that Darwin's theory of evolution is as much a religious position as is the "Genesis theory" of human origins. Likewise, if absence of religion is just another religious creed, then school voucher programs that encompass confessional schools are not only Constitutionally permissible--they might even be mandatory, under the judiciary's current broad reading of the First Amendment, to prevent the government from "establishing" atheism, over all other religious doctrines, as the official "faith" of the public school system.
But atheism is not a religion; the absence of religion is very different from the presence of one. Teaching evolutionary biology in science class, and rejecting all of its religious alternatives, is not the same as teaching a single religious alternative. Forbidding the promulgation of any religion in public schools is nothing at all like exclusively promulgating a single one. And likewise the Boy Scouts, in requiring their members to affiliate with a religion--any religion--are not "excluding" a particular religion. Kleiman and Volokh should be happy about that; the consequences of atheism being designated by convention as just another religious belief would be most unlikely to please either of them.
Monday, October 28, 2002
As I read the eulogies to the late Senator Paul Wellstone, I notice an odd absence: there seems to be precious little of what one would normally define as, well, praise. Here's a typical one, from Mickey Kaus: "He wasn't a poser, a trimmer, a schemer, a dissembler, a self-aggrandizing egomaniac or a vicious infighter." Joshua Micah Marshall: "Most successful pols are steely operators. Not a few act serious, without at all being serious, but are rather jokes and whores. Or if they're first-rate men or women they've long since gotten gated-off behind walls of flacks, caution and self-protection. Paul Wellstone just wasn't like that." And these are liberals (albeit somewhat centrist ones).
Now, part of the problem is that Wellstone didn't leave a spectacular legacy of public-sphere accomplishments. The encomia from several leftist allies of his at Mother Jones, for example, attest mainly to his personal warmth and his various quixotic political stances, apparently lacking a concrete achievement to celebrate. That's not intended as a criticism; being mourned as a beloved husband and father, dear friend to many, and general exemplar of integrity to all is in itself the kind of high honor to which anyone ought to aspire. It's just that one might expect members of the most exclusive club in the world to be associated with a somewhat lengthier list of specific acts of heroism, leadership, or generosity, especially if they are being widely and publicly mourned as outstanding men.
An obvious explanation for all the accolades is implied in Marshall's observation: Wellstone's decency was in itself a rare, and hence outstanding, accomplishment for a politician. But there must be more to the story--after all, politics is hardly the only profession known to attract a disproportionate share of creeps. Yet we don't see, say, CEOs of large corporations, Hollywood celebrities or rock stars receiving fulsome posthumous praise, despite a lack of notable achievements, simply for having been known to friends and family as sweet, fuzzily huggable all-around princes. Why, then, does a merely non-reptilian politician inspire such enthusiasm?
I think the answer lies in the popular mythology of (small-d) democratic politics--the "Mr. Smith Goes to Washington" ideal of a politician-as-ordinary-person humbly representing his fellow citizens in the halls of the powerful. CEOs and showbiz types are expected to be vain and ruthless more or less as a job requirement; but the slimeball politician, despite his ubiquity, is somehow seen as a disturbance in the natural order of things.
Perhaps this illusion is a necessary one, in order to prevent voters from losing faith in democracy altogether. (Then again, few abandon either capitalism or pop culture upon discovering its heroes' warts.) But even among those willing to recognize that the late Sen. Wellstone's principled personability and his political ineffectuality might have been related, it seems that not one of them is ready to concede that politics is no more about principle than is tycoonhood or stardom, and that it, no less than commerce or entertainment, is a vehicle by which the morally empty can still (under the right regime of constraints) play a useful role that benefits society. Instead, everyone hopes, searches, naively, desperately, for that absurd chimera: the brilliantly effective politician who's as honest, straightforward and principled as the late Paul Wellstone.
Friday, October 25, 2002
Devilishly clever monster that he is, the "beltway sniper" (along with his accomplice) seems to have timed his actions perfectly: first misleading foolish bloggers such as myself into pegging him as a possible terrorist, then humiliating us by shifting his behavior to fit the profile of a typical none-too-bright, rather disorganized lone nutbar. (Oddly enough, some bloggers, including Instapundit Glenn Reynolds, are still seeing hints of terrorism in the latest developments. We can only hope, I suppose, that all terrorist organizations are clueless enough to leave incoherent notes for the authorities, telephone them repeatedly with hints about their identities, and demand that $10 million be credited to their credit cards. Kudos to Susanna Cornett for her prescient early analysis.)
Still, the entire episode contains important lessons about terrorism and the best way to respond to it. After all, if a couple of lunatics could severely disrupt the entire capital region with a few random murders, what could a well-organized terrorist campaign do? And why was the public so terrified--terrorized, if you will--by a fairly small-scale random crime spree?
Two Washington Post opinion pieces published at the height of the hysteria, by Paul Appelbaum and, earlier, by Marjorie Williams, argued that the extreme public fear stemmed from the lack of a clear pattern that can allow people to reduce (or simply to tell themselves they're reducing) their risk by taking certain precautions. "This killer seems especially frightening for his apparent determination to mirror, in the randomness of his acts, the brute impartiality of death itself," writes Williams. "This fear...is worse than most because of the unpredictability of the threat," writes Appelbaum.
No doubt that's part of the story. But unpredictable dangers--lethal diseases, freak accidents, or outbursts of violence--are hardly unfamiliar, and many of them (and certainly all of them together) are no rarer or more escapable than the DC sniper. Indeed, they are often less so; there was in fact a fairly established pattern of behavior on the sniper's part that suggested some obvious techniques for minimizing one's risk of becoming his next victim. What was particularly disturbing about the sniper, I believe, was that the mystery of his motive implied an open-endedness about the scale of the danger that he (or his kind) posed. Had he been known to be a typical psychotic, homicidally disgruntled crackpot, serial killer or even terrorist, DC-area suburbanites would have had some idea of the expected scale, frequency and targets of his actions; these are by now well-studied types whose behavior we can at least measure and assess, even if we can't understand (much less predict) it. But the cold-blooded distance-killers in this case seemed more like a new, previously undiscovered disease, with an unknown etiology--and no one had any idea how bad the epidemic could eventually get. How long would he continue? Would he ever get caught? Would there be copycats, and if so, how many? Would the danger spread elsewhere, or to other targets?
The optimistic flip-side of this observation is that as time goes on, and a fairly clear pattern of events emerges, with the killer either being captured, discontinuing his attacks, or continuing them at a constant or diminishing rate, such dangers eventually pass into the realm of estimable risks, and the anxiety they trigger thus declines. (Think of the case of Israel, where far worse atrocities occur regularly, with far less public reaction--indeed, often virtually no reaction at all, as one Israeli blogger recently reported.) Thus, though the mood of terror gripping the capital region may have been intense during the snipers' rampage, it is unlikely that a similar sequence of attacks will ever again have such a paralyzing effect. In fact, terrorism in general seems to suffer from this fundamental flaw--that societies inevitably develop a tolerance for shock, and more and more extreme (and thus difficult and dangerous) acts of violence are thus necessary to effect the intended level of fear and despair.
If this analysis is correct, then the DC-area and federal authorities may have erred by keeping such a tight lid on the details of their investigation. The existence and contents of multiple tarot-card messages, for example, may have frightened the public further had they been revealed; but they might also have, by creating a kind of public "profile" of the killer, contributed to a general sense of his knowability, even predictability, and thus dispelled some of the more open-ended scenarios (numerous co-ordinated terrorist cells, for instance) that contributed greatly to the public's fear. Should another serious terrorist attack (God forbid) occur, the authorities may want to keep this effect in mind, as they consider what to reveal about their investigation.
Wednesday, October 23, 2002
Life imitates comedy...
"I began using the date rape drug Rohypnol. I took it twenty times. I didn't know you were supposed to give it to the woman."--From "The Autobiography of Larry Sanders", by Garry Shandling and David Rensin (quoted in Salon)
"Actor Nick Nolte was driving under the influence of the date-rape drug GHB when he was arrested last month, dazed and drooling, behind the wheel of his automobile, prosecutors charged today."--Reuters
Friday, October 18, 2002
It is not particularly shocking (pace some bloggers) that the New York Times would publish an opinion piece by Mohammed Aldouri, the Iraqi ambassador to the United Nations. After all, the op-ed page is meant to provide a forum for a variety of viewpoints, including, especially, those not normally given voice in the rest of the newspaper. Nor is it surprising that Ambassador Aldouri would complain that "[f]or more than 11 years, the people of Iraq have suffered under United Nations economic sanctions, which have been kept in place largely by American influence," and that "no American political figure has been seriously interested in discussing these matters with our government."
But consider for a moment how the Times reacted this past May, when Canadian political science professor Anne Bayefsky submitted an opinion piece on the UN's human rights activities. According to Prof. Bayefsky, the piece was accepted only on the condition "that [its] dynamic be significantly altered", and that numerous passages be deleted. After "six new drafts, four additional drafts with smaller changes and corrections, seven drafts from the editors and 6 hours of editing by telephone", the op-ed was finally published; excised were passages in the original that noted the membership of "some of the most notorious human rights violators in the world today: China, Cuba, Libya, Saudi Arabia, and Syria" in the UN Human Rights Commission, and the failure of that organization to take action regarding human rights violations in China, Syria or Iran.
Now, we don't know how ruthlessly Ambassador Aldouri's writing was edited prior to publication, and it is in any event entirely the Times' prerogative to control the contents of its newspaper as it pleases. However, we are also entitled to judge the Times based on its choices, and we should note that the Gray Lady is happy to publish criticism of American influence at the UN--but not of Cuban, Saudi or Syrian influence there; and of the failure of American officials to embrace the Iraqi government--but not of the failure of UN officials to embrace victims of Chinese or Syrian repression. Such decisions are simply incompatible with a spirit of diversity of opinion, and are extremely difficult to explain without positing a deliberate effort on the Times' part to protect and indulge several of the most brutal, murderous dictatorships on the face of the earth.
Wednesday, October 16, 2002
First of all, there was the timing: in concert, it seemed, with a sudden, deadly outburst of Al Qaida terrorist activity. Then there was the methodology: a strange combination of lone-nutbar tactics (cheesy, taunting warnings), extraordinary high-tech skill, and such meticulous planning and care that virtually no usefully traceable evidence has been found to date. And then there was the choice of victims: though all of Washington, DC was terrorized, the death toll was actually very small--a few random, unlucky souls who happened to be in the wrong place at the wrong time. And none of it fit any of the standard patterns: too impersonal for a typical serial killer; too indiscriminate for a revenge killer or a fanatic with an agenda; and too ineffective for a terrorist cell, which could no doubt easily cause far greater damage and mayhem--if those were really its goals--given the deadly techniques and skill levels it had displayed. It's no wonder, then, that although the FBI's primary hypothesis continues to characterize the perpetrator as a lone American-born male with a military background that provided him with the requisite skills, no solid evidence--let alone a culprit--has turned up to confirm or even support this guess.
I am referring, of course, to last fall's anthrax mailings. And one of the most conspicuous features of that spate of attacks, it should be recalled, was that it ended as quickly as it began--again, not fitting any of the typical patterns of a psychopath, domestic radical or foreign-based terrorist. I have a sneaking suspicion that the DC sniper will also suddenly halt his activities (if he hasn't already, now that the heat is on) and disappear without a trace. If so, then perhaps it is time to consider a new hypothesis: that some terrorist organization has decided to cheaply and anonymously generate periodic panics, rather than massive casualty counts.
The obvious next questions: "who?", and "why?"
Saturday, October 12, 2002
When Instapundit Glenn Reynolds agrees with the New York Times editorial page, something funny is surely afoot. In this case, both are encouraging the Supreme Court to overturn--on Constitutional grounds, mind you--a 1998 law extending the duration of copyrights by an additional twenty years after the creator's death. Their argument is that the Constitution's language empowers Congress to set copyright law for "limited times", in order "to promote the progress of science and the useful arts", and that extending copyrights on existing works undercuts both of these expressed intentions, by threatening to make copyright terms indefinite, and rewarding creators long after they have ceased to be able to respond to the increased incentive.
Now, I sympathize with these arguments, and freely concede that copyright protections might deserve some weakening, particularly with respect to duration. (Patents only get twenty years, after all.) But deciding the correct length of copyright terms is a matter of balancing the benefits to society from rewarding creators against those that the public garners by having free (or freer) access to their work. Gauging that balance is exactly the kind of public policy question that the courts are completely unqualified to decide, and should leave to the democratic process, in all its imperfect glory, to work out for itself. Sadly, the judiciary's hubris with respect to judging what's good for the public knows no bounds these days, and I have no confidence that the Supreme Court will exercise any uncharacteristic restraint in this case.
Monday, October 07, 2002
In Slate, Robert Weintraub claims that football quarterback-turned-commentator Boomer Esiason's stint on Monday Night Football in 1998-99 "was undercut by a frosty relationship with Al Michaels, major domo of MNF", who "froze out the cocky Esiason early in their tenure together". But, says Weintraub, things are much better now that Esiason is working with Marv Albert, who has the "ability to elicit the best out of whomever he is partnered with" (although not with his teeth this time, presumably).
Imagine--someone teaming up with Marv Albert as a way to avoid backbiting....
Sunday, October 06, 2002
A postscript to my previous posting: According to the New York Times, the New Jersey court system had already granted Torricelli's (and now Lautenberg's) Republican opponent, Douglas Forrester, an exemption a few months ago from the same law that the New Jersey Supreme Court ignored in allowing Lautenberg to replace Torricelli. Partisan Democrats like Mark Kleiman and Joshua Micah Marshall are waving this discovery as a victory banner, and it's certainly highly likely that the matter is now closed as a political issue. But as an actual defense of the court's behavior, the comparison is a pure "tu quoque" argument of no exculpatory value; interfering with the electoral process to grant injunctions in flagrant violation of state law is more--not less--outrageous if the court has actually done so twice rather than once.
The argument does, however, neatly illustrate the crucial interplay between ruthless partisanship and the erosion of democracy. The first time the New Jersey court rewrote the election statute, there was no significant outcry, presumably because the Republicans involved preferred to downplay an intraparty squabble rather than stand firm on a matter of principle. (Or perhaps, like so many Americans, they had simply been too inured by decades of judicial overreaching even to notice the shame of it anymore.) And the second time, when interests in the outcome split along party lines, most of those on the "victorious" side thought nothing of accepting (even glorying in) a partisan victory at the expense of respect for duly enacted legislation. Just as war results when at least one side of a conflict values victory over peace, democracy founders when at least one powerful partisan faction values victory over the preservation of the democratic process.
One of the few silver linings of Bush v. Gore was seeing some of my good progressive friends, raised from birth on liberal torturings of the Constitution and steeped in the belief that the judiciary is morally superior to the elected branches of government, suddenly feel a twinge of doubt creep into their blind faith in the Supreme Court. Sadly, that feeling vanished just about as quickly as it arose. In fact, the greater long-term effect seems to have been to further whet the appetites of conservatives; having long cultivated a bitter disdain for the activist judiciary as a decadent redoubt of rigid liberalism, they've now had a glimpse of what a few arrogant, dictatorial justices can do for them, and--wouldn't you know--they rather like it.
The result can be seen in a Washington Post "man in the street" piece which, says Joshua Micah Marshall, proves that "everyone but hardcore Republicans seems fine with" the New Jersey Supreme Court's ruling. In fact, the article portrays intensely cynical voters playing their appointed parts in the partisan charade: among Republicans, the article reports, "outrage was extreme", while "Democrats said overwhelmingly that they're so relieved to be rid of Torricelli that it cancels out their reservations on how it occurred." Is there any reason to believe that my scenario of a Supreme Court exploiting partisan divisions in the elected branches to seize control of its own succession would play out any differently?
Friday, October 04, 2002
The most appalling thing about the recent ruling of the New Jersey Supreme Court is how few people are appalled anymore when a court runs roughshod over the law. The New York Times was predictably satisfied with the ruling, in which the court interpreted the phrase, "not later than the 51st day before the election" in the New Jersey election statute to mean, "later than the 51st day before the election, if we feel like it", because "the greater need was to ensure 'full and fair ballot choice for the voters of New Jersey.'" (As Robert Hochmann pointed out in the Weekly Standard, the Times' notion of a "full and fair ballot choice" apparently includes a special slot on the ballot for a candidate to be selected by the New Jersey State Democratic Party leadership, irrespective of the state primary result; for that is what the ruling granted.) Joshua Micah Marshall termed it "a liberal, though not unreasonable, construction of the statute"--even as he dropped sarcastic comments about the travesty that was Bush v. Gore. (In other words, when judging judges, it all depends whose Bush is Gored.) The folks at The American Prospect's blog, Tapped, bought the court's supposed legal reasoning as well. The New Republic's blog conceded in passing that the ruling arguably "violates some cherished abstract principle like rule of law", but chose to concentrate instead on its practical harmlessness as a legal precedent. The most dignified response I've seen from a partisan Democrat is (perhaps unsurprisingly) that of Mark Kleiman, who noted with some discomfort that "the decision doesn't even pretend to interpret the statute", but otherwise accepted the good tidings philosophically, invoking Bush v. Gore and in effect saying, "this is the way we live now".
But even among those who disagree with the ruling, a shocking (but perhaps entirely predictable) number are treating the case not as a flagrant abuse of judicial authority, but rather as an incorrect, unwise, and possibly corrupt use of a legitimate judicial responsibility. Hochmann, for instance, feels compelled to argue that "the decision threatens to poison our electoral politics with last-minute manipulations", as if it would have been a completely legitimate overruling of the statute, but for the potentially unpleasant outcomes that render it dangerous. Eugene Volokh (a lawyer and a libertarian, to be sure, and hence entirely untrustworthy as a defender of democracy) even concedes that the court's "odd interpretation of the statute" was "not an utterly ridiculous one", and then constructs a long hypothetical meant to prove that the decision was nevertheless not necessarily a wise court's best option. All of this is a disturbing echo of the 2000 presidential election, in which Republicans burned by the Florida Supreme Court ran around talking about the sacredness of machine counts and appealing to the Supreme Court on completely implausible Constitutional grounds--happily jumping into the same jurisprudential mud that the Democrats had sullied themselves with--instead of defending the orderly functioning of the electoral process (as far as the House of Representatives, if necessary) unimpeded by the irresponsible meddling of power-mad judges.
Granted, there have been some voices (such as John Fund's, in the Wall Street Journal) willing to attack the judicialization of elections head-on. More typical, though, is a kind of cynical resignation, such as Robert Alt's in the National Review: "I know that it is too much to ask the court to actually apply the law, but....I expect judges to pretend that they are interpreting a statute, even if what they are really doing is rewriting it." (Alt also goes on to argue for the wisdom of the New Jersey statute's 51st-day deadline--again, as if the court would have been entitled to overturn an unwise one--and to suggest some potential legal justifications for a federal Supreme Court reversal.)
I take back everything I said about the elected branches of the US government being content to exercise their power vicariously, through their judicial selections. The situation is actually much worse: the day may not be far off when a president and Senate of the same party attempt to ram through a controversial judicial appointment, and the Supreme Court simply overrules their nomination, holding the vacancy open until such time as the executive and legislative branches are willing to do the Court's bidding and select a replacement deemed acceptable to it. Think about it--the outrage would most probably be brief, limited to whichever party deemed itself the "loser", and followed by the same muttering capitulation that we see today each time the courts take yet another step beyond the previous boundaries of their usurped powers. The long process of dismantling American democracy, and replacing it with pure judicial authoritarianism, would then, at last, be complete.
Tuesday, October 01, 2002
ANKARA (ICBW) - A canister intercepted near the Turkish border with Iraq, and originally believed to contain more than 33 pounds of weapons-grade uranium, has been discovered to be a souvenir paperweight.
Turkish customs officials were at first alarmed when they shook the roughly egg-shaped container and observed the tell-tale "blizzard" effect displayed only by highly radioactive nuclear bomb raw materials and inexpensive "tchotchkes".
Inspectors' suspicions were further aroused by what they termed the "transparently false" shipping label on the object, believed to have originated in one of the former Soviet republics, despite the "Niagara Falls, Canada" inscription along the bottom.
When asked to explain the smiling Mountie figurine at the center of the clear plastic ovoid, officials shrugged and replied, "surely we're not the first people to wonder about the composition of that white stuff that seems able to snow over and over again inside these things."