Mark Kleiman has been assiduously covering the intriguing story of a possible grave national security misstep on the part of the Bush administration. Apparently, a woman named Valerie Plame Wilson, who may or may not be a covert CIA agent, may or may not have had her cover blown by someone who may or may not be a senior administration official. The alleged agent's alleged status was supposedly leaked as part of a smear campaign against her husband, a former US ambassador to Gabon who publicly criticized the administration's use of the now-legendary Nigerien Yellowcake Uranium story as a justification for launching the Iraq campaign.
The details of the whole affair are terribly murky at this point, but it certainly seems worthy of serious investigation. Strangely, though, the press is giving the story very little attention. Kleiman attributes this silence to a lapdog press that depends too much on White House sources for news leaks to burn its bridges by exposing a serious scandal.
I think the real explanation, though not much less comforting, is somewhat less venal: the press has locked itself into a terribly rigid model of politics as partisan left-right combat, and has enormous difficulty covering stories that can't be cast perfectly in terms of this model. Senator Robert Byrd's occasionally intemperate language regarding race, for instance, or the recent dust-up involving homosexual epithets lobbed by a Democratic Representative, barely made waves, because the Democrats are understood to be the party that defends minorities, both racial and sexual. Likewise, the Republicans are the party of national security hawks, and a story that has them exposing CIA agents like a bunch of campus radicals left over from the sixties simply doesn't fit the conventions.
If I had to guess, I'd surmise that somebody at the White House (maybe at the highest reaches, maybe not) stepped over the line, and is now patiently waiting for either (a) the whole thing to blow over or (b) his/her head to roll. Certainly, reporters should be much more attentive to this case--and likely would be, but for its stubborn inability to fit into their preconceived notions of what a story about a Republican administration should look like.
Wednesday, July 30, 2003
Crooked Timber's Kieran Healy argues that Michael Totten's proposal for no-holds-barred warfare against Palestinian terrorism is naively simplistic. He presents two arguments, one moral and one practical. Both are specious.
First the moral case: "the hawks," writes Healy, "never seem to pause to think how they might react if they and their kin were the targets of the kind of policy Totten advocates." If? The terrorists are already targeting them and their kin. The "hawks", on the other hand, are doing nothing of the kind; even the most callous disregard for civilian life, in pursuit of terrorists, is not morally equivalent to the deliberate murder of civilians. And I'm sure even Totten would support exercising maximal caution in protecting civilians--consistent, that is, with destroying the terrorist organizations that hide among them. It is Healy, not Totten, who seems to have trouble understanding what it's like to be an innocent civilian targeted for murder. In fact, he can't even distinguish the experience from that of being a terrorist targeted as a combatant.
Healy's practical argument is no better: he asserts that harsh measures against terrorist groups with a large population of sympathizers merely turn the sympathizers into active supporters. This statement actually says nothing about the effectiveness of dealing harshly with such terrorist organizations--especially compared with the alternative strategy of appeasing them. But it does imply that this method ultimately involves killing so many people ("a river of blood") that it can't possibly be moral. In other words, it's really just the previous specious moral argument, dressed up as a practical one: aggressively combatting a large, popular terrorist organization is morally no better than slaughtering random members of a large, peaceful population.
Taken together, these arguments amount to a kind of pacifism. In effect, they assert that a population--however weak and vulnerable--that is sufficiently enthusiastic about murdering the innocent civilians of another population must be allowed to do so with impunity, while the latter population, if it abhors such slaughter, must meekly accept its own victimization, however capable it is of defending itself. Absolute pacifism has a certain elegantly simplistic appeal, of course, but it's hardly a vantage point from which to deride Totten's hawkishness as hopelessly unsophisticated.
Monday, July 28, 2003
Mark Kleiman and Eric Rescorla are furious about a report that American forces in Iraq captured a fugitive Iraqi general by detaining his wife and daughter, and leaving a note demanding his surrender in return for their release. Kleiman calls it a "crime"; Rescorla calls it "terrorism". Matthew Yglesias agrees, and in addition thinks it terribly unwise, because of the hostility it will engender.
They all seem to be missing a key element of the report, though. In fact, the colonel in command justified the action as "an intelligence operation with detainees," and explained that the fugitive's family "would have been released in due course," regardless. In other words, he asserts that the threatening note was nothing but a bluff, to get the Iraqi general's imagination working overtime.
It's possible, of course, that the colonel was just covering his posterior, and really had, in effect, taken the Iraqi family hostage--or has at least ventured out on a slippery slope that will inevitably end in his (or another commander's) doing so. (After all, such bluffs are only effective until the first time one of them is called, and there will be an inevitable temptation at that point to "up the ante".) But while I understand the concerns of Kleiman et al., I'd personally be much more careful about jumping to conclusions before blithely asserting that a war crime had just taken place.
Saturday, July 26, 2003
Chris Bertram of "Crooked Timber" is "pretty revolted by the suggestion that one day we might synthesize all our food," although he can't exactly say why. After some Orwell-quoting bloviation about humanity's "engagement with the natural world", he writes, "I just wish I could better articulate exactly what it is" that makes him so passionately "want [his] potatoes from the earth and [his] apples from a tree."
Well, I'm always happy to help out. To begin with, the "natural" or "traditional" foods with which we are familiar are typically tastier than their artificial equivalents, because in most cases the only reason for the latter's existence is as an inexpensive ersatz substitute for the former. Unsurprisingly, many people therefore automatically associate "natural" with "high-quality" and "delicious", and "artificial" with "low-quality" and "semi-palatable", ignoring the exceptions. (A good counterexample is ice cream: the best brands may be high in natural ingredients, but there's still nothing the least bit natural about solidifying sweetened milk products in a freezer. Likewise for "old fashioned" seltzer.)
Of course, properties such as "natural", "high-quality" and "delicious" are also associated with "expensive", and we should not discount the social motivations that tempt many people to insist that less-than-authentically-natural foods (or clothes, or furnishings, or anything else) are simply unacceptable. Add to the mix the fact that understanding the distinction between "natural" or "authentic" and "artificial" or "ersatz" can require considerable knowledge (if only of the most current folklore and fashion), and we have an ideal vehicle for snobbery.
Finally, in the modern industrial world, "nature" cannot but retain an element of exotic glamor. The fable of the country mouse and the city mouse nicely illustrates the inevitable allure of unfamiliar environments with obvious temptations and hidden drawbacks. But now that practically everyone's a city mouse, the enormous disadvantages of a "natural" existence--discomfort, isolation, danger, tedium, economic unviability, ceaseless exhausting labor, the ruthlessness of the elements, the perennial toll of sickness, injury and scarcity, and all the rest--are easily forgotten. Likewise, "unprocessed", "unadulterated"--i.e., unpasteurized, uninspected, and unfortified--foods retain something of the unrealistic appeal of wild, untamed nature.
We live in an era in which the production of our food has been almost completely industrialized, for better or worse. How many modern-day Prousts, for instance, could be moved to recall the evocative taste of a cookie adored in childhood that did not come from the supermarket wrapped in a package? Under these circumstances, then, it's entirely understandable that rarer, more expensive, more exotic "natural" foods would acquire a special cachet not bestowed upon their mass-produced counterparts. On the other hand, it would be foolish to exaggerate either the depth or the significance of this particular sentiment.
Tuesday, July 22, 2003
Eugene Volokh and Matthew Yglesias are ridiculing the Secret Service for sending an agent to investigate an editorial cartoon depicting the president being assassinated. "The cartoonist is obviously not trying to threaten the President's life," observes Volokh--quite correctly, I might add. "[I]f the Secret Service is wasting its time on this....that shows pretty bad judgment on the part of its managers." Writes Yglesias: "you have to be extraordinarily unfamiliar with the concept of editorial cartooning to think that something like this could possibly constitute a death threat against the president."
I'm fairly confident, though, that "bad judgment" had nothing to do with the decision to send an agent around to question the cartoonist. The Secret Service has an iron-clad policy of investigating all such depictions--even those that common sense dictates are not serious threats. In this particular case, the cartoon was actually portraying the president in a sympathetic light, as a victim of a metaphorical assassination by political opponents. But the Secret Service launched a routine investigation anyway.
Why does the Service have such a rigid policy? Well, imagine if it were a matter of agent or managerial discretion within the service--as Volokh evidently believes it should be--to decide whether a given depiction of an assassination was plausibly a threat, and hence worth investigating. In no time flat, various political groups would start alleging that the Service's choices of which cases to investigate were politically biased against them (or against the nutbars allied with them). The Service would come under massive political pressure to "lay off" one type of threat after another, until the entire policy was in tatters, and routine investigations of oddball threats were abandoned altogether. Then one day one of those uninvestigated threats would turn out to be real, and accusations would fly about dangerous laxity at the Secret Service.
The real problem, of course, is not the Secret Service's policies regarding potential assassination threats, but rather America's bizarre antinomian culture, under which law enforcement officers are generally characterized as both all-powerful demonic monsters and useless, bumbling idiots. Perhaps if a routine visit from a Secret Service agent were seen as a welcome opportunity to help a hardworking public servant resolve any doubts and close a perfunctory investigation, instead of as a brutal assault by the Keystone Gestapo, the Secret Service would be able to exercise more discretion and direct its resources more efficiently.
And so, too, for that matter, would those Americans who obsess endlessly over grossly overblown allegations of "abuse of police power".
Sunday, July 20, 2003
One of the most interesting strategies I've seen for dealing with North Korea is hidden between the lines of a Washington Post op-ed column by Senator Richard Lugar:
[W]e should authorize the resettlement of some North Korean refugees in this country, and press our allies to do the same. If this sparks a greater flow of North Koreans from their gulag-like country, some would argue, that could help keep pressure on North Korea or even hasten the fall of the Pyongyang regime, much as the flight of East Germans in 1989 helped undermine the Communist system there.

Of course, the number of refugees that the US and its allies would be willing to take would have little effect on the North Korean government. Moreover, by encouraging further flight, such half-measures would only increase China's North Korean refugee problem--and thus encourage China to return to its current policy of repatriating all North Korean refugees found on its soil.
There is one country, however, that has real reasons to accept North Korean refugees in large numbers: South Korea. Apart from cultural, ethnic and even family affinities with the North, South Koreans have a long-term desire to reunify their peninsula, and thus an incentive to adopt policies that will hasten that end. Also, since such an outcome would result in a Southward flood of refugees in any event--and it seems very likely that that flood will occur some day, one way or another--it may actually be in South Korea's interest to absorb as many fleeing refugees as it can today.
If the South were to promise to grant asylum to any North Korean captured in China--thus unburdening China of its Korean refugee problem--then China might well consider changing its repatriation policy. The resulting flow of refugees, in addition to powerfully undermining the Northern tyranny, would be a humanitarian godsend for those who managed to escape.
All that's missing, of course, is the political will in South Korea, where the arrival of millions of terrorized, malnourished, destitute, uneducated Northerners is unfortunately viewed as a terrible threat rather than a sparkling opportunity. Perhaps South Koreans should look to the example of Israel, which has for decades absorbed Jewish refugees from all over the world, in numbers well beyond its "natural" capacity to do so. Sadly, the achievements of the Israeli model are rarely even given credit--let alone emulated--elsewhere.
Saturday, July 19, 2003
Eugene Volokh has posted several times now, blasting the French for attempting to ban the anglicism "e-mail" from their language (at least as used by the government). What he doesn't touch on is the long history of bitter verbal injustice reflected in this callous action.
The American tradition (and to a lesser extent, the British one) is to welcome immigrant words into the language with open arms, assimilating them into the culture and embracing their innovative creativity and their willingness to do work disdained by native terminology. On the Continent, however, full citizenship in the language is zealously guarded, with the result that tired, insular, antiquated European tongues are by now completely ill-equipped for a dynamic, competitive world.
The stopgap solution adopted in those countries is to allow "guest words" from other languages (most often English) into their own, graciously permitting them to do the jobs that native words shun, but denying them equal status with their native-born peers. These foreign words, some of which have been in the language for generations, are often treated as outcasts, and subject to occasional expulsions or other repressive measures whenever the locals' seething xenophobic resentments reach a boiling point. "E-mail" is only the most recent victim of these ugly purges.
The real result of this mistreatment, of course, is to drive foreign words underground, turning them into an unofficial popular argot that is increasingly alienated from its surrounding society, and often quite hostile to it. In some countries, the very fabric of the language is being rent asunder, as the communities where foreign words are numerous form a local patois practically incomprehensible to "pure native" speakers. If Europe's linguistic authorities don't begin to deal with this terrible problem, they may soon see their beloved languages fracture completely into chaotic, warring dialects.
Wednesday, July 16, 2003
A bizarre new recreation in Las Vegas is provoking widespread outrage. An outfit called "Hunting for Bambi" apparently charges each participant up to $10,000 to attempt to shoot a naked woman with a paintball gun as she tries to elude him in the desert near the city. Naturally, women's groups are up in arms. One psychologist suggested that some of the "hunters" may have trouble distinguishing fantasy from reality, and go on to perpetrate further violence against women.
I strongly suspect, however, that what's at work here is not so much misogyny as arrested development. Let's face it--a significant fraction of males of a certain age (perhaps a majority), if confronted with a naked female in the wilderness, would think nothing more fun than attempting to shoot her with a paintball gun.
Then puberty hits, and that attitude changes completely--for most boys, at least.
Monday, July 14, 2003
Volokh Conspirator Tyler Cowen, whom I have criticized before, appears to be some kind of polar opposite to me: no sooner have I defended the music of Barry Manilow than Cowen is comparing it unfavorably with the "minimalist and conceptual art" at the Dia museum. Cowen gives an interesting list of conditions that, in his words, "seem to suffice to establish the merit of an artwork:"
1. More than just a few fans
2. Self-critical fans
3. Artistically well-educated, sophisticated fans
4. Fans who give an account of the art's importance and depth
5. Test of time

Now, I myself consider 1 and 5 to be the prime tests of artistic merit, and while the jury is out on most of Cowen's preferred art as well as mine by standard 5, Manilow easily has the better argument on count 1. Criteria 2 through 4, on the other hand, arouse enormous suspicion in me, because intellectual fashions have been known to cause sudden upsurges in enthusiasm--extremely thoughtful, well-articulated enthusiasm, mind you, from the most authoritative quarters--for bodies of work that subsequently sink into well-deserved obscurity and near-universal contempt.
My favorite example is Woody Allen, whose "serious" films were not so long ago--only a single pseudo-stepdaughter/wife ago, if I'm not mistaken--widely hailed as masterpieces of emotional and intellectual subtlety. Today, I doubt there's a self-important aesthete on the planet who even bothers to watch his stuff any more (apart, that is, from his early comedies, which really were masterpieces).
There are, of course, works of art that have been hailed as masterpieces by the cognoscenti for a century or more, despite evoking either indifference or outright revulsion from the vast majority of the public. (Think of late James Joyce, for instance, or early atonalist music.) I tend to think of those bodies of work as similar to the "cult classics" that manage to establish a small but often fanatically devoted and quite durable following. (H.P. Lovecraft comes to mind, or perhaps Ayn Rand.) They certainly deserve credit for something, but I'm not sure artistic genius is the right attribution.
In any event, the art Cowen describes hasn't even survived long enough to establish itself at that level. I therefore hope I will be forgiven for ignoring the scholars' breathless writeups and waiting at least another couple of intellectual cycles before concluding that minimalist and conceptual art is anything more than its worst critics (including myself) make of it.
Sunday, July 13, 2003
Eugene Volokh is mightily incensed over "one of the most appalling judicial decisions [he's] ever seen." Apparently, the Supreme Court of the state of Nevada has ordered the state legislature to pass a budget that increases taxes, despite a provision in the state constitution that requires a two-thirds majority in both houses before a tax increase is passed. (The court cites a separate constitutional provision which requires education to be funded, although, as Volokh points out, that provision makes no mention of what level of funding is required.) Volokh furiously suggests that perhaps a recall of Nevada's Supreme Court justices is in order. Mickey Kaus agrees, and Glenn "Instapundit" Reynolds is appalled as well.
Now, it's certainly gratifying to see all these folks get so riled up over judicial overreaching. Heck, Volokh even refers to it as "judicial nullification of the people's will", noting that the constitutional clause ordered violated was the result of an amendment enacted by voter initiative in 1996. Still, I can't help wondering why this particular case of judicial arrogance has managed to strike such a chord with Volokh et al., when so many others have invited much more temperate criticism--or none at all.
Could it be the fact that in this case the courts were running roughshod over the state's constitution, and not the state's legislature, that has everyone so exercised? In other words, is it possible that the outrage flows more freely this time because it carries no implied defense of representative democracy?
Friday, July 11, 2003
Matthew Yglesias gives some first-hand descriptions of stereotypical behavior on the part of tourists of various nationalities. While there is some truth to at least some of these stereotypes, the behavior of tourists is also greatly affected by (1) the type of tourists visiting a particular locale, and (2) the purpose of their visit. Hardworking, middle-aged non-cosmopolitan types on vacation to relax at the beach with their families are bound to act differently--and thus to reflect differently on their nationality--from, say, graduate students travelling in order to learn about foreign cultures. The same holds for teenagers or young adults vacationing from school or work--not to mention soldiers on leave.
For example, Canadians have a reputation in Europe for quiet politeness (particularly in comparison with their American cousins). That's probably because while Americans seem to view Europe as something of a theme park, Canadians are more likely to see travel there as an enlightening experience. On the other hand, I'm given to understand that in the Dominican Republic (a popular beach destination for budget-conscious Canadians) we're known as loud, obnoxious drunks.
The German and British reputations in Southern Europe no doubt have similar roots. I also once saw a busload of middle-aged French tourists in Seville, Spain, and from a distance I easily identified them--as American. Even up close, I had to listen to them to remind myself they weren't from the Midwest. I don't happen to know the fashionable destinations for broadminded travellers from these countries, but I would bet that wherever they go, their nationality has a pretty good reputation.
Tuesday, July 08, 2003
An update to the posting below on biological terrorism as a possible more dangerous successor to computer hacking: Ed Felten recounts an interesting conversation he had with "a hotshot mo-bio professor" about just this comparison. "Each of us," he writes, "tried to reassure the other that really large-scale malicious attacks of the type we knew best (cyber- for me, bio- for him) were harder to carry out, and less likely, than commonly believed."
Certainly, it's reassuring that at least one top molecular biologist pooh-poohs the threat of molecular bioterrorism, considering it much more difficult than non-experts like me might expect. And I'm prepared to believe that right now, at least, the professor is right. (That would certainly explain the lack of successful large-scale attacks--despite the ominous example of the anthrax mailings of 2001.)
But then, bringing down the nation's information infrastructure with an Internet worm would have been extremely difficult, too--ten years ago, when very little of it was easily accessible via the Internet. Likewise, my fears center not on bioterrorists using today's techniques, but rather on the terrorists of twenty years from now, when genetic engineering technology may well have matured to the point where Henry C. Kelly's doomsday scenario involving "an inexperienced graduate student with a few thousand dollars worth of equipment" is plausible.
I know just how well-prepared computer scientists were a decade ago for the rise of Internet-related security threats--that is, not at all. And judging by the nonchalance of Felten's interlocutor, biotechnologists are in roughly the same state of denial. This time, though, it's our lives, not just our data, that are in danger.
Having declared the Iranian theocracy dead, bloggers have now moved on to premature eulogies for the BBC. "The battle to reform the BBC is now in full swing," writes Andrew Sullivan. "The BBC is in trouble," notes Glenn "Instapundit" Reynolds, pointing to a London Times article that describes the network as "dangerously exposed".
Well, I'm hardly an expert on the BBC's political fortunes. But I'm a bit more familiar with the history of Canada's CBC, which has persevered through far, far greater ignominy than its British cousin. Its news and public affairs programming (heck, even its comedies and dramas) have long made the BBC's look politically neutral by comparison. It hasn't produced any content worth watching in decades, and consequently Canadians never, ever watch it. They never have to, because in addition to private Canadian broadcasters, the overwhelming majority of Canadians have ready access to the huge smorgasbord of American broadcasting, which does just about everything the CBC does, and much better. In fact, if it weren't for its excellent hockey broadcasts, most Canadians would simply have no clue what on earth the CBC portion of their tax bill is buying them.
Yet the CBC survives, and nobody believes it will disappear anytime soon. It has its own soapbox with which to defend itself, after all, and its passionate supporters on the outside tend also to be heavily microphone-equipped. Moreover, voters are hardly inclined to get all worked up about political bias on a network that nobody they know ever pays the slightest attention to.
If the BBC's opponents are smart, they will immediately stop griping about its alleged political transgressions, and make a case that the public can appreciate. What the anti-BBC forces need is a spate of artfully leaked stories about shocking profligacy in the BBC's corporate offices--the kind of rank irresponsibility in the handling of the license-holder's funds that calls for major cutbacks in license fees and a thorough, top-to-bottom audit of the network's finances.
The BBC being an old, bloated bureaucracy, it should be easy to find enough dirt to justify a massive budget squeeze, which will in turn reduce the number of jobs and contracts it can hand out, the quality and quantity of its output, and its overall prestige relative to private sector media organs. The resulting erosion in the effectiveness of its propaganda would be far greater than any amount of frustrated ranting about bias can ever hope to achieve.
One wonders--hasn't anyone in Tony Blair's office ever watched "Yes, Minister"?
Sunday, July 06, 2003
New Volokh Conspirator Randy Barnett ruminates on the philosophical implications of his having greatly enjoyed a concert by a Pink Floyd "cover band". Was it worse than the original recordings, because, though close, it wasn't quite an exact duplicate? Better, because it was a live concert, rather than a recording? Was it worse than a real Pink Floyd concert, because the musicians weren't the originals? Better, because the musicians were the right age to simulate the originals at the time of their greatest recordings?
The answer, of course, is, "it depends on what you're looking for in your aesthetic experience". Music, like all art, can be appreciated on multiple levels. Two of the most popular are the sensual/"pure aesthetic" and the social/symbolic. The first kind involves simply enjoying the pleasing sensory effects, while the second involves deriving satisfaction from the social context and symbolic content of the work. (The same goes for other pleasures, too--food and sex, for instance.)
For various reasons, popular art forms tend to be appreciated predominantly at the social/symbolic level. Pop music fans, for example, are attracted to an artist's style, image or "message" as much as to the actual sound of individual songs. Pop music concerts, certainly, are more about spectacle and crowd behavior than about the sound of the music (which is typically laughably inferior to what can be produced by a CD on a high-quality home sound system).
In the case of a "tribute" band like the one Barnett describes, the audience is obviously attending in order to experience a particular atmospheric effect--that of being at a concert by the original group. If it achieved that effect, then Barnett should be satisfied. If it was really the music itself that mattered to him, then he should have stayed home and listened to the CD.
Personally, I'm much more of a "pure" music aesthete; I've even been known to express appreciation for, say, the superb craftsmanship of a Barry Manilow song, ignoring the devastating social consequences of doing so. [*] But I'm in the minority, and I don't (well, don't necessarily) disparage those who look for something else in their music, such as "authenticity", or "an experience", or the faithful recreation of a long-passed cultural moment.
[*] I hasten to add that Manilow himself is a terrible singer, and didn't even write most of his hits. Still, some of them are true gems of pop schmaltz, if you're willing to consider them within their musical genre, rather than their social context.
Friday, July 04, 2003
It's been a long time since I've read anything as frightening as FAS president Henry C. Kelly's op-ed in the New York Times. "Within a few years," he writes, "it may be possible for an inexperienced graduate student with a few thousand dollars worth of equipment to download the gene structure of smallpox, insert sequences known to increase infectiousness or lethality, and produce enough material to threaten millions of people."
Now, I don't know much about molecular biology, and I don't know if the analogy is accurate, but this scenario sounds an awful lot like one I'm familiar with: computer hacking. In both cases, a system full of vulnerabilities is subject to scrutiny by thousands of imaginative (or simply persistent) attackers. But the worst computer hackers can do is destroy data, and perhaps disrupt communications and other infrastructure. Bio-hackers can potentially kill millions.
There are some differences, of course. A biological superbug would be much more dangerous to handle, and harder to test during development, than a computer attack. (An anthrax-like pathogen that doesn't spread from person to person would be about as easy to target as a computer hack, though.) The human immune system is generally more robust than computer systems, having evolved to deal with a wide variety of pathogens over millions of years. (Of course, it has never had to deal with human ingenuity before.)
The fundamental similarity, though, is that the technology massively favors the attacker, who can choose the time, place and method of attack, over the defender, who must try to react to protect entire populations before they're infected. Frankly, I don't like our odds.
Thursday, July 03, 2003
In the Washington Post, David Ignatius heralds the "slow transformation of American innocence", as its "optimistic" culture gives way to a "sense of irony" in the wake of the ugly, messy reality of Iraq. It's hard to see, though, how a nation that has survived an astoundingly bloody civil war, two world wars, two costly far eastern interventions, and forty years of threatened nuclear annihilation would suddenly lose its sunny outlook because of a few dozen military casualties a month in the aftermath of a remarkably short, easy, painless victory. Undaunted, Ignatius argues that "Americans will have to develop a tragic sensibility to survive," emulating Iraq, "where it is always best to assume the worst."
Now, it is true that America's cheeriness is to some extent a product of America's good fortune--which could change at any time (although the Iraqi situation is hardly an indication that that's about to happen). But the converse is also at least partly true. It is surely no accident that Iraqis, who have for decades cynically assumed that they will wake up each morning cringing under the iron rule of a capriciously brutal dictator, have in fact done so, while Americans, who have expected--perhaps sometimes unrealistically--honest, upright, responsive leaders, come much closer to that goal. Culture consists primarily of a set of beliefs shared by consensus within a large population, and to the extent that cultures can be judged by their results, one would be foolish not to choose American naivete--however jarring it can sometimes be--over Iraqi cynicism.
Tuesday, July 01, 2003
Neophyte Volokh Conspirator Tyler Cowen's one-man campaign to lower the intellectual level of his new blog continues apace. His latest sophomoric argument, borrowed from some idol of his who apparently has a whole slew of such inanities, is that we're most likely all mere simulacra of human beings. The reasoning: suppose there's a non-negligible chance that humans will develop self-aware simulated humans (using, say, computer models) at some point. If we do, then we most likely will construct a huge number of them--until they vastly outnumber real, living humans. At that point, the odds are that any self-aware person (including, say, us) is actually a simulation, rather than a real person.
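The arithmetic behind the argument is simple enough to sketch in a few lines. The population figures below are invented purely for illustration--nothing in the argument pins down the actual numbers, which is rather the point:

```python
# Toy sketch of the simulation argument's arithmetic. The inputs are
# illustrative assumptions, not claims anyone has actually defended.
def odds_of_being_simulated(real_people: int, sims_per_real: float) -> float:
    """Fraction of all self-aware observers who are simulations,
    assuming `sims_per_real` simulated people are run per real one."""
    simulated = real_people * sims_per_real
    return simulated / (real_people + simulated)

# If simulations are cheap and plentiful, the odds approach certainty...
print(odds_of_being_simulated(10**10, 1000))  # ~0.999
# ...but if simulating a person is expensive, they stay modest.
print(odds_of_being_simulated(10**10, 0.1))   # ~0.0909
```

Note that the conclusion turns entirely on `sims_per_real`, which is exactly the quantity the cost objection below calls into question.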
Of course, if we are all simulations, then there's nothing that prevents us, in the future, from creating innumerable "simulated simulations" of ourselves, given enough simulation time. We would thus prove that we're probably really just simulated simulations, which in turn can create simulations of themselves--and so on, ad infinitum. Presumably this argument has to stop somewhere, but there doesn't seem to be anything capable even of slowing it down, apart from the caprice of the simulators, should they decide to terminate their simulations out of boredom. (And if they're simulating my life, then their tolerance for boredom is clearly near-infinite.)
The problem is that by assuming that there will be lots and lots of simulated people, we are implicitly assuming that simulations of humans will be fairly cheap to generate and run. But simulating a human interacting with an entire world is bound to be an expensive proposition, if the simulation is to capture the world in anything approaching the detail and complexity of the original. On the other hand, the "game observer/directors" running the simulations may have no need for such detailed simulations--their particular motives (and one can think of all sorts of them) may well be satisfied by very crude simulations (say, of a large population of people with simplified intellects and emotional ranges, or full-fledged individual humans in a very simplified environment).
On the other hand, if the simulated world is a grossly simplified version of the real one, then it is foolish to try to predict anything about the latter's long-term evolution (let alone its ultimate technological trajectory) based on the simulation. In fact, it's likely impossible even to make reliable inferences of any kind about the simulators from inside the simulation. Perhaps they're merely conducting fanciful "what if?" simulations ("imagine that we had only two arms and two legs, instead of the usual complement of four of each...."), or running abstract, idealized experiments ("consider a randomly constructed species with adult cognitive capacity equivalent to a normal Xrnapthian toddler..."). In these cases, we can't even begin to speculate fruitfully on the likelihood of the simulators producing simulated worlds like ours, or simulated inhabitants like us. If we are but simulations, then the minds of the G. O/D's are forever hidden from our understanding.
I must say, though, that it feels somewhat odd pondering this kind of argument these days. After all, I don't think I've set foot in a college dorm room for over a decade and a half.