Paul Krugman's undignified descent from distinguished economist to respectable economics-popularizer to partisan hack is already well-documented (by me and others), but even so, his most recent column must be read to be believed. In it, he states that (I'm not making this up) "[f]or most of the last 50 years, public policy took it for granted that media bias was a potential problem"; that "[t]he answer was a combination of regulation and informal guidelines"; and that because "much of that system has been dismantled", today "we have a situation rife with conflicts of interest", in which a "handful of organizations that supply most people with their news have major commercial interests that inevitably tempt them to slant their coverage, and more generally to be deferential to the ruling party".
Memo to Krugman: If you're going to accuse "major news outlets" of being "inevitably....deferential to the ruling party", it would help (1) not to do so in prompt, slavish imitation of recent remarks by Democratic party leaders Tom Daschle and Al Gore; (2) not to cite as your "most important example" a network (Fox News) whose nightly viewership is less than the daily circulation of the newspaper that carries your own twice-weekly, virulently anti-Republican column (to say nothing of all the other newspapers that publish your screeds as New York Times News Service features); and (3) not to propose government regulation as a solution at a time when the ruling party you so passionately rail against has just won complete control of the elected branches of the federal government (and hence, presumably, of any regulatory process that might be established).
Just a suggestion.....
Friday, November 29, 2002
Wednesday, November 27, 2002
Pharmaceuticals heiress Ruth Lilly has donated $100 million to Poetry Magazine, but Meghan O'Rourke, in Slate, and Eric Gibson, in the Wall Street Journal, both express their doubts about the gift's potential to rejuvenate the muse, described by poet Dana Gioia (whom Gibson quotes) as "the specialized occupation of a relatively small and isolated group". It should come as no surprise, though, that both articles, in bemoaning the obscurity and unpopularity of poetry, somehow failed to mention the one word that would have given the lie to their effete lugubriousness. That word, of course, is "rap"; and compared to its multibillion-dollar market, Ms. Lilly's paltry nine-figure bequest is a tiny irrelevancy in the world of poetry.
Now, I don't mean to claim that the major hiphop artists of our day are all creating poetic masterpieces. But then, neither were the greats of English-language poetry all living in a pristine world, blissfully free of the plague of mediocre doggerel. On the contrary, their work stood out precisely because it existed in the context of a living, even popular art form whose typical examples were, in retrospect, unmemorable or worse. Many of the giants achieved acclaim in their own day; some toiled in obscurity, only to be appreciated much later. But until Matthew Arnold popularized the idea of art as something that an audience needed to be taught to appreciate--setting the stage for the devastating schism between popular and "high" art that has left forms like poetry in such a sorry state today--none of the greats would have shied away from comparisons of their work with that of their most fashionable, most popular (and, they would confidently have asserted, self-evidently inferior) contemporaries.
But today's poets don't want to face such comparisons; rather, they consider themselves to be doing something entirely different from their pop counterparts--even as they reject any restrictions or boundaries on the form or content of their own work. They hide behind their illustrious claimed predecessors because they are naked. They tartly note the cavernous gap in artistic mastery that separates P. Shelley from P. Diddy--glossing over their own conspicuous inability to bridge that gap themselves. They may not be as popular as rap performers, they sniff, but, more importantly, they are appreciated by--who? The 12,000-odd readers of Poetry Magazine?
Perhaps what poetry needs is an equivalent to jazz music: a popular form that intellectuals can respect, that shows up classical music's supposed descendants, the practitioners of "modern serious music", for the masturbatory noisemakers they are, and that isn't afraid to interact with, influence and be influenced by a mass audience and its cacophony of overlapping mass tastes. Rap may be empty drivel, after all, but the next Tennyson is far more likely to arise from its crowd-pleasing dynamism than from a few insular scribblers publishing obscurantist verses for tiny audiences and pathetically imagining themselves to be writing for the ages.
Monday, November 25, 2002
Federal Reserve Governor Ben S. Bernanke made an interesting speech last week, asserting the Fed's imperative to fight deflation by any means necessary. He also pointed out what is generally considered a basic economic fact:
[T]he U.S. government has a technology, called a printing press (or, today, its electronic equivalent), that allows it to produce as many U.S. dollars as it wishes at essentially no cost. By increasing the number of U.S. dollars in circulation, or even by credibly threatening to do so, the U.S. government can also reduce the value of a dollar in terms of goods and services, which is equivalent to raising the prices in dollars of those goods and services. We conclude that, under a paper-money system, a determined government can always generate higher spending and hence positive inflation.

Now, this would appear to be simple, inarguable common economic sense. But there are a couple of seeming exceptions that might give one pause. First there's Japan: after years of budget-busting Keynesian fiscal stimulus, the country is still mired in a deflationary slump, and the government--printing press and all--appears helpless to do anything about it. Many Japan commentators claim that it's bureaucratic incompetence, not lack of available means, that's hamstringing the government and preventing the necessary economic repairs, but the measures these pundits typically advocate--such as reforming the banking system, and allowing large but hopelessly debt-ridden concerns and banks with humongous portfolios of nominally-performing-but-effectively-deadbeat loans to go under--sound like they have nothing to do with simply running the presses and printing away deflation. Why, then, aren't they (apart from Paul Krugman, who has been pushing the inflation solution to Japan's ills for years now) all just advocating the obvious?
Well, perhaps Japan is an exceptional case--its savings rate is astronomical, the economy lacks transparency, the culture is inscrutable, etc. etc. etc. But there's another puzzling exception that hasn't been mentioned much in this context. It's a bit like Sherlock Holmes' famous "incident of the dog in the night" that didn't bark, and hence passed unnoticed despite its oddity. I refer to the case of the US over the last five years.
From October 1997 to October 2002, broad money supply (M2) in the US rose more than 44 percent; the broadest measure, M3, rose 55 percent. These are rates not seen since the early eighties, when inflation was much higher. The printing presses seem to have been running full tilt, but inflation has still been low enough to prompt worries of impending deflation. Where has all the money gone?
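A quick sanity check on the scale of those figures: here is a minimal back-of-the-envelope sketch in Python, using nothing but the cumulative totals just cited, to back out the implied compound annual growth rates.

def annualized(cumulative_growth: float, years: float) -> float:
    """Convert a cumulative growth fraction into a compound annual rate."""
    return (1.0 + cumulative_growth) ** (1.0 / years) - 1.0

# Cumulative M2 and M3 growth cited above, October 1997 to October 2002.
for label, total in [("M2", 0.44), ("M3", 0.55)]:
    print(f"{label}: about {annualized(total, 5.0):.1%} per year")

That works out to roughly 7.6 percent a year for M2 and 9.2 percent a year for M3--brisk indeed.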
It turns out that more money does not always create more spending. As Stephen Roach of Morgan Stanley points out, "[newly generated] money must go somewhere, but initially it might be channeled into balance sheet repair, paying down debt, or a restoration of saving before it ends up in the real economy or the price structure. There are no guarantees of instant policy traction near a zero rate of inflation." This is particularly true of post-bubble economies like the current American one, where indebtedness is high, savings are low (and hence have much room to rise), and an investment-minded culture pumped full of cash may be perfectly happy to pour the surplus funds into yet another asset bubble (some suggest that urban real estate may already be playing that role, now that high-tech stocks have lost their luster). It would be comforting to think that Japan's malady is sui generis, and can be easily cured should it occur elsewhere. I just wish I were convinced....
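One way to make Roach's point concrete is the textbook quantity-of-money identity, M x V = P x Q--which neither he nor Bernanke invokes by name, so take this as an illustration rather than a quotation. If the velocity of money falls as the money stock rises (because the new dollars sit in savings, debt repayment or asset markets rather than chasing goods), nominal spending--and hence measured inflation--needn't budge. A toy calculation in Python, with a purely hypothetical offsetting drop in velocity:

m_growth = 0.44                            # cumulative M2 growth cited above
v_change = 1.0 / (1.0 + m_growth) - 1.0    # hypothetical: velocity falls just enough to offset it

# Quantity identity: nominal spending P*Q equals M*V, so its growth compounds the two factors.
spending_change = (1.0 + m_growth) * (1.0 + v_change) - 1.0
print(f"Money stock: {m_growth:+.0%}, velocity: {v_change:+.1%}, nominal spending: {spending_change:+.1%}")

The point is not that velocity actually fell by exactly that much, only that the printing press translates into prices merely once the new money starts chasing goods and services.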
Thursday, November 21, 2002
The academic blogosphere--including Eugene Volokh, Jacob Levy, and Glenn "Instapundit" Reynolds--is up in arms over threats to freedom of speech on college campuses. "[T]he regulation of merely offensive speech in classroom settings is an utterly noxious idea," writes Levy, and the rest resoundingly agree. I admit to being a trifle confused; my understanding was that the whole point of universities is that a student whose speech--in classroom presentations, on exam papers, in course assignments--is not even offensive but merely insufficiently scholarly can face penalties as severe as expulsion. Have things changed that much since I went to school?
Actually, not that much; on-campus ranting about free speech was popular back in my day, as well (although in Canada, where I studied, the rhetoric was always much more muted and less indignant). Now, I recognize the important role that free speech plays in a healthy democracy, as well as the dangers inherent in limiting free expression in society as a whole. But universities are not society as a whole, nor are they even democracies. They are institutions (ostensibly) dedicated to education and research, whose members voluntarily forgo all sorts of freedoms (such as the freedom to neglect one's education and the freedom to do shoddy research) for the sake of furthering the academic community's (and hopefully their own, similarly aligned) goals. And it would seem obvious that, say, broadly offensive speech (or, for that matter, false or even illucid speech) would often work to the detriment of these goals, by undermining reasoned, dispassionate debate.
Of course, the common response of free speech defenders to such observations is that university faculties and administrations, if given the power, would use speech restrictions to stifle legitimate political debate on campus. And it's certainly possible that some, or even many, might do so. Then again, those same administrations can easily use their power to grant or deny tenure, to fund departments, to admit or reject and graduate or fail students, and so on to exactly the same nefarious ends. And these instruments are scarcely less potent, in ruthless hands, than the right to make rules about, say, offensive language. It's hard to argue that, say, banning racial epithets on a campus has anything like the chilling effect of, say, requiring students to pass a course in which grades are given for writings and presentations based on their political content.
In fact, too much free expression has sometimes threatened the academic health of universities as seriously as too little of it. Thuggish behavior on campus--shouting down of speakers, destruction of leaflets or newspapers, even physically threatening behavior--often masquerades as "protest", with its perpetrators demanding absolute protection from punishment in the name of "free speech". The endless chanting of the free-speech mantra is thus a pitifully ineffective substitute for vigorous action to protect the scholarly collegiality of the modern academic environment.
Sadly, few academic leaders--let alone students or other citizens--take notions like "the scholarly collegiality of the modern academic environment" the slightest bit seriously these days. As I have written before, the modern liberal arts university is an institution adrift, bereft of serious purpose, and thus at the mercy of interest groups keen to hijack it to advance their own goals. In the hands of these groups, the "free speech" slogan is just another rhetorical bludgeon with which to pummel the university into submission; in the absence of serious defenders, or even a serious alternative vision, the university is helpless to defend itself.
In 1993, when University of Pennsylvania student Eden Jacobowitz was punished for shouting an insult (thought by some to be racist) at a group of African-American women who were celebrating loudly outside his dorm window, disturbing his (and his dormmates') studies, he became an instant martyr to the cause of free speech. Some decried him as a bigot; most deplored UPenn's persecution of him. Nowhere was the slightest attention paid to the real lesson of the story: that the University of Pennsylvania simply did not care about whether some students' raucous behavior might be disrupting their fellow students' efforts to study.
Then again, by 1993 the idea that a university might think to value studiousness over partying had long been relegated to academia's distant past. After all, such a policy might interfere with "free speech".
Tuesday, November 19, 2002
Andrew Sullivan is baffled that supposedly tough soldiers seem so skittish about the idea of accepting the presence of openly gay men in their midst. "Is it because they're afraid of being raped?", he asks. "C'mon. Assuming all gay men - or even any - are potential rapists is completely loopy. (And the same people who make this bizarre argument would scoff at a woman who screamed rape if a man looked at her in a sexually interested way.)"
Oddly enough, Sullivan doesn't evince the slightest mystification over the elaborate lengths to which the military goes to protect women--tough, hardened military women, mind you--from invasions of their privacy by men. Why aren't military showers, bunks and latrines co-ed? And why all the draconian rules against "fraternization" and so on? Are these women soldiers afraid of being raped by their well-disciplined colleagues? Are they afraid of being looked at by men with lust in their hearts?
Well, yes, actually--and understandably so. One doesn't have to believe that all straight men are rapists, or that the male sexual gaze is inherently brutalizing, to understand why women (in this culture, at least) feel unsafe bathing naked around male soldiers (or groups of them). It doesn't matter if most of the time, nothing untoward happens; it only takes one major incident--or a long-enough sequence of small, subtle, incremental steps--for all assumptions of safety to break down completely. (If you're a weakling to be bothered by glances, after all, then what about playful pats on the shoulder? Or elsewhere? Where does the line get drawn? How? And by whom?)
The instinctive anticipation of this threat of sudden breach or gradual erosion of personal safety is likely the source of that general feeling of discomfort that causes women to want to guard their privacy from men when forced into close quarters with them. It shouldn't be surprising, then, that men would want to take the same precautions with respect to gay men; after all, gay men may not be substantially worse than straight men in this regard, but there's no reason to believe they're any better (and, as straight men themselves are well aware, that's plenty bad enough, in the worst cases).
Of course, military training is designed to break down instinctive anticipations and general feelings of discomfort and replace them with rigorous discipline; and if it turned out one day to be militarily necessary to drill soldiers to get over their discomfort around gay comrades the way they get over, say, terror of enemy fire, then the army would no doubt do what had to be done. But each such psychic hardship imposed on soldiers exacts a toll, and for the military to wish to avoid an avoidable one, so as to be able to concentrate on the unavoidable ones, is hardly a demonstration of bigotry or cowardice. Rather, it demonstrates a recognition of, and respect for, the limits and costs of discipline, and an unwillingness to bury or deny those costs in order to indulge various varieties of political dogma.
In Ha'aretz, Danny Rubinstein argues that Palestinians would actually prefer a right-wing victory in the upcoming elections, because Sharon "did not succeed in reducing the violence or stopping the terrorist attacks", and his continued rule "is the only way Israelis will learn how powerless the right really is - and may in turn germinate the seeds of a just settlement." He may well be correct; the problem arises when one considers just how the Palestinians define a "just settlement". After all, if the definition looked anything like the permanent settlement Ehud Barak offered--or even the unilateral withdrawal being proposed by Labor leadership candidate Amram Mitzna--then surely the Palestinians would be rooting for a Mitzna victory. If Rubinstein is right, then, the Palestinian notion of a "just settlement" must be a much more far-reaching capitulation than even Labor doves are willing to contemplate--that is, something that most Israelis (understandably) consider tantamount to acquiescing in Israel's complete destruction.
I asserted a few months ago that according to "the solid majority view" among Israelis, they are already "the undeclared winning side in the conflict" of the last two years with the Palestinians. But there is a subtle assumption buried in that view: that Israel can continue to impose and even tighten its crackdown on the occupied territories indefinitely, until its residents eventually stop seeing their suffering as a worthwhile price to pay for maintaining their low-level campaign of terrorism against Israel. This assumption may be correct; but it's also possible that the current willingness of (according to polls) a majority of Palestinians to endure hardship of the worst sort, just for the sake of persevering in their efforts to kill as many Jews as possible, will continue for years to come. In that case, Israel faces a long period of walking on an extremely slippery tightrope between, on the one hand, indulging the temptation to resort to extreme cruelty in an attempt to hasten the moment of Palestinian abandonment of terrorism, and on the other, indulging the temptation to forget--as so many Israelis did from 1993 to 2000--that conciliatory concessions to a polity that enthusiastically embraces mass murder are ultimately suicidal.
Sunday, November 17, 2002
NEW YORK (ICBW) -- In the wake of the spectacular opening-weekend success of the latest Harry Potter film, "Harry Potter and the Chamber of Secrets", authorities are bracing for the likely consequence: a spate of children injuring themselves while imitating Potter's magic feats. "We want to warn the public that magic is a dangerous business," said Albus Dumbledore, headmaster of Hogwarts School of Witchcraft and Wizardry, and a technical consultant to the filmmakers. "Untrained muggle children shouldn't even attempt to dabble in it."
After the release of the first film in the series, "Harry Potter and the Sorcerer's Stone", numerous young viewers sustained injuries uttering backfired spells and playing quidditch with dangerously underpowered homemade brooms. This time, there have already been scattered reports of flying-car accidents and careless petrifications. "The magic stunts performed in the film all involved qualified Hogwarts-trained professionals," explained Dumbledore. "But many youngsters see a group of child actors appearing to use powerful spells and potions, and figure, 'hey, I can do that.'"
In preparation for the film's opening, hospitals throughout the US have stocked up on mandrake root and phoenix tears, and Dumbledore said his staff will be available around the clock to handle emergencies. "But the best precaution," he reminds viewers, "is to stick to non-magical pursuits. It's funny, really--if our own students had a choice, most of them would neglect their magic completely and spend all day playing video games and chattering on their blasted cellphones."
Thursday, November 14, 2002
Mark Kleiman writes about the harrowing story of thimerosal, a mercury-based vaccine additive that some suspect is responsible for the epidemic of autism that appears to have broken out in California in the 1990s. Republican Congressman Dick Armey has slipped a provision into the new Homeland Security bill that, according to the New York Times, "was apparently intended to protect Eli Lilly, the pharmaceutical giant, from lawsuits over thimerosal". All very shocking--until, that is, one checks out the CDC's position on thimerosal....
There is no evidence of harm caused by the minute doses of thimerosal in vaccines, except for minor effects like swelling and redness at the injection site due to sensitivity to thimerosal.

Now, I hold no brief for Dick Armey, and I don't care at all for special-favor clauses being sneaked into important legislation. But in the absence of proper tort reform (and in the presence of widespread hysteria about technology--including lifesaving technologies like vaccines), I would guess that it's more likely that this particular political move will end up saving lives (by impeding the onslaught of tort lawyers and junk-science scaremongers on the practice of universal vaccination) than that it will actually harm anyone.
Of course, I'm no toxicology expert, and if anybody can point me to actual, substantial evidence implicating thimerosal in vaccines as a cause of serious health problems, I'd be interested to hear about it. Perhaps, though, the CDC ought to be informed first.
Tuesday, November 12, 2002
The campaign to get universities to divest themselves of Israeli investments is heating up; both supporters and opponents are comparing it to the South Africa divestment campaign of the 1980's. And both sides are more right than they realize.
Of course, Israel is nothing like South Africa was. It's a full democracy with a universal franchise, not at all like the apartheid regime. It has recently engaged in a multi-year process of trying to set up the territories it occupies (as a result of a war provoked by legitimate casus belli) as an independent state, being stymied only by the refusal of the prospective government of that state to abandon terrorism against Israelis. Its own citizens are neither racially segregated nor otherwise politically oppressed, and have the full range of democratic freedoms, including speech and religion.
But then, South Africa wasn't a particularly obvious choice of target, either. It was hardly the worst human rights abuser of the era, even on its own continent. Opponents of the boycott routinely pointed out that South African Blacks were better off than they would have been in just about any other country in Africa, and the boycott itself caused no small amount of suffering among them. If one were to choose a political evil to target in the 1980's based on moral and humanitarian considerations, Apartheid would have been a legitimate but relatively minor choice, paling by comparison with literally dozens of others.
But that's the dirty little secret of politically motivated boycotts: they are not primarily chosen on the strength of their justifications or the urgency of their goals. Rather, their adherents participate in the hope of making a political point in some other, entirely separate context. The South African boycott, in truth, was about many things--race relations in the US, Cold War geopolitics in the Third World, and anti-corporate populism, to name three--but the actual conditions of non-White South Africans were at best peripheral. Likewise, today's university divestment campaigns have many motivations--"anti-globalist" leftism, anti-Americanism, even, on the fringes, some anti-Semitism--but sincere concern for the plight of the Palestinians (for the vast majority of whom the Oslo process has been an unmitigated disaster that a boycott of Israel would likely only further exacerbate) can't be very high on the list. It is fortunate that some major academic leaders are seeing through the sophistries and rejecting the divestment movement's meretricious moral case.
Monday, November 11, 2002
The Canadian government's occasionally impolitic positions with respect to the war on terrorism, to which I've alluded previously, and which are the subject of a piece by Jonah Goldberg in the National Review, may be somewhat puzzling to Americans. I will try to explain (without excusing) them; the explanation may also provide useful insight into some other Western countries' strangely unsupportive attitude towards American efforts against terrorism.
The first thing that Americans should understand about Canadian politics is that Canadians are, by and large, a politically uncommitted bunch. Only a tiny fraction of the population belongs to a political party, and most of the rest are happy to vote for whichever party seems to be addressing the pocketbook issues of the day (or to be winning handily enough to be worth currying favor with, in the hopes of receiving a greater share of federal pork-barrel spending after the election). To the extent that there is any mass partisanship in federal politics, it is largely a matter of inter-regional conflict, with parties increasingly representing their regional power bases. Canadian foreign policy is simply not on the political radar screen, as Canadian voters understand perfectly well their country's utter insignificance in the geopolitical arena.
As a result, Canadian governments target their foreign policy largely at the small domestic constituency that actually cares about it. Naturally, this group disproportionately inhabits the academic and media worlds, where it clings, like its American and European counterparts, to a familiar breed of woolly-minded leftish anti-Americanism with the dogmatic uniformity typical of small, concentrated intellectual groups. Canadian journalists and academics are also somewhat self-selected for anti-Americanism, since the most successful among them usually have the option of enhancing their prestige and paychecks south of the border--an option many of them exercise, unless they are strongly disinclined to do so--and the less successful thus have ample cause for "sour grapes" resentment of an American cultural and intellectual pre-eminence that excludes them.
There is also a strain of anti-Americanism that runs through most segments of Canadian society, and that has little to justify it beyond common "us vs. them" home-team-rooting. It's not particularly intense or virulent, though, and it's counterbalanced by Canadians' general sense of neighborly good feeling towards folks south of the border. (A large fraction, after all, have friends or relatives in the US, visit often, and are deeply immersed in popular culture. Pernicious stereotypes about American national characteristics are hard to sustain under those conditions; one has to live among Americans for years, as I have, to develop them.) But on issues that don't really matter (and let's face it: what Canadian politicians have to say about world affairs almost never really matters), playing to anti-American peevishness rarely causes a politician lasting damage.
On matters of substance--i.e., action--though, I believe that a solid majority of Canadians invariably stand firmly with their American allies. They helped house stranded American travelers on September 11th, when American flights were grounded; their soldiers joined the US in the Afghanistan campaign; and they continue to cooperate with their neighbors on a variety of continent-wide security matters. The longest undefended border in the world will no doubt remain undefended--and friendly--for a long time to come.
Sunday, November 10, 2002
Thomas Friedman's at it again. His latest column says a lot of really silly things--such as (I'm not making this up) that the "Bush hard-liners" who hope to invade Iraq and topple Saddam Hussein don't "really want to invest in making the world a different place, or....have any imagination or inspiration to do so". (Wiser souls, he explains, appreciate the far greater world-changing power of--I'm not kidding--"diplomacy".) But his introductory paragraph repeats a canard whose absurdity will be apparent, sadly, to all too few readers. He cites, with approval, a "senior European diplomat" who complained that the Bush administration is failing to tell Israel that it "needs to find a secure way to get out of the settlements."
Now, it may well be that an Israeli-Palestinian peace agreement will one day be signed, one clause of which involves the evacuation of most or all of the Jews living in the West Bank and Gaza. (I'm skeptical--it seems unlikely that Palestinians would be willing to live at peace with a Jewish state a few miles away, but not with a Jewish neighborhood on the next hill--but I suppose it's still possible.) However, it doesn't make a lot of sense to start talking to the Israelis about dismantling settlements in the occupied territories when they're currently militarily occupying almost all the West Bank's major cities. As for the Palestinians, they conspicuously do not refer to the violence that first erupted in September 2000 as the "Settlements Intifada", were not responding at the time--even as a pretext--to a visit by Ariel Sharon to a settlement, never refer to removal of the settlements as their primary goal, and do not discriminate between civilians living in Israel proper and those in the territories when executing their terrorist attacks. Nor is there any indication that the failure of the Camp David and Taba negotiations hinged in any significant way on the settlements--an issue on which the Israelis were in fact quite flexible, to no avail.
The real importance of the settlements, though, lies in the role they play in Friedman's worldview and that of his European diplomatic friends. For them, it is crucial that they find something that Israel must be cajoled into conceding--otherwise, their negotiations-based strategy is self-evidently doomed to fecklessness, given the Palestinians' refusal to bow even to harsh Israeli military pressure, let alone to mere diplomatic pestering. But what Israeli concession can they possibly portray as a key goal for the diplomats? Military restraint is a non-starter, from Israel's point of view, since it's been repeatedly and amply proven to be of no use whatsoever in winning Palestinian reciprocity. Likewise, most of the generous long-term offers spurned at Camp David, such as full statehood and compromise on Jerusalem, have lost all their plausibility as bargaining chips in the eyes of Israelis.
Settlement-dismantling, on the other hand, is not a completely quixotic goal; it still retains a modest constituency within the Israeli body politic, mostly for various internal political reasons. Hence, if Friedman et al. can (mis-) represent settlement evacuation as the potential breakthrough step in a process of mutual compromise, then--voila!-- they can claim a vital role for diplomacy in resolving the conflict. It's a slender reed, to be sure; but it's the only one available, and without it, they would literally have no justification, however feeble, for trying to insinuate a diplomatic component into Israel's muscular (and comparatively far more effective) response to terrorism.
Monday, November 04, 2002
A few months ago, Slate somehow managed to cajole the unlucky Virginia Heffernan into subjecting herself to an episode of HBO's "Real Sex" television series and then reporting her impressions to readers. Her staggering conclusion: the program's "soft sociology provide[s] an excuse to look at soft pornography". The article must have drawn a high hit count, though, because now they've sent Emily Nussbaum to check out New York's Museum of Sex. While Ms. Nussbaum is actually fairly upbeat, describing one exhibit as "an impressive combination of titillating and educational", she can't quite disguise the museum's real target demographic, admitting that the guards "have to shoo people away" from the hardcore stuff. Odds are that most of the patrons "transformed....into zombies" by the "lurid close-up genital pistons of a Dolores Del Rio porn film" don't share Nussbaum's gender.
Why, then, did Slate twice send a woman to do a man's job? Perhaps because an honest male assessment of this ever-so-slightly-dressed-up smut would have to be brutally frank about its producers' obvious goals in presenting it--and hence, by implication, about Slate's obvious goals in reviewing it.
Sunday, November 03, 2002
Lisa Dusseault tells an amusing story about an encounter with Canadian tourists in San Francisco who, on hearing that she is an expatriate Canadian living in the US, commiserate with her plight. I understand her nonplussed reaction; I must at least partially disagree, however, with her claim that "it's not so different" living in the US, as opposed to Canada.
In one sense, she's clearly correct: many of the horror stories that Canadians tell each other (and themselves) about America are built around serious misunderstandings of their southern neighbor. I can say this with some authority, since I labored under several such illusions myself while growing up in Canada, and was only thoroughly disabused of them when I moved to the US. For example, Canadians hear about the horrific American crime rate, and assume that life in America is a daily crap-shoot for survival. In fact, the means to insulate oneself from crime--peaceful, safe suburban neighborhoods and well-protected shopping areas, workplaces and recreational districts--are readily available (and affordably accessible) to a very large fraction of Americans. Like Lisa, I have never been a crime victim in the US, and I certainly haven't had to spend a fortune to buy my safety, as Canadian myths would suggest.
Similarly, health care was once the canonical example of the contrast between terrifying American chaos and reassuring Canadian orderliness. Again, though, most Americans have employer-provided healthcare that provides a level of protection comparable to the standard Canadian regime (indeed, arguably superior to it, given the tales I've heard lately about the collapsing Canadian health care system). And US politics, for all its faults, is much harder to criticize these days in light of the appalling way that Canadian officials have embarrassed themselves when commenting on American foreign policy.
And then there are the attractions of American life. For example, the US has a service ethic far superior to Canada's; being a customer of any kind in America is a real joy compared with Canada's more, uh, European approach to customer care. And, as Lisa points out, economic opportunity can also be considered a quality-of-life issue: a more enjoyable, interesting, challenging job represents a lifestyle improvement above and beyond any material standard-of-living increase it may provide.
And yet...there are real cultural differences that can make the adjustment to American life difficult for a born-and-bred Canadian, even after the misconceptions have been discounted. For example, I find American interpersonal culture to be marked by a peculiar level of unabashed self-centeredness and self-indulgence--a kind of naive thoughtlessness about the feelings and concerns of others when it interferes with one's own "pursuit of happiness". I'm not making a political statement here, or alluding to any grand philosophical principle; rather, I'm speaking of the very fabric of day-to-day American social interaction. I once had lunch in a restaurant in California with a group that included a German friend; when this friend needed to squeeze behind another diner's chair at the next table in order to leave the restaurant, the other diner, rather than shift his chair to allow my friend to leave, continued his conversation for several minutes, happily oblivious to the buttocks mere inches from the back of his head. Eventually my polite friend was forced to bring his problem explicitly to the gentleman's attention, at which point he was happy to assist by shifting his chair forward slightly. "Only in America", muttered my friend after we had left. I believe he's right; many other cultures tolerate behavior that North Americans might consider deliberately rude, but only in the US is it unsurprising that an otherwise non-hostile person would so egregiously fail, in all innocence, to take others' concerns into consideration at all. To someone raised on diffident Canadian politeness, the adjustment to this American-style solipsism can be difficult.
There are other differences, as well. America is a much more class-conscious society than Canada--one in which people are keenly aware of markers of social status. (I can't imagine a Canadian, for instance, conspicuously dropping blatant, smug references to his or her alma mater, the way many Ivy League-educated Americans seem to--sometimes literally within minutes of meeting me.) On the plus side, the level of diligence, industry and entrepreneurialism in the US far outstrips the Canadian norm. (The experience of shame at one's own laziness also requires some adjustment, as it turns out.)
None of these differences is in itself particularly taxing to deal with, of course; nor do they, taken together, justify receiving condolences from visiting fellow Canadians. But they do cause me to miss, on occasion, the country of my birth--and to experience a certain feeling of warm comfort on each return visit to the country I still think of, in a way, as home.
Friday, November 01, 2002
Mark Kleiman and Eugene Volokh are apparently both of the opinion that the Boy Scouts' policy of excluding atheists amounts to a kind of religious discrimination. (Both concede the Scouts' Constitutional right to their policy, but consider it morally wrong nonetheless.) As Volokh puts it, "[i]f the Scouts excluded Catholics -- everyone else, Jewish, Protestant, or what have you is fine, but not Catholics -- we'd rightly condemn them, even if they said 'Rejection of Catholicism is one of our core beliefs.' Likewise, I think, when they exclude atheists." (Kleiman makes the same point, right down to the choice of analogy.)
I rather doubt that the Scouts' opposition to atheism is as narrowly defined in practice as Kleiman and Volokh claim; would the Scouts accept, for example, a Satanist troop, or one that worships only a particular (living) charismatic cult leader? If the Scouts turn out to have bona fide doctrinal standards compatible with most religions but exclusive of a few, then they would be no different from a group that rejected, say, believers in performing child sacrifice rituals or murdering all heretics (except, of course, in the sense that Kleiman and Volokh probably find adherents of such ideas far more worthy of exclusion than those, including atheists, who happen to offend the Scouts' somewhat woolier religious principles).
I also wonder whether the two professors have paused to consider in just what company they have placed themselves with their choice of analogy. The most conspicuous advocates of the idea that atheism is a religious conviction--comparable to, say, Catholicism--are fundamentalist Christians attempting to inject "Creationism" into the public-school curriculum. After all, if atheism is a religion, just like literalist Christianity, then it's perfectly valid to claim that Darwin's theory of evolution is as much a religious position as is the "Genesis theory" of human origins. Likewise, if absence of religion is just another religious creed, then school voucher programs that encompass confessional schools are not only Constitutionally permissible--they might even be mandatory, under the judiciary's current broad reading of the First Amendment, to prevent the government from "establishing" atheism, over all other religious doctrines, as the official "faith" of the public school system.
But atheism is not a religion; the absence of religion is very different from the presence of one. Teaching evolutionary biology in science class, and rejecting all of its religious alternatives, is not the same as teaching a single religious alternative. Forbidding the promulgation of any religion in public schools is nothing at all like exclusively promulgating a single one. And likewise the Boy Scouts, in requiring their members to affiliate with a religion--any religion--are not "excluding" a particular religion. Kleiman and Volokh should be happy about that; the consequences of atheism being designated by convention as just another religious belief would be most unlikely to please either of them.