Ariel Sharon has proposed a new "disengagement plan" in his recent speech at a conference in Herzliya. The plan calls for Israeli forces to withdraw unilaterally from the West Bank, behind a perimeter marked by the "security fence" currently under construction. (The fence mostly follows the 1949 Israeli-Jordanian armistice line, but juts out in places well into the West Bank.) The plan also calls for some settlement outposts beyond this perimeter to be dismantled.
The plan has drawn mixed reviews. About 60% of Israelis support it, but many on the left doubt that it will actually be implemented. Meanwhile, the right complains that it represents a victory for terrorists, and a defeat for settlers.
I have a more mundane objection to the plan: it's doomed to fail. The standard argument for withdrawing behind a security fence is that it has succeeded in Gaza, since no suicide bombers have launched attacks on Israel from there. But it's also the case that from Gaza--where the IDF never launched a full-fledged invasion of all the cities, as it did in the West Bank--there has been a steady rain of crude rockets on adjacent Israeli lands, such as the Gaza settlements and the town of Sderot (inside pre-1967 Israel). If the IDF were to "disengage" from the West Bank, it might well be able to prevent suicide bombers from penetrating its security perimeter. But it would be incapable of preventing massive barrages of rockets on towns near the perimeter--including several major Israeli cities.
What would Israel do in such a situation? If it returned to the West Bank in force, it would be no better off than it is now--indeed, worse off, since it would have afforded the terrorist organizations time and space to regroup and rejuvenate themselves. If it restrained itself from re-entering the West Bank, it would have to respond from a distance, using weapons, such as artillery and helicopters, that are much more likely to cause collateral civilian damage, and thus earn condemnation from world opinion. And it would run the risk of international "observers" or "peacekeepers" entering the territories in Israel's absence, and turning into de facto defenders of the terrorists' turf.
The security fence is clearly a necessary measure for the protection of Israelis. And the Israeli far right's complaints notwithstanding, there's nothing wrong with the removal of a few settlement outposts, if it improves Israel's security situation. But abandoning the West Bank to the mercy of Arafat's goons, Hamas, Islamic Jihad and the rest is good neither for its inhabitants nor for their Israeli neighbors. Israeli politicians should stop treating this option as if it were a remotely viable one.
Sunday, December 28, 2003
Thursday, December 25, 2003
In early December, the New York Times published an ominous-sounding account of the harsh measures being taken by US troops to quell guerrilla forces still operating in occupied Iraq:
As the guerrilla war against Iraqi insurgents intensifies, American soldiers have begun wrapping entire villages in barbed wire. In selective cases, American soldiers are demolishing buildings thought to be used by Iraqi attackers. They have begun imprisoning the relatives of suspected guerrillas, in hopes of pressing the insurgents to turn themselves in. …
“If you have one of these cards, you can come and go,” coaxed Lt. Col. Nathan Sassaman, the battalion commander whose men oversee the village, about 50 miles north of Baghdad. “If you don’t have one of these cards, you can’t.” The Iraqis nodded and edged their cars through the line. Over to one side, an Iraqi man named Tariq muttered in anger. “I see no difference between us and the Palestinians,” he said. “We didn’t expect anything like this after Saddam fell.”
Both Eric Rescorla and Crooked Timber's Henry Farrell noted the new American tactics with great consternation. I offered comments to each, based on my previous arguments, suggesting that their concern was misplaced.
Well, I didn't expect such quick vindication. According to the Washington Post,
At the heart of this tightly woven network is Auja, Hussein's birthplace, which U.S. commanders say is the intelligence and communications hub of the insurgency....U.S. commanders said they dealt the insurgents a major blow when they decided Oct. 30 to isolate Auja, surrounding it with fence and razor wire so the sole exit was past a U.S. military checkpoint. Russell said this move severed the insurgency's intelligence and communications hub from the outside campaign.
Of course, this latter account could be exaggerated, or even entirely misguided. But so far, at least, my analysis appears to be holding up pretty well.
Sunday, December 14, 2003
If I were an advertising executive for Remington or Norelco, I'd rush to produce a last-minute holiday ad campaign based on this picture.
Thursday, December 11, 2003
In a pre-dawn sweep, German police raided over 1,200 homes of suspected followers of a Turkish Islamic radical named Muhammed Metin Kaplan.
Thousands of confused German students took to the streets in protest, worried about losing access to their SAT prep courses.
Wednesday, December 10, 2003
The US Supreme Court has ruled that Congress can restrict "soft money" contributions to political parties and restrict purchases of political advocacy advertising in the period immediately preceding an election. Meanwhile, the US Ninth Circuit Court of Appeals has declared that contributions to terrorist organizations are constitutionally protected, unless it has been proven beyond a reasonable doubt that the contributor knew of the recipient organization's terrorist activity.
The solution for campaign organizations is obvious: they should take their cue from foreign terrorist groups and establish a clandestine network of interlocking "charitable" and "political" organizations, with money being secretly funnelled from the former to the latter via elaborate money-laundering enterprises, preferably in foreign countries. A corporation can hardly be held responsible, after all, if its donation to the "World Regulation Relief Fund" results in an attack ad against a local congressman being purchased by a tangentially related foreign political organization--right?
I can think of no better way to clean up the financing of American campaigns. Thank goodness for the uncanny wisdom of the federal judiciary.
Monday, December 01, 2003
In the Washington Post, Anne Applebaum mourns the lack of a "national debate" on the recently-passed Medicare reform bill. "[W]e as a nation have lost our appetite for grand domestic policy debates," she laments.
These sound like admirably democratic sentiments--until one considers who she meant by "we". Surely she didn't expect the average family to debate the finer points of Medicare reform over the Thanksgiving dinner table. Heck, the likes of Matthew Yglesias, Daniel Drezner and Oxblog's David Adesnik--none of them exactly strangers to policy wonkery--have all conceded that the subject of Medicare is one truly powerful snooze-inducer. Perhaps Anne Applebaum has a stronger stomach for it, but if so, she's in rare company.
In fact, the whole idea of "deliberative democracy" is rather dubious to begin with. After all, if the public ought to be ruminating on the minutiae of every issue, then what on earth is representative government for, anyway? Even if the public were competent to engage in such niggling discussion, they would have neither the time nor the inclination to do so, being far too preoccupied with their own lives to dwell, Applebaum-style, on such matters.
In practice, the system works quite well with much less public involvement. Voters get riled up over a few specific, fairly straightforward issues that they can grasp--Applebaum mentions two recent ones: telemarketing and spam--and happily leave the details of the rest to the political insiders, with the understanding that if they're handled so incompetently that they become visible problems, then politicians' heads will roll.
After all, that's how CEOs deal with underlings, or customers with merchants. All that they require is accountability, not micromanagement, and the threat to do business with someone else usually (eventually) suffices to produce competent results. If I trust the food I eat, the home I live in, and many other necessities of life to this system, then why not the laws that govern me, as well?
Wednesday, November 26, 2003
Matthew Yglesias, the sole American (as far as I know) who actually pays attention to Canadian politics, notes that a movement is afoot there to introduce proportional representation (PR) into the Canadian electoral system. This development is a source of some amusement to me, for reasons of personal experience. I spent a little time in Israel in 1990--a year that does not exactly stand out as a high point in the annals of Israeli politics. The electorate at the time was very nearly equally divided between the two major party coalitions, and both began resorting to a series of increasingly dubious maneuvers in an effort to establish a bare majority in the country's parliament.
During my stay, I heard numerous friends explain to me that the underlying cause of the problem was Israel's system of proportional representation. PR, they explained, removes all accountability from individual members of parliament, who win their seats based on their positions on their party's list, rather than on any direct support from voters. It thus rewards craven party hacks over politicians of broad stature whom the party leader might consider a threat. It also gives considerable power to small, unaffiliated parties, who can then sell their loyalty to one or the other major coalition at an arbitrarily large and unseemly price. What was needed, my friends argued, was a system of parliamentary districts--like the Canadian one.
Of course, I was familiar with the Canadian reality, and I patiently explained to my friends that in fact Canadian members of parliament are about as likely to be craven party hacks as their Israeli counterparts. Because party leaders determine who will fill the executive, cabinet and top civil service posts should their party form a government, they completely control the levers of power, and voters thus almost always vote based on party leadership rather than the identity of the local back-bench candidate. Backbenchers thus owe their election prospects--and their hopes of rising in the party leadership--entirely to their status in the party.
As for the mercenary aspect of PR, it's simply replaced by the old-fashioned pork barrel. Many ridings have a tradition of loyalty to one party or another--or even to whichever party appears about to win--and milk that loyalty for lucrative government handouts. The only difference between this form of bribery and the kind experienced in Israel is that the constituencies being bribed in Canada are geographical rather than broadly political.
Finally, district-based voting--at least with a "first past the post" voting system--has the drawback of tending to exaggerate the mandate of the most popular party, often giving an absolute majority in parliament to a party with perhaps a third or so of the popular vote. Of course, this setup has its advantages as well: with majority governments more likely, horse-trading to form coalitions is much less common. But those majority governments also have much greater political power than their popular vote would suggest they deserve, and thoughtful Canadians have thus long believed that PR would be a fairer, more democratic system.
In fact, it seems to be a popular conceit in just about every democracy that problems such as pork-barrel politics, cynical political horse-trading, and party hackery represent subtle flaws in the system's plumbing, and that a few careful adjustments to its gaskets and stopcocks can simply make the ugliness go away. In reality, just about any reasonable system will soon bring the voting public the government they implicitly want (and richly deserve), through the straightforward process of politicians being punished for displeasing the electorate. If Israelis--or Canadians--really didn't want politicians haggling for goodies for their constituents, then they could refuse to vote for politicians that did so. If Canadians--or Israelis--really didn't want to be represented in parliament by party hacks, then they could refuse to vote for parties that nominated them to stand for parliament.
Somebody, though, is voting for these people--enough, in fact, that they're still getting elected. 'Nuff said.
Monday, November 24, 2003
Journalists have been on a bit of a thumbsucking tear since Stephen Glass, author of numerous wholly fabricated stories at the New Republic, recently appeared at a panel on journalistic ethics, following the opening of the film "Shattered Glass", based on his "career". Reaction to his return to the public eye has been, to say the least, less than effusive. Andrew Sullivan excoriated Glass' self-serving new pseudo-apologetic public demeanor. Jack Shafer spent a couple of articles pondering just what kind of brutal tortures might be fitting punishments for his crimes. Jonathan Chait, ever the intellectual, worried that his case might distract people from the real problem with journalism: that not enough writers see the world exactly the way Jonathan Chait does. Other scribes speculated on the nature of Glass' psychopathology, or applauded the film's brisk pacing and sharp characterizations.
None of these recent commentaries, however, address the main journalistic theme of the film: that Glass' perfidy succeeded because his colorful, zany, almost comic-novelistic stories seduced readers, colleagues and editors alike--and that all of them should have been paying more attention to the articles' intellectual content (let alone their basic accuracy) than to their flashiness. Glass' hilarious touches--the orgy at the Young Republicans' convention, the workaholic stockbroker who relieved himself into a medical contraption to avoid having to leave his desk, the teenaged computer hacker with his own agent for job contract negotiations--were (as Shafer pointed out five years ago) "too good to check", not because they confirmed strong real-life suspicions--or even because they catered to wishful expectations or simplistic prejudices--but rather because their energy and vividness, their perfect balance on the edge between plausibility and wild absurdity, made them fascinating to read whether they were true or not.
Perhaps the reason why this point hasn't been adequately addressed in all the journalistic commentary on the Glass story is that journalists themselves, far from recognizing this problem, continue to admire brilliant Glass-style scene-painting, even as they condemn the fabrications it made possible. Modern journalists worship the craftsman who can evoke settings and characters with a few deft sentences, turning an otherwise unremarkable story into irresistible reader-bait. Witness New York Times journalist Rick Bragg, who made a career out of such stories, gently (if somewhat condescendingly) sketching the rural American South and its honest, simple, hardworking folk--until he was fired, in the wake of the Jayson Blair scandal, for letting stringers do his legwork for him, and applying his magic pen to places he'd in fact barely seen. Some commentators certainly criticized his exploitation of uncredited apprentices. But none denied his writing skill, or questioned its value to his newspaper. All that was asked of him was that he 'fess up and give at least partial credit to the people who did his actual reporting for him.
Of course, in these days of wire-service and cable-channel commoditization of hard news, and (if I may say so myself) bloggers providing high-quality thought-provoking commentary and analysis for free, about the only thing journalists can market themselves with anymore--apart from their deeply ambiguous, mutually compromising relationships with leakers--is their polished writing. And so they milk their skills for all they're worth, celebrating their most accomplished storytellers--until one of them turns out to be, well, just a bit too much of a storyteller.
When Michael Kelly died while reporting from Iraq during the recent war there, he was lionized by his former colleagues--not for his legendary devotion to the truth, or his deep insight into the topics he covered, but rather (understandably) for his personal warmth and kindness and (more worryingly) for his "incandescent" writing and "unparalleled gift for editing prose". It was also mentioned in passing that Kelly's editorship at the New Republic was less than entirely successful--tactfully eliding the point that its legacy included Stephen Glass' blossoming career as a writer of powerfully compelling falsehoods. Perhaps now, when the Stephen Glass story has returned to public attention, it's time to consider the possibility that Kelly's great strength and his great failure may not have been entirely unrelated.
Wednesday, November 12, 2003
New York Times columnist Nick Kristof is bemoaning the foaming-at-the-mouth hatred that dominates partisan debate in today's America. "I'm afraid that America is now transforming into.....the political moonscape that I remember when I was a student in England in the 1980's," he writes. "Left and right came from different social classes, lived in different areas, attended different schools and despised each other." Andrew Sullivan has similar thoughts.
Kristof and Sullivan should stop worrying so much. The bitter rancor that typifies the current left-right divide in the US is a cyclical phenomenon, riding on the confluence of a trio of polarizing factors that will inevitably dissipate:
Political change. The decade of the 1980's in Britain was the Margaret Thatcher era, when the dominance of welfare state politics was being challenged by the rise of a new free enterprise-oriented middle class. Like all momentous changes, that one brought the conflicting interests of rival political coalitions into stark contrast, as old compromises became irrelevant and new ones had yet to be struck. The sharp rightward shift in American politics in the post-Clinton era has shaken things up here in similar fashion, and the scrambling over suddenly up-for-grabs assets (the fealty of the judiciary, for instance) is bound to get somewhat testy.
Momentous issues. It would be nice if partisan debate over the war on terror could be polite and genteel. Unfortunately, the issue is of such obviously enormous importance that disagreements about it seem petty unless they are themselves claimed to be of deep and monumental importance. There's no point whatsoever in carping about the details of homeland security or the war in Iraq--if there's any disagreement at all, it must be asserted that it is the nation's very safety--perhaps even survival--that is at stake. Such circumstances don't exactly encourage restrained rhetoric.
A polarizing leader. It's hard to imagine a more polarizing figure than Margaret Thatcher, the "Iron Lady" who ruled her party absolutely and came to embody everything her party, faction and constituency stood for. Likewise, Bill Clinton, a president facing (after 1994) a House and Senate united against him, became, in effect, the sole major national political figure to represent his half of the electorate. And George W. Bush, because of the dominance of the presidency in the suddenly-crucial foreign policy arena (and partly because of a leadership vacuum in Congress these days, especially on the Republican side) has found himself with an almost Clintonian level of pre-eminence when compared with any of his political allies. Needless to say, all three of these politicians were passionately adored and embraced by their "own" side--and intensely hated and excoriated by their opponents.
Of course, these three factors tend to decline with time, as political waves of change peter out, as the intensity of major crises dissipates, and as dominant leaders eventually lose their aura of invincibility and are shunted aside. As Kristof notes, "Europe has matured and become much less polarized" since his time there. There's no reason not to expect a similar outcome here, in due time.
Thursday, November 06, 2003
Naomi Wolf apparently believes that the easy availability of Internet pornography has soured America's male youth on the idea of real, live sex with real, live women. "The onslaught of porn is responsible for deadening male libido in relation to real women," she writes. Matthew Yglesias, an instance of the demographic in question, ridicules the idea. Daniel Drezner, a former instance of same, concedes that Wolf's hypothesis may apply to some men, although not most. In fact, the statistics completely vindicate Yglesias' skepticism: sexual activity among young men has been fairly steady over the last decade--declining very slightly, but still quite high, by the standards of previous decades. If the Internet has been suppressing male libidos, the effect has yet to turn up in the data.
And that's hardly surprising. To put it bluntly, men were deriving titillation from the available imagery--or from their own imaginations--long before Naomi Wolf arrived on the scene, and without ever losing their taste for actual sexual experience. The pornography available today may be more vivid and explicit than in the past, but that doesn't mean that a modern-day Jud Fry is any more likely to be satisfied with it than with the naughty postcards of years past.
But what Wolf really objects to, it turns out, is not that boys aren't interested in sex--indeed, all the signs suggest that they still are--but rather that they don't value it as much as they used to:
When I came of age in the seventies, it was still pretty cool to be able to offer a young man the actual presence of a naked, willing young woman. There were more young men who wanted to be with naked women than there were naked women on the market. If there was nothing actively alarming about you, you could get a pretty enthusiastic response by just showing up. Now....[b]eing naked is not enough; you have to be buff, be tan with no tan lines, have the surgically hoisted breasts and the Brazilian bikini wax—just like porn stars.
Wolf should trust her own economic reasoning a little more: if "being naked is not enough" these days, perhaps it's simply because there are no longer "more young men who wanted to be with naked women than....naked women on the market."
Of course, Wolf has always been a strident third-wave feminist advocate of female sexual "empowerment"--i.e., women enjoying casual, recreational sex every bit as enthusiastically and uninhibitedly as men traditionally have. What she apparently never realized, until now, was that much of the fun of it for her--that is, the wonderful feeling of being valued, and appreciated, that she apparently used to be able to win with her sexual favors--depended on her being relatively unusual, among women, in her approach to sex.
But now that women like her are, in effect, a dime a dozen--and are unfortunately often treated that way, as well--she's suddenly forced to consider, to her horror, the possibility that her sisterhood's successful campaign to turn women on to "hooking up" may have destroyed the very source of the pleasure they sought to promote. No wonder she prefers instead simply to blame it all on Internet pornography....
Wednesday, November 05, 2003
Although Thomas Friedman's ideas are usually shallow and foolish, they often expose, in instructive ways, the vicissitudes of the "conventional wisdom" in American foreign policy circles. An example is his column about the differences between the Iraqi resistance to American occupation and the Vietcong's resistance to American troops during the Vietnam war. Says Friedman:
The people who mounted the attacks on the Red Cross are not the Iraqi Vietcong. They are the Iraqi Khmer Rouge — a murderous band of Saddam loyalists and Al Qaeda nihilists, who are not killing us so Iraqis can rule themselves. They are killing us so they can rule Iraqis.....A vast majority of Iraqis would reject them, because these bombers either want to restore Baathism or install bin Ladenism.
Now perhaps in 1970, at the height of the Vietnam war, when information sources were conflicting and difficult to assess, such a reading of history might have been excusable as a preliminary assessment. But in 2003, it is simply embarrassing to read Friedman implying that the Vietcong--a fully controlled arm of the North Vietnamese army--were somehow polar opposites to their Khmer Rouge allies, or particularly popular in the South, or fighting so that Vietnamese could "rule themselves".
Of course, Friedman isn't really interested in the truth about the Vietcong--to him, the Vietcong are more important as characters in an abstract morality play than as a real, live historical guerrilla/terrorist organization. The storyline of the play is always the same: a ragtag collection of rebels fighting for freedom against an evil oppressor ultimately win by rallying popular support to their side, winning the people's "hearts and minds". What Friedman is saying about Iraq is not that this plot doesn't apply, but merely that the plucky rebels are the Americans and their allies, not the remaining opposition.
The problem with this script is that it's based on three false assumptions:
That the "hearts and minds" of a national population are ever united in supporting a particular political faction. I've already dealt with this fallacy.
That "hearts and minds" are won through displays of kindness, fairness and generosity. As I've mentioned before, even the nicest group of soldiers in the world quickly wears out its welcome in a foreign land, simply by being foreign and military. Political support is based on much more than just individual or group conduct.
That winning "hearts and minds" is the (only) route to victory. How long would Saddam Hussein--or any of the current governments in the Middle East, for that matter--have lasted if that premise were correct? In practice, political factions win power through a combination of public acceptance, loyalty from a core segment of supporters, and physical intimidation.
These three misconceptions, taken together, lead to a long-established pattern of Western misunderstanding of foreign civil conflicts: a faction of power-hungry fanatics gains control of some area through sheer ruthlessness, then claims to have won the "hearts and minds" of the area's cowed populace. Western observers either take this claim at face value, or else attempt to challenge it by being kinder and gentler than the fanatics--in which case, the tactic fails, and the observers conclude that the ruthless fanatics must, indeed, have won the "hearts and minds" battle. The entire West then abandons the region to suffer under the pitiless reign of the fanatics.
We can only hope that the same dynamic doesn't once again play itself out in Iraq.
Friday, October 31, 2003
Gregg Easterbrook has once again been tripped up by his views on religion. This time, he's complaining that while physicists propose all kinds of wild, incomprehensible models of the universe--multiple hidden dimensions, for example--spirituality gets no similar respect in academic circles. "To modern thought, one extra spiritual dimension is a preposterous idea," he writes, "while the notion that there are incredible numbers of extra physical dimensions gives no pause."
Crooked Timber's Kieran Healy and Eric "Educated Guesswork" Rescorla have taken Easterbrook to task, on the perfectly reasonable grounds that string theory's idea of hidden dimensions is actually scientifically defensible, whereas Easterbrook's "spiritual dimension" is not exactly bursting with empirical or theoretical support. However, I believe they miss the more fundamental sense in which Easterbrook is off-base.
Imagine, for a moment, that we were to take Easterbrook seriously, and consider his "spiritual dimension" a valid scientific concept. What would we do? Well, we'd ask physicists to conduct experiments, work out theories, and generally explore the possibility. And let us suppose that they did so, and concluded that the "spiritual dimension" is in fact the seventh of the ten-or-eleven dimensions currently being proposed by string theorists, with its properties and behavior governed by such-and-such set of equations. Would Easterbrook be any happier?
Not at all. For the whole point of the "spiritual dimension" of which he speaks is that it's not describable in scientific terms. As he notes when discussing "intelligent design" theory (what Eric Rescorla correctly calls "warmed-over creationism"), "[w]hen it comes to intellectual rigidity, there's little difference between the national academy declaring that only natural forces may be considered, and the church declaring that only divine explanations may be considered."
Well, neither Easterbrook nor the church acknowledges the possibility that God might just be a boring old physical phenomenon, governed by a bunch of differential equations--but he likely doesn't see anything wrong with that. Such a claim would be incompatible with Christianity as just about every Christian understands it. Similarly, "intelligent design" and a "spiritual dimension" are not simply bad science, but rather non-science. Easterbrook believes there's more to the universe than science, and he's entitled to believe that. But he's not entitled to demand that his belief be considered in any way relevant to the scientific endeavor.
The fact that both science and (Jewish or Christian) theology assume an independently existing reality (or "truth") doesn't mean that both pursuits necessarily have to--or are even remotely likely to--converge on the same version of it. They are in fact two decidedly different approaches to knowledge, and appreciating them both means recognizing that one has very little to say about the other, and that they needn't be--and probably can't be--fully reconciled. Then again, Easterbrook's mocking treatment of science makes it quite clear that he's not really interested in such a reconciliation at all. Rather, he'd like both to be governed by (his own) "common sense"--which is clearly more thoroughly informed by religion than by science.
Tuesday, October 28, 2003
The New York Times apparently doesn't want Walter Duranty's 1932 Pulitzer Prize for journalism revoked. Times editor Bill Keller writes, "[a]s someone who spent time in the Soviet Union while it still existed, the notion of airbrushing history kind of gives me the creeps." Eugene Volokh disagrees: "No-one is suggesting that the Pulitzer people and the Times enter into some conspiracy to pretend that the award had never been given....This isn't airbrushing history; it's correcting error."
Volokh's argument is superficially appealing, and the Times is obviously not without a vested interest in seeking to avoid an embarrassing revocation of its reporter's honor. But I believe Keller's point actually reflects a more accurate view of such prizes in general, and the Pulitzers in particular. For in fact they do not represent today's consensus verdict on any particular past year's best reporting, but rather the consensus of the Pulitzer committee of the year in question. Moreover, it is doubtful that Duranty's is anywhere near alone among all past Pulitzers in being considered today to have been awarded "in error". And it is entirely possible that decades from now, several past Pulitzer winners will be seen to have been even more egregiously chosen than Duranty's.
Thus to revoke Duranty's prize today would be to imply--completely falsely--that the historical list of "unrescinded" Pulitzers reflects a current consensus regarding the best journalism of past years. Such a consensus would in fact be very difficult to achieve, and quite possibly less useful to catalog, in the end, than the choices of contemporaries reflected in the unaltered Pulitzer list.
That Duranty was awarded a Pulitzer is a blot upon the prize's reputation. The only purpose of rescinding the prize today would be to imply--again, completely falsely--that today's Pulitzer committee is somehow less prone to the shortsightedness that led to Duranty's being honored seventy years ago. Future generations are free to judge that for themselves, and would be foolish to rely on this year's committee to make the decision for them.
Monday, October 27, 2003
Both Oxblog's Patrick Belton and Glenn "Instapundit" Reynolds have concluded that a fascinating first-hand account of travelling through Cuba is in fact a searing indictment of that country's economic mismanagement and political repression. Oddly enough, I got a very different impression of the place from the piece. The Cuban countryside, it seems, is sadly poverty-stricken and backward. But it's not so much repressed as resigned--the people lead their simple, spartan, difficult lives as best they can, and many of them believe the government's propaganda simply because it's the natural thing to do. It is only those with unusual ambition, energy and initiative who chafe under the heavy hand of governmental regimentation. And what fraction of the population fits that description is far from clear.
Now, I don't deny for a second that Cuba would be better off with a freer economy and polity. I believe that idyllic ignorance does not exist, and that those who are satisfied with their lives only because they don't know of better alternatives are better off enlightened--even if they may end up sadder as a result. To me, knowledge and understanding are the essence of humanness, and mere uncomprehending bliss is no substitute.
But taking a moral position against such ignorance is not the same as objectively failing to recognize its existence, even its prevalence, in benighted corners of the world. Few peoples, for example, suffer more from their government than the North Koreans, who live under a regime of unparalleled cruelty and incompetence, who have repeatedly faced famine and starvation over the years, and who can be executed merely for trying to escape their vast national prison. Yet it's unlikely that all that many North Koreans, heavily regimented and information-starved as they are, can see through the official government's lies and blame their "Dear Leader", instead of the regime's numerous claimed foreign enemies, for the horrible conditions in which they live.
In fact, ignorance isn't even a prerequisite for ruinous political self-delusion. The Palestinians who continue to endorse the terrorist "armed struggle" that has impoverished and immiserated them, the former East Germans wallowing in "Ostalgie", the neo-Luddite Westerners who shun the marvels of modern medicine, modern agriculture, and other modern technologies that have so improved their material living conditions--all are embracing choices that are objectively associated with severe hardship. It would hardly be surprising if many Cubans, insulated as they are from information about conditions elsewhere, choose to believe that their lives, their country and their leaders are every bit as great--relatively speaking, at least--as their government tells them.
Sunday, October 26, 2003
The case of Terri Schiavo, a profoundly brain-damaged Florida woman whose husband wants to disconnect her feeding tube so that she may die--and whose parents and siblings want to keep her alive--is one of those hard cases that ought to give even the most self-assured moral thinker pause to consider other perspectives. Ideally, one might hope for a rich public debate on the subject, leading to a democratic consensus as to the necessary conditions for concluding that someone ought not be kept alive, and the proper role of various parties' preferences in determining that decision.
At least that's what one would hope for if one were not a fanatical devotee of American Legal Religion. But here's Slate's Dahlia Lithwick foaming at the mouth over the people's representatives' unbearable uppitiness in daring to pronounce on the issue:
Whether one believes that Terri Schiavo is in a "persistent vegetative state" or a "minimally conscious state" is immaterial. Whether one believes that her blinks and smiles are signs of cognition or automated reflexes is similarly not the issue. All that matters is that these disputes are governed by law, that the law says Michael Schiavo is her legal guardian, and that his decision ought to have been final.
Since 1990, when the Supreme Court decided Cruzan v. Missouri Department of Health, there has been a constitutionally protected right to decline unwanted medical procedures. How does the Florida Legislature justify overriding that decision and its own Constitution—which guarantees a right to privacy and allows residents or their legal guardians to terminate life support—by enacting a "law" that expressly violates that right? And how dare Jeb Bush call for the appointment of a new guardian for Schiavo? The courts have already named one—her husband.
To Lithwick, the courts make the law; any so-called "law" produced by mere legislators is insolent usurpation of judicial prerogatives, and should be ignored. One wonders how Americans can ever hope to establish democracy in Iraq, when this kind of rank contempt for it is so commonplace back home.
Thursday, October 23, 2003
It's not terribly surprising that Malaysian Prime Minister Mahathir Mohamad turned his speech to the Islamic Summit Conference into a foul anti-Semitic diatribe about how "the Jews rule this world by proxy". After all, he has a long history of anti-Semitic pronouncements. Nor is it exactly shocking that the delegates attending the conference gave him a standing ovation. After all, anti-Semitic hate literature has been a staple in public discourse throughout most of the Islamic world for years. What was more interesting, however, was the occasional reaction--from the New York Times' Paul Krugman, for example--to the effect that, sure, the guy may be a wild-eyed Jew-hater, but as Muslim leaders go, at least he makes the trains run on time.
Now, I'm normally the last one in the world to trot out such trite analogies to World-War-II fascism. But in this case, I think the chairman of the Anti-Defamation League has it exactly right. The reason the analogy is appropriate is that although few people today are aware of it, those who made excuses for Hitler, Stalin, Mussolini and the rest--at the beginning at least--were hardly all moral monsters or clueless ignoramuses. Rather, they looked at the unappetizing smorgasbord of potential leaders available for a collection of seemingly ungovernable countries in the throes of economic collapse and social chaos, and decided that a bit of over-the-top racial rhetoric was a small price to pay. Think of it as a geopolitical version of Moynihan's concept of "defining deviancy down": When an entire region's leadership consists of ruthless, deluded incompetents, then a ruthless, deluded competent seems appealing by comparison.
The problem with this reasoning, of course, is that ruthless, deluded leaders never stay competent for long. Eventually, demagogues who flirt with violent rhetoric reach the point of being compelled to live up to their bombast, with disastrous results. Gamal Abdel Nasser, to choose one famous example, rode his fiery brand of Pan-Arabist militancy to de facto leadership of the entire Arab world in the 1950s. But he found himself trapped by his own ideology in 1967, failed to back down from a recklessly escalating confrontation with Israel, and ended up provoking his own devastating military defeat.
We don't know where the Malaysian Prime Minister's rhetoric will lead him--or perhaps his successors--but chances are that it will not be particularly good for Malaysia. No leader is perfect, of course, and dictators are generally less perfect than most. (Democracy provides a useful quality control mechanism, if nothing else.) But among leadership flaws, the willingness to discount reality entirely, repeatedly, and in public, is a lot more serious than Krugman et al. seem to recognize.
Wednesday, October 22, 2003
The folks at the Volokh Conspiracy and "Crooked Timber" are discussing their favorite (and least favorite) bumper stickers. I can't resist mentioning one I saw recently: it was bright red, and bore the slogan, "If this sticker is blue, you're driving too fast."
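(For what it's worth, the arithmetic behind the gag roughly checks out, under my own back-of-the-envelope assumptions--call "red" about 650 nm and "blue" about 470 nm, neither of which the sticker specifies. Writing r for the observed-to-emitted wavelength ratio, the relativistic Doppler shift for an approaching driver gives
\[
r=\frac{\lambda_{\text{obs}}}{\lambda_{\text{src}}}=\sqrt{\frac{1-\beta}{1+\beta}}\approx\frac{470}{650}\approx 0.72,
\qquad
\beta=\frac{1-r^{2}}{1+r^{2}}\approx 0.32,
\]
so the sticker turns blue only at roughly a third of the speed of light--about 95,000 km/s--which is, indeed, driving too fast.)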
Well, Oxblog's Josh Chafetz liked it.
An update to the Easterbrook story: Easterbrook has apologized, and The New Republic has also apologized to its readers, excoriating Easterbrook's comments, but accepting his apology and defending him against charges of anti-Semitism. Meanwhile, ESPN has summarily fired Easterbrook, who used to write an excellent weekly football column for them. (It's unclear, though, whether Easterbrook's crime in their eyes was singling out Jews, or merely singling out Disney CEO Michael Eisner, whose empire includes ESPN.)
Many pixels have since been rendered on this matter, including by Meryl Yourish (who accepts Easterbrook's apology) and Roger Simon (who accepts it with reservations). Perhaps the most interesting take is Mickey Kaus'--he thinks that Easterbrook simply got carried away with his rhetoric, recklessly casting about for any argument that came to mind, and stumbled on an offensive one. (Kaus, who is a former journalistic colleague of Easterbrook's, also recalls once making a similar blunder himself.)
My own best guess as to Easterbrook's thinking (not that it necessarily matters) is that he appears to be something of an "Old Testament Christian" who thinks of his stern religiously derived morality as common to both the Christian and Jewish traditions. (He writes in his apology of belonging to a joint Jewish-Christian congregation, of emphasizing the Jewish roots of Christianity, and of berating the Jewish producers of violent movies in exactly the same way as he did the Catholic Mel Gibson for the same transgression.) He therefore felt entitled to address the two Jewish film producers, not just as fellow Americans, but also as quasi-co-religionists.
His key error, then, was to forget that he's not Jewish, and hence that if he tries to tell certain Jews how they should behave as Jews (rather than simply as people), he's walking straight onto a moral minefield. A Jew haranguing other Jews who make violent films by invoking the lessons of the Holocaust would merely be engaging in incoherent sophistry. But a Christian scolding Jews for failing to learn the lessons of the Holocaust cannot but sound suspiciously as though he's threatening a future refresher course.
Wednesday, October 15, 2003
Gregg Easterbrook, of all people, has echoed--and in The New Republic, of all places--a standard, and extremely stupid, criticism routinely leveled at Jews. Complaining about brutal violence in Hollywood movies, he writes,
Disney's CEO, Michael Eisner, is Jewish; the chief of Miramax, Harvey Weinstein, is Jewish. Yes, there are plenty of Christian and other Hollywood executives who worship money above all else, promoting for profit the adulation of violence. Does that make it right for Jewish executives to worship money above all else, by promoting for profit the adulation of violence? Recent European history alone ought to cause Jewish executives to experience second thoughts about glorifying the killing of the helpless as a fun lifestyle choice.
Now, these two film executives' religion is obviously staggeringly irrelevant to the topic at hand. Why not their race? Their sex? Their belly button orientation? The first part of Easterbrook's entirely vacuous argument makes equal sense--that is, none at all--if any of these categories are used in place of faith. (Yes, there are awful people with "outies", but does that excuse the behavior of these "innies"?)
But the real peak of Easterbrook's illogic is his reference to "[r]ecent European history" as an argument against Jews "glorifying the killing of the helpless". In fact, the only inference that Jews in particular are better placed than Gentiles to draw from the Holocaust, by virtue of their experience of suffering, is the practical observation that "glorifying the killing of the helpless" is, all things considered, a lot safer than failing to do so. And presumably that's not the lesson Easterbrook has in mind.
There are also a great many more morally salutary lessons to be drawn from the Holocaust, of course--but these are not lessons that Jews are particularly well-placed to receive, or needful of receiving. (One never hears, for example, assertions that the Palestinian Arabs' history of collective misfortune should have taught them the horrors of militant nationalism.) On the contrary, it is the descendants of 20th-century Europe's Gentile brutalizers of Jews, rather than Jews themselves, who stand to learn the most from the negative example of their ancestors. After all, if there's no such thing as a cultural propensity towards some kind of behavior--say, racist violence--then Jews have no more or less reason to worry about emulating their ancestors' murderers than anybody else. And if such cultural propensities do exist, then why would one expect to find dangerous ones in the cultures of the victims of past atrocities, rather than in those of the perpetrators?
I won't waste any time discussing the radioactive a-S-word, but for insulting irrationality alone, Easterbrook deserves all the opprobrium Meryl Yourish and Roger Simon can throw at him.
Thursday, October 09, 2003
Mickey Kaus and Andrew Sullivan are strangely giddy over the news: a state with a strong populist tradition has elected, by a plurality in a multi-way race, a charismatic, muscle-bound showbiz figure with a weak political track record, a somewhat tainted sexual past, and a suspiciously vague policy platform. I suppose Kaus and Sullivan are buoyed by their memories of the spectacularly successful result the last time this happened....
The key factors in the above description, of course, are "weak political track record" and "suspiciously vague policy platform". It's a common myth in democratic countries that a good, decent, intelligent leader can simply look at each of the day's issues, choose the sensible position to take in every case, and then take it, thus winning the accolades of a grateful populace imbued with the same straightforward good sense. Unfortunately, the public are neither particularly imbued with good sense, nor inclined to use what little of it they have to overcome their own selfish interests, prejudices and superstitions.
Fortunately, though, democracy doesn't need a particularly wise, high-minded electorate. All it needs is a collection of people with enough vigorously competing interests, prejudices and superstitions that the exhausting task of brokering among them tends to impede leaders from causing any grievous harm. A really superb leader can even spot a few nuggets of popular consensus hidden in the cacophony, and cater to them--usually, though not always, to the good.
What almost always sinks the "outsider" candidate is his or her inability to recognize and take advantage of those instances of consensus. Elected to knock some sense into slimy, business-as-usual politicians, the outsider typically believes him- or herself to have a deep, natural rapport with the common people that allows him or her to disdain pandering and poll-taking and simply intuit what the public wants. Of course, such a person inevitably confuses "what the public wants" with what he or she personally wants, and ends up spearheading unpopular campaigns on behalf of hobbyhorse causes.
Perhaps the latest outsider candidate, faced with a colossal state budget deficit, a hostile state legislature, and a choppy economy, can rapidly acquire the skills that more polished politicians take years to hone, and build a solid constituency for various popular initiatives while offending as few voters as possible. And with roughly the same likelihood, the politicians he defeated might take up bodybuilding and learn to excel at the art of playing a Hollywood action film hero.
Wednesday, October 08, 2003
Why is America so hated around the world, and particularly in the Middle East? According to an "advisory group" of alleged Middle East experts, the problem is a lack of "public diplomacy"--i.e., insufficient pro-American PR. Michael Holtzman argues instead for more retail generosity--"doctors, teachers, businesses, religious leaders, athletic teams and entertainers" helping and bonding with the inhabitants of the region. In other words, both writers imagine folks in that part of the world scratching their heads, contemplating their last personal encounter with something or someone American, and forming their geopolitical judgments accordingly.
One wonders if any of these people have bothered to consider how their own countrymen form their opinions of foreign countries. The only major public relations campaign I've heard of initiated by a foreign country in the US was undertaken by Saudi Arabia, and I doubt it's done much good. And how many Americans, really, think in terms of their personal contact with, say, French, or British, or Israeli visitors when deciding on their attitudes towards those countries?
On the contrary, Americans' opinions of other countries are an extension of their general political views, and we should expect non-Americans to form their opinions of America the same way. Fortunately, as a non-American, I can study the origins of anti-Americanism with a certain amount of dispassionate detachment. I discern at least three factors contributing to its steep rise:
As with anti-Israel sentiment, anti-Americanism is to a great extent a function of local partisan conflicts. In Canada, for example--the country with which I'm most familiar--anti-Americanism is strongly correlated with liberal (as opposed to conservative) leanings, Eastern (as opposed to Western) regional loyalties, and an educated or intellectual class affiliation.
Something similar is likely happening in the Middle East as well. Bernard Lewis has written that there are two types of countries in the Middle East: those (e.g., Egypt, Saudi Arabia) where the government is allied with America, and the population is virulently anti-American, and those (e.g., Iran, Iraq under Saddam) where the government is virulently anti-American, and the public is enthusiastically pro-US. In other words, Middle Eastern anti-Americanism is, in his view, simply an extension of local dissatisfaction with oppressive national governments that happen to be friendly with America. I'm not sure this analysis is necessarily 100% accurate, but I expect that it's closer to the mark than any correlation between popular sentiment and American ads or Peace Corps volunteers.
Like every other political event, the end of the Cold War took time to sink into the world's consciousness. The slow-but-steady rise in worldwide anti-Americanism over the last decade or so is partly a reflection of--and a reaction to--everyone's gradual realization that America is globally dominant in a way it was not when it was in competition with the Soviet Union. The most powerful country may get the lion's share of the world's respect--but it will inevitably also get its share of the world's resentment.
The upsurge in globalization during the nineties was wonderful for the world's economies. But any such boom inevitably causes rapid changes and dislocations, and thus provokes a serious backlash from those who were harmed--or who simply feel uncomfortable with and disoriented by all the upheaval, however lucrative. Most of the anti-American unrest we see around the world today--from Islamist terrorism to anti-globalist activism in the developed world--is of a romantic, anti-materialist, anti-modernist cast, railing against commerce, technology and luxury rather than tyranny, corruption and impoverishment. America is, of course, the world's primary symbol of the former list of "ills".
I'm sure there are more factors behind the global surge in anti-Americanism than just these three. But lack of cheerleading TV spots or earnest aid programs certainly isn't among them.
Thursday, September 25, 2003
Glenn "Instapundit" Reynolds is leading the charge against media misrepresentation of current conditions in Iraq. I'm not sure why he's harping on this particular issue, given that he doesn't seem to be all that convinced of the accuracy of media representations of current conditions in America, either.
For that matter, it's not even clear that the notion of "current conditions in America" (let alone in Iraq) can be defined, independent of media representations of it. In huge, diverse countries populated by millions of people, it's extraordinarily difficult to characterize "current conditions" in any meaningful way. In modern democratic societies, there are enormous industries--journalism, polling/market research, electoral politics--dedicated to gauging "current conditions" and catering to them for financial or political benefit. The outputs of these industries--press reports, advertising, election results--can then be used to infer a reasonable aggregate picture of the society's beliefs, concerns and interests. But in non-democratic societies, these mechanisms don't work, and any attempt to construct a substitute from the scanty evidence available is doomed to be hopelessly distorted.
Historical descriptions of ancient and medieval societies, for example, tend to concentrate overwhelmingly on a tiny fraction of the population--the ruling elite--whose lives and actions generally had a negligible effect on the vast majority of the population. However, because these groups had control of the only means available for propagating information about their "current conditions", their surviving stories eventually became a proxy for "current conditions" (at the time) in those societies. Something similar may be happening in Iraq today: because the occupying American forces are largely in control of the flow of information in the country, "current conditions" in Iraq are a function of the perceptions and concerns of those troops. Since those troops are naturally highly preoccupied with the rate of guerrilla attacks on their comrades, these events are portrayed in the press as a key criterion for evaluating "current conditions" on the ground--even though most Iraqis probably care little about them.
In time, Iraqi society will no doubt reach the point where its institutions convey a coherent picture of "current conditions" there. (Whether that picture is one of rigid fealty to an absolute ruler, chaos and civil strife, or something more akin to the peaceful freedom of modern democratic states remains to be seen, of course.) Until then, however, complaining that the foreign media's portrayal of "current conditions" in Iraq is inaccurate--when the fragmented, shoestring local media are themselves far from agreeing on one--makes little sense. One would do better to complain about foreign media purporting to portray "current conditions" in Iraq in the first place.
Tuesday, September 23, 2003
Washington Post ombudsman Michael Getler has provided us with some telling insight into the mindset of journalists reporting on the Middle East. In explaining why the term "terrorist" is rarely used when describing so-called "militants" who launch murderous attacks on Israeli civilians, Getler implicitly concedes precisely what he is explicitly trying to deny--that the terminology his newspaper employs is determined by political judgments.
Some of his points are in fact well-taken--for example, that "[t]errorism and terrorist...[l]ike all labels....do not convey much hard information", and are often better replaced with more specific terms. It's also understandable that he'd prefer that his newspaper "not resolve the argument over whether Hamas is a terrorist organization", since "adopting particular language can suggest taking sides" in the political debate over the issue.
Or rather, it would be understandable if the ombudsman considered taking sides on political questions surrounding terrorism to be anathema. But Getler shows no such squeamishness when definitively distinguishing between, say, Al Qaida and Hamas.
Hamas conducts terrorism but also has territorial ambitions, is a nationalist movement and conducts some social work. As far as we know, al Qaeda exists only as a terrorist network. It is composed of radicals from several Islamic countries. The Palestinian resistance is indigenous. Al Qaeda launched a devastating surprise attack on the United States. Israelis and Palestinians have been at war for a long time. Palestinians have been resisting a substantial and, to Palestinians, humiliating, Israeli occupation of the West Bank and Gaza since they were seized in the 1967 war.
Now, Getler is free to believe, along with his employers, that these distinctions are real and meaningful. (Or he could be more like me, and consider these "distinctions" to be false, meaningless rationalizations for a shamefully spineless refusal to condemn terrorism.) But he cannot seriously claim that his position on the question is not political in nature. And indeed, he implicitly admits it:
That resistance has now bred suicide bombers. These are terrorist acts, not to be condoned. But the contexts of the struggle against al Qaeda and the Israeli-Palestinian conflict are different. News organizations should not back away from the word terrorism when it is the proper term. But as a rule, strong, descriptive, factual reporting is better than labels.
In other words, labelling "terrorist acts" is a matter of objective description. But use of the label "terrorism" must take into account "context". (Readers can decide for themselves whether the context to which he refers is planetary, pedagogic, pharmaceutical--or perhaps some other word beginning with "p".)
Why all this bending over backwards to avoid being explicit about political judgments? Well, if Getler were to admit that he and his paper are taking a political position on this issue, then he would have to mount some kind of explicit defense of it. Unfortunately for him, his position--that the history of the region somehow lends a degree of legitimacy to Hamas' ongoing campaign of terrorist murder--is morally, logically and politically indefensible. Clearly, then, it's in his interest to pretend, however implausibly, that his blatantly political plea for consideration of "context" is instead an expression of impartial, objective neutrality.
Wednesday, September 17, 2003
Roger Simon (no relation) and Daniel Drezner have both noted the striking contrast between Christiane Amanpour's and John Burns' criticisms of American press coverage of the Iraq war. Both of these journalists agree that the coverage was compromised by heavy-handed pressure, but Burns points to the former Iraqi regime as the culprit, whereas Amanpour accuses the Bush administration and--believe it or not--Fox News of being responsible for the "muzzled" press in Iraq.
Needless to say, Burns comes off looking far better in this comparison, simply because he's obviously much closer to the mark than Amanpour. For one thing, his claim of journalistic obsequiousness to Saddam Hussein's regime has been documented elsewhere, whereas Amanpour gives no evidence that the White House (let alone Fox News) successfully pressured CNN to change its coverage in any significant way.
But there's a similarity of tone, and even of substance, in these two reporters' somewhat over-the-top remarks that I think deserves more attention. Both see themselves, first and foremost, as deliverers of an important message that's not being heard because of nefarious attempts to suppress it. Both revel in the drama of their own role as speaker of truths that powerful people wish unheard. And both seem to care more about overall themes than about specific facts and events. The two seem to see themselves as, in a word, storytellers, for better or worse, observing a romantic tale in the making before their eyes, and recounting it with flair and passion (not to mention self-flattery).
The traditional model of the journalist--at least in the domestic sphere--is very different: a hard-bitten cynic who believes no one, his job being to uncover the hard, unpleasant facts that everyone would rather not hear. This journalist is neither glamorous nor daring; vaguely despised by all, he roots around among his sources until he uncovers the ugly facts that the journalist's reading public needs to know for its own protection, delivering them with hard, skeptical bluntness.
Of course, the foreign correspondent has always been a far more romantic figure, in the Burns-Amanpour mold. It's worth asking, though, if this ideal is as effective at keeping an audience practically informed as the domestic one. A nose for a thrilling yarn is not, after all, the same as a nose for the pertinent facts on the ground. Perhaps the inevitable price of the occasional dedicated, indefatigable, and (fortunately) essentially accurate John Burns is a profession dominated by preening, melodramatic and deeply confused Christiane Amanpours.
Michael O'Hanlon has produced another in what seems like an endless stream of op-eds arguing for an ambitious approach to North Korea negotiations. The idea is to offer everything to North Korea, but to demand a lot of concessions (or promises, at least) in return. For example, the US could offer full recognition and security guarantees to the North, along with generous food aid and other assistance. Kim Jong-Il and friends would be expected in return to refrain from going nuclear, to cut back the size of their military, to reduce their weapons exports and to begin opening up their country's economy.
Sounds like a great idea--all that's missing is the right venue for the talks. I propose Oslo.
Thursday, September 11, 2003
The Polish science fiction writer Stanislaw Lem once wrote a story about a planet where it is decided that all the inhabitants shall live and breathe underwater. The tale is an obvious satire of Communist utopianism, but its crucial lesson--that deciding on idealized ends, irrespective of either the practicality or the morality of the means to them, is a sure path to disaster--has unfortunately never really been absorbed by Western intellectuals. On the contrary, recent political philosophy has been dominated by discussions roughly as absurd as whether we should all really be breathing underwater.
Consider, for example, distributive-justice.com, a Website (independently endorsed by two different members of the "Crooked Timber" collective) devoted to cataloguing and explaining the prominent schools of thought on the question of "distributive justice". This question was popularized by the late John Rawls, a widely revered philosopher most famous for positing the following thought experiment: imagine that you are permitted to design, top to bottom, the rules of operation for a society, with the proviso that you would then be placed in that society, in a "position" (role, social status, economic status, etc.) as yet unknown to you, and not of your choosing. How would you decide, for example, to order the allocation of wealth? Of honor? Of power? Rawls argues that the best strategy in this experiment would be to design something like a modern egalitarian welfare state, with a generous safety net to guard against the possibility of being cast in the role of indigent. Others, of course, have proposed alternative strategies, and distributive-justice.com outlines a few.
Well, political philosophers may love this type of question, but to me it's of a piece with Lem's characters' pondering what they really should all be breathing. After all, nobody in real life is in a position to order a society per Rawls' experiment, and any order proposed under its conditions is thus a "pure end", blissfully disconnected from any means that might achieve it. Unanswered are such questions as, "how much change has to be imposed upon the current society to reach the desired one?" "What will be the practical effects on economic prosperity, political order, or social peace?" "How much suffering will result from the transition?" And, of course, my perennial favorite: "will the new order be imposed forcibly by a dictator, stealthily by a Platonic oligarchy, or democratically by a supportive populace?" Discussing the morality or practicality of one distributive end or another without considering these questions about the morality and practicality of the means is, in my opinion, mere idle game-playing, offering no useful moral insight whatsoever.
It is often said that "the ends justify (or do not justify) the means". In fact, neither statement is true. Ends may or may not justify the means, but more importantly, ends and means simply cannot be teased apart and dealt with separately in evaluating the morality of the combination. And it's not as if the folks at "Crooked Timber" are unaware of this principle--the well-known ethical exercises referred to as "trolley problems", discussed there, illustrate it perfectly. Somehow, though, the temptation to imagine a world of ends freed from the chains of their means always seems too strong for philosophers to resist.
Friday, September 05, 2003
In an old joke about the French, a trio of little boys is ambling through Paris when they spy an amorous couple through an open bedroom window. "Look," says the six-year-old, "they're fighting!" "Non," replies the eight-year-old, "they are making love!" "Oui," concludes the ten-year-old, "and rather badly."
I am reminded of this joke by the controversy over Frederic Beigbeder's new novel, "Windows on the World", which imagines the fates of the diners at the restaurant of the same name atop the World Trade Center on the morning of September 11th, 2001. In one widely-quoted passage, a collection of affluent, materialistic Americans, identified only by their designer attire, spend their last moments discussing their cars, homes and investments before losing themselves in a frenzy of sexual coupling as the flames rise around them.
In other words, Beigbeder paints these Americans as embodying the very crudest stereotypes of....the French. They revel in fine luxuries. They pay close attention to fashion. They value self-interested pragmatism before moral principle. And they embrace a sophisticated, carefree sexual hedonism. One would expect any Parisian to feel a warm glow of fellow-feeling when reading this description of Americans by a Frenchman. Why, then, would anyone interpret it as critical of--let alone insulting to--Americans?
The problem, it seems, is that the Americans in the passage are portrayed as French without the style. Their fashions are mass-market American fashions like Kenneth Cole and Ralph Lauren. Their luxuries are modern and unsophisticated: a Porsche, a villa in Hawaii, a health spa membership. Their venality is in the service of increased wealth rather than elevated social status. And they kiss "comme dans un bon porno californien".
Worst of all, they don't spend a single moment sneering contemptuously at those they deem culturally beneath them. For that alone, the Americans in the novel (and in real life) have clearly failed to live up to the Gallic standard, and have thus earned an eternity of callous French ridicule. Rather badly, indeed.
Wednesday, September 03, 2003
In the course of his New York Times op-ed piece, former oil executive and self-proclaimed Mideast pundit Donald Hepburn inadvertently makes an important observation about the current situation in Iraq:
Iraq will need long-term loans from the World Bank, the United Nations Iraq Development Fund, the British Foreign and Commonwealth Office, the Arab Development Fund, the European Union Aid Program and others. Yet few of these organizations will be keen to make loans until Iraq has a new constitution and an elected government that has put in place effective legal, arbitration, banking and fiscal systems.
Now, where did these organizations get the idea that a country is an unfit borrower without all the fancy trappings of modern Western governments? In fact, these bodies lend buckets of money all the time to corrupt autocrats and brutal juntas running their countries ruthlessly into the ground. Far from being concerned about democratic legitimacy or well-structured institutions of governance, these lenders have only ever cared about the likelihood that if they sink a bucket of their cash into somebody's coffers, they will be paid back (more or less) on time.
And that's the problem--in order to have that confidence, the lenders need to see somebody at the top who is (a) willing to commit to repayment, (b) likely to stick around when the due date arrives, and (c) enticed by an incentive (like the prospect of further loans) to live up to the commitment. Right now, in Iraq, there simply isn't anyone who meets these criteria, and there likely won't be until the country has a new constitution and an elected government etc. etc. Why? Because the US has promised all this, and now cannot establish a plausible long-term governing authority based on anything less.
In other words, it is not so much the absence of stable democratic rule in Iraq as its (unfulfilled) promise that's holding up the capital flow that could bankroll the country's reconstruction. Perhaps, then, the chorus of perfectionists who consider anything short of Switzerland-on-the-Euphrates a disastrous failure on the part of the American occupiers should consider the consequences of their position for the ordinary Iraqi, desperately hoping for some kind of effective government to end the chaos and begin the rebuilding in earnest.
Not that I wouldn't love to see Iraq join the club of vibrantly open, democratic societies, of course. But between that depressingly distant ideal outcome and the nightmare of Saddam Hussein, surely there's a lot of room for pragmatic compromise. And it's worth asking (as I already have, in fact) how many American--and Iraqi--lives are worth sacrificing for the sake of making "perfect" the enemy of "good enough".
Sunday, August 31, 2003
What's most shocking about Barak Barfi's Washington Post op-ed about recently-killed high-ranking Hamas official Ismail Abu Shanab is not that it describes him as "A True Palestinian Pragmatist", who "charted a middle path" and "hedged his bets". Nor is it that the article contrasts Palestinian Prime Minister Mahmoud Abbas' "one-sided conciliatory approach" unfavorably with that of "true pragmatists" like Abu Shanab, who by "threatening Israel....gain legitimacy among their constituency". Nor is it that the author is a visiting research fellow at the Hebrew University of Jerusalem, and a part-time producer for ABC News, and that his portrayal of a leader of an Islamic fundamentalist terrorist group like Hamas as a "moderate" is given a respectable hearing in a major American newspaper.
No, what's most shocking about the column is that its main claim is objectively true. Abu Shanab, a leader in an organization dedicated to destroying Israel through terrorism, really was a moderate by Palestinian standards. He "often spoke of how the Zionist lobby controlled the United States"--but (probably) never stooped to spreading some of the wilder blood libels about Jews often propagated in the official Palestinian media. Although he embraced terrorism against Jewish civilians in Israel as a legitimate tactic, "[h]e tried to avoid praising suicide bombings and had difficulty justifying them." He "often said--albeit in a circumlocutory manner--that if the Israelis retreated to the June 4, 1967, lines, withdrew from East Jerusalem and allowed refugees from the 1948 war to return, peace would be possible." (That is, he demanded that Israel cease to be a majority-Jewish state, but did not call for the death or expulsion of all Jews from the region, as other Palestinian leaders have.)
True, he "did not have the courage and conviction to wholeheartedly denounce violence or stand up for peace"--but then, what Palestinian leader today would dare espouse such extremist views?
Thursday, August 28, 2003
Readers who are familiar with my views on capital punishment, rough police conduct, and even (in extreme circumstances) torture, might expect my reaction to the death of former Catholic priest and convicted pedophile John Geoghan to resemble that of PMStyle's "Mr. PMS" rather than that of Ted Conover. The former (who appears not to support the death penalty, incidentally) wishes that the offending priest had instead been locked in a cell for the rest of his natural life with a burly prison rapist, while the latter is appalled that the prison system once again turned a blind eye to violence perpetrated against a sexual offender in prison.
In fact, I find Conover's point of view far more convincing. What disturbs me most about "Mr. PMS"' wish is that his willingness to impose a brutal punishment is utterly unaccompanied by a willingness to accept responsibility for it. As an opponent of capital punishment, he would surely be no readier to see the state hire an official state rapist than an official state executioner. And yet he's perfectly willing to have the state forcibly hand a prisoner over to a fellow criminal willing to do the same job on an unofficial basis.
Now, as my other stated views demonstrate, I am not necessarily opposed to the imposition of harsh punishments upon criminals. However, I believe that such punishments must be imposed directly, intentionally, and with a clearly understood purpose--not as mere revenge--and that society should be willing to bear the burden of responsibility for imposing them. For example, I believe that capital punishment, in the case of a certain class of murders, serves as both a powerful deterrent and an expression of the seriousness with which society reviles the crime of murder. And the use of stun belts to incapacitate unruly prisoners (another recently raised issue) strikes me--pace Jonathan Turley--as a legitimate, if unappetizing, means to effect the reasonable goal of preventing courtroom disruptions.
On the other hand, I'm not convinced that the threat of prison rape is much of a deterrent to sexual deviancy (or any other crime, for that matter). And I certainly don't believe that dodging responsibility for punishing criminals expresses society's seriousness about combatting crime.
If we believe that prison without inmate-on-inmate brutality is too cushy to be an effective deterrent (and in some cases, I suspect it might be), then I heartily encourage discussion of how to make it more unpleasant for convicts. However, effectively encouraging them to rape and murder each other would be very, very low down on my list of explicit techniques with which to engineer that outcome, and I doubt that many other people's lists would rank it much higher. Yet it seems to be the default technique of choice--mostly because society hasn't yet properly come to grips with the necessary ugliness that is punishment, and prefers to embrace a convenient psychological escape rather than face some harsh truths about controlling human evil.
Sunday, August 24, 2003
Not too long ago, it would have seemed highly improbable for a group blog of leftish transatlantic academics to angrily denounce as "odious" the claim that suicide bombings by Palestinian terrorists are justified. Indeed, a little more than a year ago, I remarked on the disturbingly widespread popularity, in certain circles (the pages of The Times and the British Prime Minister's family, to name two), of moral arguments in defense of suicide bombings.
Yet here is Crooked Timber's Chris Bertram passing just such a judgment on British philosopher Ted Honderich, whose "After the Terror" asserts that "probably a majority of humans who are half-informed or better, now at least find it difficult to deny" that "[s]uicide bombings by the Palestinians are right." Moreover, judging by the comments generated in response to Bertram's posting, moral embrace of Palestinian terrorism--though it has by no means disappeared as a position--appears by now to have sunk in popularity to the level where its opponents can forcefully condemn it with dismissive confidence.
It's hard to pinpoint any particular event that might have precipitated such a shift in educated opinion over the past year or so. In fact, I would argue that nothing of significance "on the ground" has changed between then and now. Rather, what we are observing is a kind of cumulative delayed effect from the Palestinian rejection of the Oslo accord, the subsequent three-year terrorism campaign, and Israel's (eventual) vigorous response.
Politically engaged people often find it difficult to abandon a political allegiance--whether to Soviet Marxism, Southern segregationism, or Palestinian nationalism--all at once, as soon as its moral credibility falls under suspicion. Rather, as developments render a particular political position more and more untenable, individual adherents tend at first to redouble their efforts to reconcile fealty to their cause with embrace of "mainstream" opinions. For example, few idealistic Soviet sympathizers heard about the show trials, or the Nazi-Soviet non-aggression pact, or Khrushchev's "de-Stalinization" speech, or the invasion of Hungary or of Czechoslovakia, and immediately lost faith in a moment of sudden clarity. Rather, each of these events would have provoked at least some amount of self-doubt in many true believers, which was either eventually satisfactorily resolved or else inspired the slow development of a grudging disillusionment that ultimately led to a decisive break.
One characteristic of this process is increasing polarization, as the dissonance between loyalty to the cause and common sense or common decency grows sharper. Thus, those who continue to adhere to their political alignment are forced to grow, if anything, more extreme in their conviction, while those who defect often become vigorous critics of their former comrades. Perhaps that helps explain the bizarre moral obtuseness of a Ted Honderich, Matthew Parris or Cherie Blair.
It's also worth noting that Israelis themselves hardly abandoned faith in Oslo as soon as the violence broke out in September 2000. On the contrary, it was another year and a half--and hundreds of bloody deaths of terrorists' victims--later before the internal political consensus allowed Prime Minister Sharon to launch a serious military effort against the terrorist organizations. It's not surprising, then, that Western opinion is lagging behind Israel's in recognizing the ugly implications of the past three years of Palestinian terrorism.
Thursday, August 21, 2003
Just for the hell of it, you might want to consider asking your doctor if clomipramine is right for you.
Wednesday, August 20, 2003
It seems everyone--okay, Glenn "Instapundit" Reynolds, Oxblog's Josh Chafetz and Daniel Drezner, at least--is thrilled about The Guardian's new blog-based campaign to eliminate all agricultural subsidies. The left likes the idea because it helps third-world farmers against rich agribusiness conglomerates. Conservatives and libertarians like it because it creates a freer market in food. What's not to like?
Well, I certainly believe that first-world agricultural subsidies are excessive, but it's worth recalling why those subsidies are there in the first place. In effect, a bit of each person's food bill goes into an insurance fund that underwrites a massive oversupply of food. Food production, after all, is subject to the vagaries of the weather, and by paying for surpluses in times of plenty, we guarantee that the supply will be adequate even in the event of a farming catastrophe. Avoiding famine--with its attendant social disruption, not to mention widespread discomfort--is, I would think, well worth the price of a bit of subsidized overproduction.
The real problem with the current subsidy system is that it was founded at a time when international trade was far less voluminous and reliable than it is today. As a result, it tends to guarantee an agricultural surplus in each individual country separately, rather than simply ensuring that there will be a dependable global oversupply of food. It would make much more sense for the world's major industrialized food producers to ratchet down their subsidies to a point where global surpluses are still guaranteed, even if any one particular country may very occasionally become a net food importer. After all, in the event of a disastrous domestic crop failure in, say, France, the French would still be able to purchase enough food on a glutted world market to sustain themselves, barring some kind of collapse of international trade.
Unfortunately, third-world farmers benefit less than one might hope from such a globalized surplus regime, because their output is too unreliable to form a significant portion of the insured production quota. Or, to put it another way, it'd be nice to be able to steer business towards poor farmers in developing countries--but not if by doing so, we ended up actually relying on them for our food supply.
Still, their production ought to count for something--and that's better for them than the current system, in which most countries act as if domestic supplies (not to mention domestic political pressures) are all that matter.
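As a back-of-the-envelope check on the pooling argument above, here is a minimal Monte Carlo sketch in Python, built entirely on invented numbers (ten identical producer countries, normally distributed harvests, a tolerated one-percent risk of running short in any given year). It is not a model of real agricultural markets; it only illustrates the standard diversification effect, namely that a single shared global reserve can be far smaller than the sum of separate national reserves offering each country the same protection.

# A rough Monte Carlo sketch (hypothetical numbers, not real agricultural data)
# illustrating the pooling argument above: a shared global reserve sized to
# cover bad years can be much smaller than the sum of separate national
# reserves, because a poor harvest in one country is usually offset by
# ordinary harvests elsewhere.
import random

random.seed(0)

N_COUNTRIES = 10      # assumed number of major producer countries
MEAN_YIELD = 100.0    # a country's average harvest, in arbitrary units
DEMAND = 95.0         # a country's annual consumption
VOLATILITY = 15.0     # standard deviation of a single country's harvest
TRIALS = 100_000
RISK = 0.01           # tolerated chance of running short in a given year

def reserve_needed(shortfalls, risk=RISK):
    # Smallest stockpile that covers the shortfall in all but `risk` of years.
    shortfalls = sorted(shortfalls)
    return shortfalls[int((1 - risk) * len(shortfalls))]

single_country = []   # one country's shortfall, year by year
pooled_world = []     # shortfall of the combined world harvest, year by year

for _ in range(TRIALS):
    yields = [random.gauss(MEAN_YIELD, VOLATILITY) for _ in range(N_COUNTRIES)]
    single_country.append(max(DEMAND - yields[0], 0.0))
    pooled_world.append(max(N_COUNTRIES * DEMAND - sum(yields), 0.0))

per_country = reserve_needed(single_country)
print(f"separate national reserves: {per_country:.1f} each, "
      f"{per_country * N_COUNTRIES:.1f} in total")
print(f"single pooled global reserve: {reserve_needed(pooled_world):.1f}")

On these made-up figures, the separate national buffers add up to several times the size of the single pooled one--which is, in a nutshell, the case for sizing subsidies against a global rather than a national surplus.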
Thursday, August 14, 2003
A Florida millionaire has apparently just been discovered to have led a "double life" for nearly 30 years, maintaining two simultaneous homes, marriages and families 20 miles apart in the Tampa area. The oddest part: the man's two lives were remarkably similar, each with a lavish suburban home, an active society wife, children--and, of course, many "business-related" absences.
From the look of things, I'd say this fellow's just a bit unclear on the whole "double life" concept. Aldrich Ames led a double life. Clark Kent would be described as leading a double life. But what's the point of leading a double life, if both lives are going to be pretty much identical? Heck, maybe I can claim to lead a double life--it's just that my two identities happen to have the same name, live in the same modest apartment, lead the same quiet life, and hold down the same job.
Moreover, unlike this slipshod operator in Tampa, nobody will ever discover my carefully-concealed second life.
Tuesday, August 12, 2003
Matthew Yglesias provides yet another demonstration of the colossally messed-up state of the Great American Debate on Race. At issue is California's Proposition 54, a ballot initiative that would bar the government from collecting data that classifies individuals on the basis of race or ethnicity. Yglesias is vehemently opposed to the measure, describing it as a "really and truly awful idea. Really, really awful. ... denying the government this data is going to make it totally impossible to do anything about discrimination or even to know whether or not it's taking place."
Now, it's hard to argue with the logic of Yglesias' argument. Race-based decisionmaking of any kind has been forbidden within the government of California since the passage of Proposition 209, but it's widely assumed that various arms of government are still surreptitiously applying racial preferences in areas such as hiring, college admissions and contracting. Without careful data-gathering, though, it will be extremely difficult to detect and expose such covert, illegal discrimination by government employees.
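Purely to make concrete why the raw data matters, here is a minimal sketch, using invented admissions figures, of the simplest disparity check an auditor might run once applicants are classified by group: a two-proportion z-test on admission rates. Real audits control for many confounding factors; the point is only that without classified counts there is nothing to test at all.

# A minimal sketch (with invented numbers) of the kind of check that becomes
# impossible without race-classified records: given applicant and admit counts
# by group, a simple two-proportion z-test flags admission rates that differ
# by more than chance alone would explain.
import math

def two_proportion_z(successes_a, total_a, successes_b, total_b):
    # z statistic for the difference between two observed proportions.
    p_a = successes_a / total_a
    p_b = successes_b / total_b
    pooled = (successes_a + successes_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Hypothetical admissions figures for two groups of applicants.
z = two_proportion_z(successes_a=330, total_a=1000,   # group A: 33% admitted
                     successes_b=250, total_b=1000)   # group B: 25% admitted
print(f"z = {z:.2f}")  # |z| above roughly 2 suggests the gap is unlikely to be chance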
Ready for the crazy part? It's the opponents of racial preferences who support Proposition 54, fearing that the data will be used to impose such preferences, in direct contravention of California law. And it's the supporters of preferences who want to see the data gathered--even though it may help identify and root out those preferences that are still being implemented.
Commenters are encouraged to offer plausible explanations for this utterly bizarre state of affairs.
Monday, August 11, 2003
Eric Rescorla wonders why municipal workmen have been digging up his street (which has no noticeable need for repairs) every two weeks all summer.
It's a little-known fact that many of the engineers responsible for street design in major metropolitan areas got their start as "serious" artists before drifting into a more practical profession. As a result, they tend to be somewhat temperamental and perfectionistic, imagining each street as a unique masterpiece in progress, with its own distinct character and beauty. And, of course, they're constantly spotting slight changes or improvements that they'd like to make to each one, to "perfect" it.
Sometimes, they change their minds at the last moment, and cancel the "repair"; on other occasions, they reverse themselves afterwards, decide that the last alteration would have been better left undone, and schedule work to restore it. There's a major street not far from where I work that some sensitive soul has been continually "putting the finishing touches on" for years now. To the less aesthetically aware among us, it was just fine long ago--but I guess that's why he's an artist, and we're not. I just hope he's satisfied with it soon.
Saturday, August 09, 2003
Israeli blogger Shai discusses a recent Israeli television documentary about second- and third-generation descendants of Holocaust survivors, now living in Israel, applying for German passports as a hedge against worsening conditions in the Middle East. Apparently, some Israelis view this embrace of Germany as a shameful betrayal of Zionism and the memory of the Holocaust. Others consider it admirably "normal" and "post-Zionist", and see no reason why an Israeli should hesitate to obtain a German passport--any more than would, say, any other citizen of the EU (whose passport is already a de facto German passport, and who probably can also point to his or her country's mistreatment at the hands of Germany during World War II).
Surprisingly, Shai doesn't mention a plausible third view of the act: as a perfectly rational, historically informed reaction to the Holocaust. For if the traditional Zionist lesson is that the Jewish people need a politically independent homeland to survive, an understandable alternative lesson is that the Jewish people, forever on the precarious brink of annihilation, cannot afford to turn their backs on any potential source of sanctuary, however improbable. Today's Israelis may well believe that "it can't happen here"--that the nation they call their own will defend them. But then, so did the Jews of prewar Germany.
It is precisely this idea, I suspect, that some Israelis find threatening and disloyal to Zionism. For it undermines the aforementioned Zionist principle that it is only a Jewish state that can protect the Jewish people. It also implies a weakening of Israelis' resolve to hold onto their country to the bitter end, and thus might perhaps encourage Palestinian terrorists to believe that their murderous deeds are having the desired effect: demoralizing Israelis to the point of military collapse and mass population flight.
But the Jews, of all people, should have learned by now the necessity of not averting one's eyes from even the harshest, most painful of possible outcomes. Israelis who secure a secondary haven for themselves in Germany should their own home country become unsafe are still displaying far more national loyalty, after all, than the thousands who actually move abroad, to the US or elsewhere. And if--heaven forbid--the worst should happen, the very survival of the Jewish people could one day rest on the shoulders of those Jews who sought shelter in seemingly unlikely places.