Michael Kinsley's analysis of the philosophical problems of 9/11 victim compensation gets one thing right, at least: as a mechanism for caring for the unfortunate, the American system of compensation-by-tort-lawsuit is an absurd travesty. If one were to design a national scheme for mitigating the suffering caused by accidents, injuries and illnesses, one might choose to base it primarily on private insurance, government payments, employer benefits, charitable institutions, or any combination of these or other approaches. But no sane person would ever dream of suggesting that a reasonable way to provide for a helpless victim is to have a lawyer first find some wealthy third party at least vaguely associated with (though hardly culpable for) the victim's helplessness, and then convince twelve other people that said wealthy party should bear the costs of caring for the helpless victim--tacking on another fifty percent for the lawyer. Nearly any other arrangement would be preferable.
In evaluating alternatives, however, Kinsley goes very, very wrong. His mistake begins with the very title of his essay, in which he uses the word "justice" in the context of the suffering that life inevitably metes out to those of us both lucky and unlucky enough to be born. To Kinsley, "justice in specific"--say, charitable donations to help the families of 9/11 victims--is a weak substitute for "justice in general", such as universal health care, or even the aforementioned "deep pockets" lawsuits. "Would you voluntarily exchange your beloved spouse for a $2 million check?", he asks. "If your answer is no," he continues, "then the widow is still undercompensated for her loss (and McDonald's, although blameless, needs the $2 million less than she probably does)." A previous era's pundit might have put it a little more succinctly: "From each according to his ability, to each according to his need."
Now, I do not believe (*pace* the capitalist ideologues) that redistributionism is fundamentally immoral; when we as a society manage to generate enough wealth that we can consider diverting some of it towards assisting the unfortunate, we are generally (and rightly) pleased to do so. But the notion of entitlement--that the institutionalized automatic satiation of certain needs is simple justice--rests on a deeply flawed premise: that human needs are of fixed dimension, and can be identified and "taken care of" in turn by a well-run society. In reality, the needs of humankind are infinite and eternal, and there is no hope of ever meeting any of them in full.
Moreover, each time it has been assumed that we can, the result has been nothing short of disastrous. One need not invoke the totalitarian horrors of (supposedly) equality-enforcing communist regimes to observe the pernicious effects of entitlement-based thinking; the spectacular success of the 1996 welfare reform bill in the U.S. (and the decay that preceded it) should be evidence enough that the ethic of assistance outperforms the ethic of entitlement even in a free, prosperous society. In a world of finite need, the declaration of a particular material appetite as a "right" would "solve the problem" of sating it; but in our world of limitless need, as we have seen, entitlement merely whets appetites further, while reducing incentives to work towards satisfying them.
The events of September 11th triggered a magnificent outpouring of generosity--both personal and collective, i.e., political--among Americans that will ultimately provide assistance to many thousands of survivors of that day's tragedy. Those survivors will certainly not be made whole, nor (as Kinsley points out) will the millions whose suffering happens to have nothing to do with terrorism. But a lot of good will still have been done, and a lot of people will be grateful for it. For people like Kinsley who believe that the only legitimate social goal is the erasure of general wants, this easing of the particular pain of a particular few is an unconscionable unfairness to the rest. But to those of us who know that sorrow eased by charity is a blessing that will always be dwarfed by sorrow unrelieved, Kinsley's pooh-poohing of what he calls "justice in specific" is the very worst kind of ungenerosity.
Friday, March 29, 2002
Wednesday, March 27, 2002
Instapundit (and law professor) Glenn Reynolds is upset that a female law student at the University of Virginia is apparently suing her male torts prof over a touch on the shoulder (a "caress", she claims) which supposedly caused her enormous emotional distress. According to Reynolds, "The 'touch' was part of teaching an old chestnut of a case called Vosburg v. Putney, in which an elementary student kicks another, freakishly inflicts serious injury, and is sued for the severe consequences of a minor wrongful act."
Now, it's perfectly understandable that law professors would be enraged at her cheek; you can practically hear them thinking to themselves, "no, no, no--we're teaching you to do that to other people, for heaven's sake, not to us!". Reynolds advocates "a scorched-earth response to this frivolous lawsuit, ensuring that the student and her lawyer would deeply regret instituting it", and wonders if "law professors around the country will band together and file a class-action suit against the plaintiff and her lawyer, arguing that this frivolous suit endangers the atmosphere in classrooms across America". Boy, that'll show those disrespectful youngsters they still have a thing or two to learn from the old pros!
Personally, I think that Reynolds and his colleagues, in their defensiveness, are missing out on a spectacular pedagogic opportunity. The law is, after all, a practical profession. Doctors train in part by doing; why not lawyers as well? Here's a concrete proposal: On the first day of torts class, every law student in America should be told, "you must pass 'torts' to get a law degree. And we intend to fail every last one of you, unless it would be illegal or financially ruinous for us to do so. One more thing: it is forbidden for any classmate or member of the bar to help you pass." The failing grade in torts would stand for each student until that student, by his or her own legal efforts, managed to compel the university, through the courts, to change it.
Not every student will have the necessary creativity, gumption and legal ability to sue his or her way into a passing grade, of course. But then, not everyone has what it takes to be a lawyer....
Tuesday, March 19, 2002
One thing is clear from Dahlia Lithwick's recent little tirade in Slate: she really, really doesn't like drug-testing in high schools. She's entitled to her opinion, of course; but as a lawyer writing about a Supreme Court hearing, she should at least try not to let her snide interjections interfere with a clearheaded discussion of the relevant legal issues.
For example, it is self-evidently not true that, as she puts it, "[t]he high court wants your urine, and they have the power to take it." On the contrary, various boards of education have decided to conduct drug tests on their extracurricular-activity-bound high-school students, and it is not the Supreme Court's place to second-guess the drug-testing policies of these democratically elected governmental bodies--unless, of course, they violate the Constitution. The question at issue, therefore, isn't how much Dahlia Lithwick or the ACLU or anybody else hates drug-testing, but rather whether the Constitution (specifically, the Fourth Amendment) prohibits mandatory urine tests of these high-school students as "unreasonable searches and seizures". And there is one word that never appears in Ms. Lithwick's screed that, had she mentioned it, would have exposed the hypocrisy of the Fourth Amendment case against drug-testing (and, indeed, of most modern Fourth Amendment jurisprudence). That word is "steroids".
Urine tests to detect steroids are commonplace in sporting events, including many high-school competitions. As far as I know, no specifically steroids-targeted testing program has ever even been challenged in court on Fourth Amendment grounds--most probably for the simple reason that the legal establishment happens to agree with the educational establishment's opposition to steroid use, whereas decriminalizing drug use is something of a cause célèbre among civil liberties-minded lawyers. (The brief for the anti-testing side in some of the relevant cases explicitly points out that steroids were not among the drugs tested for--presumably because including steroids would have made the tests more "legitimate".) As a matter of Constitutional principle, however, a urine test for steroids is indistinguishable from a urine test for other drugs; viewed strictly as "searches and seizures," they are identical. A distinction between the two can only be based on arguments about the relative importance to schools (or to society at large) of detecting and discouraging steroid use, compared to, say, marijuana use, among high-schoolers.
Now, reasonable people can certainly differ on this issue, as well as on other governmental search-related matters (such as whether drunk driving is enough of a threat to merit random breathalyzer tests, or--to pick a topical example--whether suicide hijackings are enough of a threat to justify mandatory highly intrusive luggage and body searches at airports). But whatever one's opinion in each case, these are all clearly proper questions for public policy debate, subject to individual views and preferences regarding societal values and priorities. And in any sane democracy, they would be decided democratically, by legislators and executives accountable and responsible to the people (Ms. Lithwick and her lawyer friends no more or less than anyone else)--not by the arbitrary whims of nine lifetime-appointed judges engaging in elaborately creative exegeses of Constitutional scripture.
But I keep forgetting: this is America.
Monday, March 18, 2002
Could spam email be at the root of Muslim anger toward America? Think about it--a steady stream of annoying offers of pornography, herbal remedies, and get-rich-quick schemes from merciless American marketers could easily have driven harried Middle Eastern email users such as Al Qaida (we know they used email routinely) to launch their terrorist campaign against the U.S. President Bush needs to act now to eliminate this source of international tension before the "Arab street" finally explodes in rage over America's spamming arrogance, with deadly consequences for the entire civilized world.
Okay, I admit it--spam may bother me a lot, but it probably doesn't weigh heavily on the minds of Osama bin Laden's followers. Projecting one's own gripes onto wild-eyed Islamic fanatics is a popular game these days, though. Take Thomas Friedman's latest proposal for addressing the "why do they hate us?" issue. "Since Sept. 11," writes Friedman, "President Bush has often noted that the world has fundamentally changed. Yet, time after time, he has exploited the shock of Sept. 11 to argue why his same old, pre-Sept. 11 policies were still the only way to proceed — only more so." Friedman, a much wiser man than Bush, naturally argues instead for his same old, pre-Sept. 11 policies: a new "Marshall Plan"-style foreign aid program, vigorous promotion of free trade, pressure on allies to enact democratic reforms, and ratification of the Kyoto environmental protocols.
Now, whatever one's opinion about, say, the supposed urgency of reducing greenhouse gas emissions, the notion that doing so might have any palliative effect on the anti-Americanism of fanatical Muslims, or of their more moderate brethren, or of anybody outside Thomas Friedman's circle of globetrotting chatterers (and even they are far from a sure bet) is pure fantasy. Muslim nations are for the most part poor, undemocratic breeding grounds for Islamic militancy, to be sure, but neither free trade nor foreign aid nor even encouragement of democracy will have much more than a cosmetic effect on the mammoth complex of thorny problems bedevilling these places. On the other hand, some of the poorest, most benighted, most dictatorially brutal places on earth pose not the slightest threat to the safety of Americans at home. A sensible anti-terrorism policy would focus on what makes Egypt and Saudi Arabia more dangerous to America than, say, Congo or Myanmar (maybe it's the carefully controlled, virulently anti-American press, for instance? The fanatically anti-Western clergy and their power over the education system?), not on what makes them less livable than, say, Switzerland.
Of course, if you're a self-absorbed globetrotting journalist who has to drag yourself to visit the world's hotspots, hobnobbing with the local chattering classes, then the latter question is obviously a much more pressing concern.....
Saturday, March 16, 2002
How do diehard opponents of Israel maintain sympathy for the Palestinian cause, now that Palestinians en masse have rejected Israel's offer of peace, independence and statehood (at Camp David in 2000) in favor of a brutal 18-month campaign of anti-Jewish terrorism? One popular reality-denying rhetorical technique is to invoke the concept of "humiliation". Unlike other grievances, such as civilian death tolls, political freedoms, or publicly stated objectives, humiliation is not objectively measurable, and can therefore be assigned arbitrary strengths, causes and consequences. Before Camp David, for instance, one might have expected the Israeli offer of Palestinian statehood on 90+% of the occupied territories to have been welcomed by a people chafing under occupation (and its rejection therefore to be proof positive that the PA is at war with Israel, not with its occupation). These days, however, pro-Palestinian commentators refer routinely to the "humiliating" offer at Camp David--thus reconciling the supposed anti-occupation motivation for its rejection with facts that plainly contradict it. (The Camp David offer gave them virtually everything they wanted, goes the argument--but the few tiny flaws in it were so "humiliating" that even so, they couldn't accept it, and that's why they're behaving for all the world as though they never wanted to make peace with Israel in the first place.)
Consider, for example, a New York Times editorial about the recent Israeli military incursion into towns and refugee camps in the West Bank and the Gaza strip. The editorial asserts that "this kind of extended military operation is unacceptable", and that "Israel must cut way back on its use of force", but its only concrete objections to the operation involve the treatment of detainees (a minor postscript to the entire operation), and the fact that "hard-core terrorists from Hamas and other groups appear to have slipped away" (a fact that presumably argues for a bigger military operation, not a smaller one, if the terrorists are to be trapped and apprehended next time). The editorial writer's real fear, of course, is that Israel might "deepen the level of Palestinian anger", thus jeopardizing "a future accord based on peaceful coexistence between the two peoples". The source of this anger? The "humiliation" imposed on the Palestinians by Israeli counterterrorist actions. (The editorial refers to it no fewer than three times.)
One might think that PA repression and corruption would be appallingly humiliating to Palestinians, along with joblessness, lack of economic opportunity, notorious terrorist groups massacring civilians in their name, or dozens of other burdens that can't be traced to Israel at all. But if the New York Times wants to believe that Israel's occupation of the West Bank and Gaza is the real source of Palestinian humiliation, and hence of Palestinian terrorism, then it can easily portray the entire sordid record of Palestinian rejectionism, in word and deed, as a story of innocent victimhood--whatever the facts.
Palestinians (including so-called "moderates" such as the late Faisal Husseini) calling for the elimination of Israel? Just rhetorical retaliation to the humiliation of the occupation. Terrorists, with the full blessing of both the PA and the populace, murdering Jewish women and children at every opportunity? What can you expect, given the humiliation of the occupation? The PA smuggling long-range weapons useful only for attacking across a defended border? Well, a demilitarized state would be humiliating. The PA reneging on every negotiated agreement, and refusing to accept a generous final settlement? It was a humiliating offer, and after years of humiliating occupation, Palestinians were too bitter to accept a peaceful end to it. Humiliation explains all, excuses all, rationalizes all. It's an explanation that's fully consistent with any reality--and thus perfect for those who wish to ignore reality altogether.
Wednesday, March 13, 2002
Why are women like Andrea Yates who kill their children often assumed to be mentally ill, while men like Adair Garcia who do the same are perceived as merely evil? The traditional answer, of course, is that the maternal bond is so powerful that only insanity can overcome it, whereas paternal love is weaker and more fragile, and can be countered by mere rage or malice. Slate's Dahlia Lithwick, however, has an alternate explanation: "Men are disproportionately jailed for filicide not because they are more evil than women but because we believe they have harmed a woman's property—as opposed to their own." Hence her political program: "It would, of course, help if we could stop thinking of children as anyone's property....It will do a good deal to advance the cause of children's rights if we begin to consider them as legal entities in and of themselves."
Ms. Lithwick is hardly the first to disparage a traditional virtue like maternal protectiveness by redefining it as an expression of ownership. Back in the 1960's, feminists such as Kate Millet began to deride the male ideal of duty and commitment to family as merely one component of an oppressive system of "ownership" of women and children. Previously, male identity had been so strongly associated with the role of provider that husbands and fathers would endure incredible hardship rather than admit (say, by allowing their wives to work outside the home) that they had failed in their responsibilities. (It seems inconceivable in this era of absent fathers and hard-pressed working mothers, but the architects of New Deal poverty relief efforts concentrated on jobs programs, rather than welfare payments, in part because a great many husbands and fathers would simply have refused a government "handout" as an insult to their status as family breadwinner.) But where some saw admirable commitment and solicitude, Millet and her cohort saw only tyranny. By placing the burden of financial support for the family firmly on the father's shoulders, the early feminists argued, society was in fact treating mothers and children as powerless subordinates unable (and hence unentitled) to fend for themselves.
At around the same time, the widespread availability of the birth control pill began to erode the connection between sex and the obligations of parenthood. Seizing the opportunity, advocates of sexual libertinism, including feminists such as Germaine Greer, began portraying the traditional obligations of lifelong marital fidelity as mere "possessiveness", interfering needlessly with every consenting adult's right to recreational sex. Forward-thinking couples, recoiling from the associations being made between commitment and ownership, began to eschew marriage, divorcing more readily and entering into more casual, less constrained--and less long-lived--sexual relationships. Over the past thirty years, the consequences of these twin revolutions against the constraints of traditional family structure have been unmistakable: they have succeeded in effecting nothing less than the disintegration of the traditional family, and the mass impoverishment of abandoned mothers and children.
Of course, the feminist supporters of these social changes were not concerned primarily with relieving irresponsible men of their marital obligations. Rather, they were advocating on behalf of millions of intelligent, self-confident, ambitious women--the Dahlia Lithwicks of the world--who chafed under the traditional female roles of lifelong wife and mother. Many of these women have since found enormous joy and fulfillment (not to mention wealth and power) by exercising their talents and options to the fullest in both the vocational and sexual spheres. And since they are disproportionately wealthy, powerful and influential, they form a vocal, articulate (and not entirely unsympathetic) constituency militating against any rollback to traditional notions of familial duty.
But it's hard to find any likely beneficiaries of Ms. Lithwick's proposed extension of the same thinking to children. Where are the intelligent, self-confident, ambitious youngsters held back by their status as their mothers' "property", and aching to exercise their "children's rights" to maximize their individual potential? And would freeing mothers from their solemn commitment to care for these newly empowered, implicitly independent "legal entities" really improve children's well-being? Who, apart from a whole host of lawyers--the Dahlia Lithwicks of the world, again--could possibly benefit from a massive legal intervention to restrict and hamstring (of all things) motherhood, in the name of a concept as dubious as "children's rights"?
Ms. Lithwick is hardly the first to disparage a traditional virtue like maternal protectiveness by redefining it as an expression of ownership. Back in the 1960's, feminists such as Kate Millet began to deride the male ideal of duty and commitment to family as merely one component of an oppressive system of "ownership" of women and children. Previously, male identity had been so strongly associated with the role of provider that husbands and fathers would endure incredible hardship rather than admit (say, by allowing their wives to work outside the home) that they had failed in their responsibilities. (It seems inconceivable in this era of absent fathers and hard-pressed working mothers, but the architects of New Deal poverty relief efforts concentrated on jobs programs, rather than welfare payments, in part because a great many husbands and fathers would simply have refused a government "handout" as an insult to their status as family breadwinner.) But where some saw admirable commitment and solicitude, Millet and her cohort saw only tyranny. By placing the burden of financial support for the family firmly on the father's shoulders, the early feminists argued, society was in fact treating mothers and children as powerless subordinates unable (and hence unentitled) to fend for themselves.
At around the same time, the widespread availability of the birth control pill began to erode the connection between sex and the obligations of parenthood. Seizing the opportunity, advocates of sexual libertinism, including feminists such as Germaine Greer, began portraying the traditional obligations of lifelong marital fidelity as mere "possessiveness", interfering needlessly with every consenting adult's right to recreational sex. Forward-thinking couples, recoiling from the associations being made between commitment and ownership, began to eschew marriage, divorcing more readily and entering into more casual, less constrained--and less long-lived--sexual relationships. Over the past thirty years, the consequences of these twin revolutions against the constraints of traditional family structure have been unmistakable: they have succeeded in effecting nothing less than the disintegration of the traditional family, and the mass impoverishment of abandoned mothers and children.
Of course, the feminist supporters of these social changes were not concerned primarily with relieving irresponsible men of their marital obligations. Rather, they were advocating on behalf of millions of intelligent, self-confident, ambitious women--the Dahlia Lithwicks of the world--who chafed under the traditional female roles of lifelong wife and mother. Many of these women have since found enormous joy and fulfillment (not to mention wealth and power) by exercising their talents and options to the fullest in both the vocational and sexual spheres. And since they are disproportionately wealthy, powerful and influential, they form a vocal, articulate (and not entirely unsympathetic) constituency militating against any rollback to traditional notions of familial duty.
But it's hard to find any likely beneficiaries of Ms. Lithwick's proposed extension of the same thinking to children. Where are the intelligent, self-confident, ambitious youngsters held back by their status as their mothers' "property", and aching to exercise their "children's rights" to maximize their individual potential? And would freeing mothers from their solemn commitment to care for these newly empowered, implicitly independent "legal entities" really improve children's well-being? Who, apart from a whole host of lawyers--the Dahlia Lithwicks of the world, again--could possibly benefit from a massive legal intervention to restrict and hamstring (of all things) motherhood, in the name of a concept as dubious as "children's rights"?
Sunday, March 10, 2002
Nothing gets elite journalists steamed quite like the suggestion that maybe they aren't necessarily as important as they believe themselves to be. So last week, when it was revealed that ABC was negotiating to take away Ted Koppel's "Nightline" slot and give it to, of all people, David Letterman, Koppel's colleagues came out in force to rail at the ignominy of it all. In the Washington Post, Tom Rosenstiel and Bill Kovach harrumphed loudly, as did E.J. Dionne; on C-SPAN, a panel of distinguished journalists led by the Robertses (Cokie and Steve) commiserated; and in the New York Times, Koppel himself weighed in. To a person, they all insisted that it wasn't about the money, as they say; rather, what stung was an anonymous Disney executive's claim that the "relevancy of 'Nightline' is just not there anymore." Apparently, the offended parties, lacking reading skills, misread the word "Nightline" as "journalism", and hit the roof. "Do companies that own news divisions think profit trumps every other value?", fumed Dionne. "What Disney executives are really arguing is that journalism is just another kind of content; that communication is communication," complained Rosenstiel and Kovach.
But the specific case against Nightline's relevance is easy to make. In 1979, when it first aired as a nightly update on the Iranian hostage crisis, fewer than twenty percent of American television-owning households subscribed to a cable service, and CNN was a mere gleam in Ted Turner's eye. An extra "hard" news source--a late-night bulletin on the main story of the day--was thus a valuable (and popular) addition to the schedule. Today, on the other hand, more than two-thirds of American households have cable, giving them access to at least one of the several 24-hour all-news channels--and for equally fresh and more detailed reports, they can always go online to the Websites of literally thousands of news outlets based around the world. In 2002, waiting for Ted's 11:30PM wrapup (or for that matter, the 6:30 nightly broadcast) just doesn't make much sense anymore.
The old broadcast networks (like newspapers before them) have responded to their news divisions' creeping obsolescence by shifting to "softer" coverage (magazine-format shows with "human-interest" and "investigative"--i.e., more dramatic, less objective--stories), and Nightline is no exception. In the aftermath of September 11th, Nightline producers wisely recognized that stale summaries of the day's terrorism-related events would be of little use to its long-ago-updated viewers, and instead concentrated on "background and analysis" thumbsuckers (a broadcast entitled, "Lessons from the War on Drugs", for example) and controversy-stirring sympathetic airings of anti-American views (one title: "Why Do They Hate Us?"). Recently, Nightline proudly presented an acclaimed weeklong series that had been pre-empted by September 11th; it depicted current conditions in the Congo, and consisted largely of a parade of heart-rending stories of individual Congolese victims of the brutal violence that has engulfed the region for many years. And of course there's the show's long-running sporadic series, "America in Black and White"; this past week's installment, in a daring departure for modern television journalism, presented allegations of discrimination against African-Americans in the criminal justice system. None of this tired, formulaic content would have any trouble finding an outlet on the numerous cheap-to-produce "news magazines" that have proliferated in recent years on the evening schedules of the "big three" networks.
Why, then, are journalists, despite the ever-increasing range of news vehicles, getting all upset over the possible cancellation of one very old-school late-night news program? Well, it's unlikely that anyone was particularly horrified over at, say, CNN Headline News--an organization that presumably understands and accepts its role as a service provider to its audience. But the aging elite of the profession see themselves more as a species of avant-garde artists, and imagine, after decades of coddling by the old broadcasting oligopoly, that they are in fact keepers of the journalistic-cultural flame, producing masterpieces that an ignorant general public will never understand or appreciate, but that future generations will revere as brilliant insights into life, truth and society. To them, ABC's attack on Nightline is like an NEA funding cut--a direct assault not only on their financial security, but on their very self-image as "free spirits" entitled to subsidy merely for exercising their inherently valuable creative/journalistic muse. (Steve Roberts, on C-SPAN, emphasized the importance of NPR as a repository of "independent" journalism in the age of corporate control.)
Fortunately, their private-sector paymasters are more agile (and more cash-conscious) than the federal government, and are unlikely to continue to fund their anachronistic rituals much longer, now that all but the elderly among their once-captive audience have long since tuned out.
Thursday, March 07, 2002
The "One City, One Book" program, a kind of mass book club in which each US city recommends a single book for all its residents to read, provides a fascinating glimpse into the current state of American culture. After all, the act of asking an entire population to do something--even a trivial, non-taxing activity, such as wearing a symbolic token or attending a large-scale event--inevitably carries with it an air of solemnity, and is rarely resorted to except in the service of some grand and noble cause, like fighting a terrible disease or a monstrous injustice. The choices of book for these public exercises can thus tell us what writings are viewed these days as important and uplifting enough to merit an appeal for collective action.
Not all that long ago, the vast majority of Americans, if asked to select one book to recommend to their entire city, would have thought it ludicrous even to consider any choice other than the Bible. (A plurality of Americans today would likely give the same answer.) Other citizens might have selected the writings of a revered religious or political thinker. Among intellectuals, the Bible would still likely have run first, but would have faced at least some competition from, say, Plato's Republic (among philosophers and classicophiles), or one of the texts associated with America's founding (among "civil religion" advocates), or perhaps "Hamlet" (among aesthetes). Some scientifically-minded freethinkers might have voted for "The Origin of Species", or some other foundational scientific work, with a few radicals perhaps recommending Marx or Rousseau or another famous iconoclast.
Yet here we are, in 2002, and New York has chosen to read an obscure novel about a Korean-American immigrant in Brooklyn. Chicago, Cleveland, and other cities picked "To Kill a Mockingbird"--a good novel, at least, and with an uplifting (if somewhat ham-handedly didactic) theme. Seattle, the pioneer of the campaign, started in 1996 with "The Sweet Hereafter", by Russell Banks (since made into a movie), and went pretty much downhill from there. Los Angeles selected Ray Bradbury's "Fahrenheit 451", a science-fiction novel about a future in which all books are burned, and the hero's purpose in life (forgive me for revealing the ending) turns out to be the preservation of the biblical book of Ecclesiastes in his memory for posterity. (Apparently, the book of Ecclesiastes itself contained insufficient wisdom to satisfy Angeleno readers.)
What about elite opinion? When Slate asked a number of writers to offer their "One City, One Book" recommendation, only one suggested a book that might legitimately qualify as inspirational: the Koran. (Presumably it would never even have been mentioned before six months ago; I guess nothing wins recommendations for a religious tract like the willingness of its adherents to slaughter thousands of innocents in a suicide attack. Let's hope nobody tells Pat Robertson.) A few recommendations came in for literary classics or near-classics (Dostoyevsky, James, Wharton, Dreiser and Ellison were the weightiest names on the list). A couple of writers, of course, simply took the marketing opportunity and flogged their own recent books. But a fair number followed in the footsteps of the cities, and suggested recent novels of highly dubious longevity.
Tom Wolfe once observed that in a secularized modern America, art has taken over several of the social roles once assigned to religion. But as the "One City, One Book" movement demonstrates, American art-as-religion has by now descended into the same empty New Age frivolousness that dethroned its predecessor. Is the long-predicted, oft-announced death of seriousness in America finally at hand?
Saturday, March 02, 2002
Let's say you're a Middle Eastern theocracy with a terrible human rights record, a long history of support for terrorism abroad, and a mounting internal dissension problem at home. The US president has just declared your state part of an "axis of evil", and is busy mopping up the remnants of the faction that used to rule a weaker neighbor of yours. You'd like to get the Americans off your back and re-establish your sphere of influence in that neighboring country, but as things stand, America doesn't seem too enthusiastic about permitting you to survive, let alone permitting you to launch a regional power-grab disguised as a "rebuilding effort". It would take a truly well-connected ally even to get a reasonable American hearing for your rather far-fetched plan in the current climate.
How about the New York Times?
This past Thursday, the Times published an op-ed column by a pro-Iranian academic and a former director-general of the Iranian foreign ministry, advocating just such a plan. Now, I have no idea how the Times selects its op-ed pieces from among the thousands of proposals that no doubt flood its mailbox. And I have no reason to believe that there was anything the slightest bit questionable about their decision-making process in this case. But the fact remains that the Times happily served as a conduit for the open promotion of the transparently self-interested schemes of a brutal dictatorship that has sworn eternal enmity to the US. (The Times' provision of Op-Ed space to Yasser Arafat can at least be chalked up to the popularity--especially among journalists--of certain wild illusions regarding Arafat's statesmanship, peaceable intentions, and lack of hostility to the US. It's hard to believe that the Gray Lady's editors suffer from any similar naivete with regard to the Iranian mullahs.)
It's a free country, of course, and the Times can publish whatever it pleases. But two recent events make the editorial contents of a prominent soapbox like the Times op-ed page more of an issue of public concern than it might have been in the past. First, as I have pointed out previously, the Enron "punditgate" flap, in which it was revealed that a number of well-known commentators had quietly been paid what looked suspiciously like a retainer by a large corporation, raises the question of whether the power that elite pundits wield (by their own admission, and obviously in Enron's estimation as well) is safe in their hands. Second, the campaign finance reform legislation which is making its way through Congress contains provisions (thought by many to be unconstitutional) that restrict certain kinds of campaign fundraising and advertising. If these provisions pass and are upheld, then the value of "free media" channels such as New York Times op-eds increases substantially. And where there is a valuable property and a host of willing buyers, the possibility that commerce might break out cannot be dismissed.