Sunday, January 27, 2002
Thomas Friedman approvingly quotes Adrian Karatnycky of Freedom House comparing the WTC hijackers with the European terrorists of the seventies. And certainly there are a lot of parallels to observe: affluent, educated backgrounds; a long period of subsidized student aimlessness, culminating in a drift towards an apocalyptic, violence-worshipping creed; a foreign country willing to fund, arm, train and shelter them as part of a program of undermining the West's geopolitical reach. So what does Friedman conclude is the motivation for the recent crop of Europe-bred Islamicist terrorists? "[A] poverty of dignity." That's right--he claims that they were "[f]rustrated by the low standing of Muslim countries in the world....and the low standing in which they were personally held where they were living". Does he really believe that if only Mohammed Atta had been more warmly welcomed in Hamburg, there would have been no WTC attack? Does he also believe, then, that Andreas Baader and Ulrike Meinhof turned to senseless violence as a result of their own harsh treatment at the hands of the inhumane German welfare state? Does he even read his own columns?
A week ago I argued that regardless of the stringency of disclosure regulations, the opacity of Enron's books should have been a red flag to responsible investors long ago (and was, in fact, recognized as such by a tiny minority of analysts). Well, it turns out that Enron investors should be even more embarrassed right now than I had realized. According to the New York Times, accountants for a German utility company considering a merger with Enron back in 1999 discovered enough skeletons in Enron's closet in the course of their two-week "due diligence" to sour the Germans on the deal. "The consultants drew on public sources like trade publications and securities filings", according to the Times' sources, and "concluded that Enron had shifted so much debt off its balance sheet accounts that the company's total debt load amounted to 70 to 75 percent of its value"--considerably more than the debt-rating agencies' estimate of 54%. Said one source: "'We were wondering why this wasn't common knowledge, or why it wasn't discovered by those people whose business it was to discover these things.'" Indeed. "'When things were going well,' said one of the people involved in analyzing the deal, 'the view among those who knew about this kind of stuff was that Enron was being Enron, which meant being clever.'"
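For the numerically inclined, here's a minimal sketch (in Python, with made-up round numbers chosen only to echo the percentages above, not Enron's actual accounts) of why parking debt off the balance sheet makes the headline ratio look so much healthier:

```python
# Illustration of off-balance-sheet debt flattering a debt-to-value ratio.
# All figures are hypothetical round numbers, not Enron's real books.

def debt_to_value(debt: float, total_value: float) -> float:
    """Debt as a fraction of the company's total value."""
    return debt / total_value

total_value = 100.0   # assumed total firm value, in $billions
visible_debt = 54.0   # debt shown in the public filings
hidden_debt = 19.0    # debt shifted into off-balance-sheet partnerships

print(f"From the filings alone: {debt_to_value(visible_debt, total_value):.0%}")
print(f"Counting hidden debt:   {debt_to_value(visible_debt + hidden_debt, total_value):.0%}")
# Prints 54% vs. 73% -- roughly the gap between the rating agencies' figure
# and the German consultants' 70-to-75-percent estimate.
```

The point of the exercise: closing that gap required no secret documents, just the patience to add the partnerships' debt back in.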
Of course, that was back in the "denial" stage. Now that the country has moved on to "anger", even the revelation that Enron's condition had long been apparent to anyone who bothered to check can prompt journalists like Joshua Micah Marshall to ask suspiciously, "Who else knew?" I'm not quite sure what he's getting at, but I think he means that the president or other administration officials may have known for a while that Enron was a turkey, and yet failed to--uh, what? Publish an investment newsletter? Short the stock? Start an "Enron Sucks" Website? And why, of all the garbage stocks in hype-addled companies foolishly hoarded by millions of pigeons throughout the market's spectacular descent from grossly undeserved highs to richly deserved lows, was it somebody's responsibility to do something specifically about Enron?
Friday, January 25, 2002
Why, asks Slate's Michael Kinsley, does society treat political "spin" with so much more indulgence than it accords plain, old-fashioned lies? Though Kinsley treats the question as a great enigma, the answer is in fact clear to anyone who understands the ambiguous nature of what might be described as "public morality".
A private individual cannot "spin" about himself; he can only lie outright ("the check is in the mail"; "I'll respect you in the morning") for his own selfish benefit, or tell a "white lie" ("I love your tie") for the unselfish benefit of somebody else. The former is of course generally considered unethical, whereas the latter is usually condoned or even applauded for its positive effects (or at least its positive intentions).
For public figures such as corporate executives and politicians, on the other hand, there is an intermediate possibility: they can lie for the benefit of their own organizations or constituencies. This type of lying is neither completely selfless nor completely selfish, and is therefore subject to a more subtle set of rules. For example, it is usually considered unacceptable to lie about objective matters ("our check is in the mail", "I did not have sexual relations with that woman"), even for the sake of one's own group; however, it may be an acceptable expression of loyalty to lie about one's own subjective feelings or opinions with respect to group endeavors ("I'm really excited about our electoral prospects"; "I really like our new line of ties"), if the group would thus benefit. This type of lying is known as "spin".
While most non-members of the group tend to treat such spin as innocuous, opponents of the group, or anyone else who stands to lose if the spinning succeeds, may well object to it. If the complainers also form a group, then members of that second group can falsely express sputtering outrage at the putative dishonesty of the first group's members--thus generating "counterspin". And, of course, the cycle can escalate, sometimes to quite a pitch of hostility--as if two acquaintances were battling furiously over which felt more passionate affection for the other's tie.
As in most matters of etiquette, the public has a fairly clear but largely unconscious understanding of the rules of spin; hence the lack of outcry when a politician engages in flagrant, shameless spinning. I would have thought that Kinsley, usually an astute observer of social conventions, would have penetrated this mystery long ago. But then, perhaps he did, and his professed puzzlement--justifying a whole column rather than this simple three-paragraph explanation--is merely spin delivered for the benefit of his magazine.
Thursday, January 24, 2002
According to MSNBC, moderate alcohol consumption may actually help ward off Alzheimer's disease. Disturbing news, I guess, for those who drink to forget....
Wednesday, January 23, 2002
A "public intellectual" is a lot like a "gourmet burger"--a fusion of two completely separate worlds, each with their own independent standards. Sometimes the addition of fancy toppings makes for a better burger, and sometimes it doesn't; but either way, it's a burger, and it'll be judged as one, regardless of the pedigree of the sauce. So it is with the public intellectual: nice though it can be for publicly engaged figures to have intellectual credentials, a glittering academic record is neither necessary nor sufficient for an outstanding contribution to the public sphere.
Take the man who appears to be everyone's favorite hall-of-fame public intellectual in recent discussions: George Orwell. Now, it's actually a bit odd to lionize Orwell, an itinerant journalist, novelist and essayist, specifically as an intellectual; his abstract ideas were not particularly original, nor did he make a lasting contribution to any particular academic discipline that I know of. Rather, his great achievement was part political, part artistic: he depicted through his writings, in a searingly vivid way, the character, structure and evolution of the political phenomenon known as totalitarianism.
His descriptive skill alone would not have made him a successful public intellectual, of course; after all, Hunter S. Thompson depicted in a searingly vivid way the character, structure and evolution of drug binges, and today he's a mostly-forgotten cult figure. Orwell's particular judgment and insight, on the other hand, were of immense value to Western society, which did mammoth battle with totalitarianism in his day, and has continued to do so, on and off, ever since. Intellectuals have to be judged brilliant and original by peers, but the acid test of a public intellectual like Orwell is to be judged correct and prescient by history. Those are two very different criteria, and it's not surprising that exceedingly few people come out as giants under both of them.
That's why Richard Posner's complaint about the decline of the gourmet burger--whoops, public intellectual--is as far off-base as the many counterarguments raised against him. Posner worries that modern public intellectuals, because they are less often distinguished academics than they used to be, are therefore less likely to be rigorous thinkers. But that just misses the point. Nobody denies that Martin Heidegger, to name just one famous example, was a paragon of rigor as a philosopher; as a public intellectual, though--well, it's hard to think of a worse blunder than becoming a Nazi apologist. The list of highly successful intellectuals--William Shockley, Noam Chomsky, Anthony Blunt and Paul de Man, just to name a few--who tripped up embarrassingly when trying to transfer the brilliant clarity of their thinking to the messy, muddy world of public affairs is long indeed; meanwhile, the giants of public affairs--King, Gandhi, Churchill, Walesa, Mandela, and so many more--are revered not for their intellectual contributions, but simply for being wise, brave and right when their societies needed them to be.
So before we answer Posner's question of whether formal academics are better or worse than they once were at making the leap into public life, we should first ask why we should care. After all, a burger with truffles and brie may be more chichi than a mushroom cheeseburger, but it isn't necessarily a better burger.
Monday, January 21, 2002
To me, the most fascinating aspect of the Paul Krugman/Enron flap is not that he took $50,000 from the company and wrote glowingly about it before turning on it when it collapsed, or that he then lambasted certain politicians for doing roughly the same thing, or that other journalists are trashing him for his supposed hypocrisy, or that his allies are defending his positions as completely consistent. That kind of partisan squabbling is hardly new. What is new is that I have yet to see a single soul invoke the journalist's traditional "ink-stained wretch" defense: "I'm just a powerless reporter, pen in hand, saying what I think; it's the bigshot politicians, the people who make the rules for the rest of us, who must be watched carefully, not weak li'l ol' me." At one time, any demand that journalists be as accountable as the politicians they cover would have been met with an impenetrable wall of this sort of rhetoric. But it seems as though the claim that a columnist for the New York Times is less influential (and therefore less deserving of public scrutiny) than, say, a random member of the House Banking Committee just won't fly anymore.
If that's true, then we may well be witnessing the first few drops of a hurricane, as the complex modern culture of state-of-the-art partisan scandalmongering, having obliterated the last traces of traditional gentleman's politics, careens towards the formerly irenic pastures of gentleman's journalism. Will top scribes ever again be able to flit back and forth between government or industry flack jobs and press gigs with ease? Will speaking engagements, book contracts and "gifts" to reporters be studied with the same level of suspicion now applied to transactions involving politicians? Will their private lives, as well, be fair game? Or will the public, disillusioned by an endless stream of imbroglios, eventually demote the elite press back down to their original social level alongside gossips, spies, and, uh, bloggers? Only time will tell....
MSNBC reports that Nokia is planning to market, at $21,000 apiece, hand-crafted luxury cellular phones. The images flood my mind: traditional Lapp fab workers in their colorful ancestral clean-room suits, hand-painting the photolithography masks used to manufacture the DSP chips; customized "signature" ring tones personally composed and performed by famous musicians; and, of course, tuxedo-clad Nokia employees on hand to hand-carry each bit of GSM data carefully from the telephone to the nearest cell tower. When it comes to wireless telecommunications, after all, newfangled gadgets just can't hold a candle to the old ways....
Sunday, January 20, 2002
According to an editorial in the New Republic, the real Enron scandal is not influence-peddling but deregulation; the editorial lists several proposed laws that, its authors claim, would have "mitigated" the effects of the Enron collapse had conservative ideologues not blocked them. Now, some of these proposals make good sense to me; employee 401(k) plans, for instance, should not be allowed to invest in employer stock, because of the double risk, in the event of an Enron-like meltdown, to both job and savings. And accounting reforms are clearly necessary to keep auditors from being in the pockets of their big customers.
But a good antidote to the misconception that regulations could have saved the world from the Enron debacle is the now-famous Fortune article by Bethany McLean from last March. Ms. McLean is being feted these days for her uncanny prescience in asking back then, in the face of analysts' near-universal plaudits for the stock, "Is Enron Overpriced?". But a gander at her piece makes it plain that she was merely asking an obvious question that post-dot-com investors, desperate for a comeback from their recent tech baths, were too blindly enthusiastic to worry about: "It's in a bunch of complex businesses. Its financial statements are nearly impenetrable. So why is Enron trading at such a huge multiple?"
Sure, Enron's books might (or might not) have been more transparent under more stringent reporting rules. But plenty of analysts and investors who knew they had no idea how Enron was making its dough (other than that it somehow involved high-risk derivatives) still went ahead and bought the stock anyway, even at a P/E ratio well into the fifties. And employees who voluntarily loaded up their 401(k) plans with it, apparently either not knowing or not caring that filling your 401(k) with your employer's stock is just about the dumbest investing move you can make, would only have been forced under a proposed new regulation to fall for some other company's incomprehensible fly-by-night investment "opportunity" instead. Reasonable though these suggested rules might be, they would hardly have averted disaster.
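To make that multiple concrete, here's a quick back-of-the-envelope calculation (the share price and earnings below are invented for illustration, not Enron's actual per-share figures):

```python
# Toy P/E arithmetic; these numbers are hypothetical, chosen only to
# illustrate what a multiple "well into the fifties" actually means.

price_per_share = 82.0      # assumed share price
earnings_per_share = 1.47   # assumed annual earnings per share

pe = price_per_share / earnings_per_share
print(f"P/E: {pe:.0f}")  # ~56: pay $56 today for each $1 of current annual earnings

# The reciprocal is the "earnings yield" -- your return if earnings never grow:
print(f"Earnings yield: {1 / pe:.1%}")  # ~1.8%, far below a Treasury bond's yield
```

Buying at such a price is a bet on enormous future growth--an odd bet to place on a business whose workings the buyer admits to not understanding.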
As I explained in my post of January 13th, the Enron fiasco is just a minor, undistinguished footnote to the investment bubble of the nineties, which saw millions of investors lose their shirts on spectacularly reckless investments (and whose collapse, I surmise, is still far from over). Targeting a scapegoat (whether Enron or deregulation boosters) and pushing new regulations--however sensible--as the solution ignores the underlying problem: a culture of rampant irresponsibility in individual investment behavior. And it thus virtually guarantees that the problem will persist, as citizen investors, reassured by harsh punishments and new regulations, continue the now-conventional practice of pouring their life savings into overvalued stock in flimsy companies with incomprehensible business models and questionable (but perhaps now fully disclosed) balance sheets.
It's a scandal, I tell you.
Saturday, January 19, 2002
Aficionados of political debate are familiar with "and so we see...." stories--pedantic screeds spinning a single simplistic anecdote into a supposedly iron-clad proof of some partisan thesis: that the private sector always works better than the government (or vice versa), or that moralism (or realism) is always the more successful foreign policy, or that prohibition (or legalization) of vices always causes far more problems than it solves. The academic world has its own favorite topic for its "and so we see...." stories: whether research should always be directed by pure scientific curiosity or by practical societal needs. A perfect example is Siddhartha Mukherjee's recent article in The New Republic, in which the tale is told of a scientist's doggedly impractical study of the anthrax bacterium, leading ultimately to a now-invaluable antidote to its toxin. And so we see, from this inspiring fable, that Vannevar Bush's legendary postwar decision to base government funding of research on peer review rather than politically determined goals was a necessary prelude to every scientific advance made since then.
The most obvious thing wrong with this claim is that it simply doesn't reflect reality. Scientific research in America may be peer-reviewed, but government funding priorities are set with plenty of input from the political echelon. John Collier's study of the anthrax toxin may sound relatively obscure and useless to a doctor, like Mukherjee, accustomed to research aimed directly at curing human diseases, but to, say, a particle physicist, Collier's work is about as practical as it gets. Real research isn't divided into neat "pure" and "applied" categories; it's all at least somewhat influenced by both scientific and human values.
And what mix of the two produces the greatest breakthroughs? Nobody knows. Alexander Fleming's discovery of penicillin was the result of his aimless fiddling with mold specimens--but Salk's and Sabin's polio vaccines were the results of heavily funded, highly directed applied research programs. And for each such breakthrough, there are numerous examples of both curiosity-driven and goal-driven research that went nowhere and accomplished nothing. In practice, the directed and undirected approaches to research can each play a useful role in identifying topics ripe for rapid progress, and each can compensate for the other's blind spots. Collier is certainly correct, for instance, in pointing out that targeted funding can produce "a lot of junk aimed at getting some of that pork-barrel money"; but peer-reviewed funding tends to produce entire pools of researchers reinforcing each other's misplaced interest in the minutiae of long-sterile fields. Both errors can be mitigated by judicious application of both peer review and societal guidance.
If the history of great scientific advances suggests anything about what fosters them, it is that they tend to occur when their time has come, pretty much regardless of the research environment of the moment. Think of the classic discoveries (calculus and evolution are two famous examples) that were made entirely independently, around the same time, by two completely separate individuals with the same spontaneous insight. More recently, the relative isolation of the East Bloc scientific community during the cold war provided numerous cases of the exact same phenomenon (as has the separation of the "classified" and "public" research worlds in the West). Great breakthroughs have resulted from unfettered research freedom, from narrowly circumscribed, goal-oriented effort, and from every gradation in between. And anyone (like Mukherjee) who claims to have the perfect recipe for successful research investment in the future obviously doesn't understand the baffling history of research's past.
Wednesday, January 16, 2002
The New York Times' perennially fashionably short-sighted foreign affairs columnist, Thomas Friedman, argues that Afghans are "so war-weary and starved for security" that they'd welcome American troops to "police the whole place". Funny--that's what they supposedly said about the Taliban back in 1996 (not to mention Somalia in 1993). Somebody please buy this guy a ticket to see "Black Hawk Down"....
Tuesday, January 15, 2002
By critically contrasting the world's apathy towards Robert Mugabe's despotism in Zimbabwe with eighties-era global enthusiasm for ending apartheid in South Africa, Slate's Anne Applebaum once again demonstrates her remarkable geopolitical naivete. Of course the world was willing to organize boycotts and diplomatic sanctions against Pretoria back then; it was cheap, easy, morally heartwarming, and not entirely unlikely to be effective. After all, South Africa was being run by an affluent middle-class elite vulnerable both to moral arguments and to checkbook coercion--the perfect targets for an international campaign of sacrifice-free symbolic measures.
Zimbabwe, on the other hand, is being run by a brutal dictator who is quite happy to slaughter thousands of his subjects and pitilessly impoverish the rest in order to maintain his iron grip on power. Boycotts, diplomatic sanctions and other mind tricks will not work on this Jedi; the only way to persuade him to step down is to show him the other end of the gun he's so ready to use on his countrymen. And how many soldiers' lives are the world's democracies willing to put in jeopardy to effect what may turn out (depending on the decency of Mugabe's eventual successor) to be a disappointingly small improvement in the quality of life of the average Zimbabwean?
Western interventionism comes in two flavors: self-interested and purely altruistic. Only the former can muster enough domestic support in a democracy to motivate genuine sacrificial effort (read: military action), and either variety will be shunned in any event unless its chances of success are substantial. If there were any hope that earnest protest marches and UN-sponsored sanctions might sway Robert Mugabe, I'm sure that the world's Anne Applebaums would be organizing some as we speak. But apart from Ms. Applebaum herself, the ranks of such doe-eyed innocents are understandably quite thin.
Monday, January 14, 2002
According to the New York Times, DNA may one day be used to "tag" consumer items. "A trace of synthetic human DNA embedded with a hidden code and then applied to a designer dress....[could] enable someone with the key to the code to detect counterfeits."
Strangely enough, the article did not credit Monica Lewinsky with originating the idea.
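Jokes aside, the underlying idea is ordinary data encoding: DNA has four bases, so each base can carry two bits. Here is a minimal sketch of how such a tag might work in principle (my own illustration, assuming a simple bit-to-base mapping; the Times doesn't describe the actual scheme):

```python
# Hide a short message in a synthetic DNA sequence: 4 bases = 2 bits per base.
# Purely illustrative; not the encoding scheme from the article.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(message: str) -> str:
    """Text -> base sequence (4 bases per ASCII character)."""
    bits = "".join(f"{byte:08b}" for byte in message.encode("ascii"))
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(sequence: str) -> str:
    """Base sequence -> original text."""
    bits = "".join(BASE_TO_BITS[base] for base in sequence)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)).decode("ascii")

tag = encode("DRESS-0042")   # hypothetical serial number for the designer dress
print(tag)                   # "CACACCAGCACCCCATCCAT..." -- 40 bases for 10 characters
assert decode(tag) == "DRESS-0042"
```

Without the bit-to-base key, the tag reads as just another stretch of anonymous genetic material.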
Sunday, January 13, 2002
After peaking in late 1999, stock in Lucent Technologies lost over 90% of its value in a little more than a year--a decline of over $200 billion in market capitalization. In late 2000, it admitted to certain "financial irregularities" under its previous CEO--specifically, that it had overstated revenues by hundreds of millions of dollars. It spent most of 2001 staving off bankruptcy by selling off assets. As a blue-chip Bell system spin-off, Lucent figured prominently in the 401(k) plans of millions of Americans, including, of course, many of its own employees, whose savings were thus decimated.
Also peaking in late 1999 were shares of Xerox, which proceeded to lose nearly 90% of their value over the subsequent two years. Like Lucent, it admitted (in 2001) to a number of "accounting irregularities", involving over $800 million in previously reported revenue, and spent 2001 hovering on the edge of bankruptcy. Also like Lucent, Xerox, as a legendary technology company and stock market success story, no doubt played a part in eviscerating its share of employee 401(k) plans.
Then there are the numerous smaller companies--Cendant, Rite Aid, Lernout & Hauspie and many others--that endured, during the same period, both a stock crash and allegations of serious "financial irregularities". Taken together, these firms' combined capital losses and misrepresented revenues surely dwarf any one corporation's tally of fraud and mismanagement.
Why, then, all the fuss about Enron? Why all the misleading stories implying that Enron's employees were forced to invest in its stock (when they were not), and that its bankruptcy was a result of fraud and embezzlement (when plain old mismanagement more than sufficed to do the trick)?
A cynic would of course point to the political hay to be made by tying particular public figures to the failed company. But similar scandalmongering tactics could surely have been applied to the Lucent and Xerox cases, had the public mood been receptive. (Both are large, well-connected companies, and it's unlikely that either lacked its share of "friends" in government.) Yet the tales of their woes were largely confined to the financial pages, while Enron's are splattered across the front page.
On the other hand, a mere year ago the country was still far from even beginning to appreciate the devastating extent and impact of the investment bubble of the late nineties. Still yearning to believe in the once-magical earning power of the floundering technology sector, the public was at that time willing to cut firms like Lucent and Xerox (not to mention thousands of dubious dotcoms) a great deal of slack, even as they melted down in spectacular fashion. Today, with the economy evidently suffering from the harsh consequences of that misplaced faith, investors have entered the "anger" phase, and are looking for a scapegoat to blame. In other words, Enron chose a terrible time to go bankrupt.
Now, Enron's collapse was without question a huge commercial fiasco, and I'll certainly be happy to see any wrongdoers among its management severely punished. But the campaign of vilification against the company obscures the fact that its executives, far from being uniquely dishonest, incompetent and unscrupulous, were in fact entirely representative of the business practices of a particularly dishonest, incompetent and unscrupulous era. And until the full magnitude of that period's wholesale corruption is fully recognized and understood, it stands a good chance of being extended and repeated for many years to come.
Saturday, January 12, 2002
Michael Kinsley acknowledges the prevalence of liberal views among journalists, and concedes that it leads to a certain amount of bias in reporting. But this liberal "tendency" in the press, he argues, is no different from the conservative inclination of most corporate executives. His conclusion? "Liberal-bias obsessives should calm down and learn to live with it. It's really no big deal." (Tough words, indeed. So much for soft-hearted liberal compassion for the underdog, I guess.)
It's funny, though--I've been reading Kinsley for years, and not once have I ever seen him suggest, say, that environmental obsessives should quit whining about corporate polluters, or that unions should just suck it up and accept employers' miserliness and cruelty towards workers, or that civil rights advocates should stop expecting big business to treat minorities fairly. Yet surely these behaviors are by-products of the conservative, pro-business, anti-environmental proclivities of corporate executives, just as liberal bias in reporting is a by-product of the liberal views of journalists. If Kinsley really thinks that a liberal tilt in the newsroom is the moral equivalent of a conservative tilt in the boardroom, then why does he apparently consider it quite reasonable for people to take to the streets, organize boycotts, publish angry exposés, and generally speak out in righteous tones against the consequences of one, but not the other?
Of course, as a liberal, Kinsley cannot be expected to find liberal bias in the media as personally upsetting as conservative bias in industry. But for a journalist publicly to condone ideologically slanted reporting is an entirely different matter. Most CEOs, under pressure from customers, competitors and the general public, at least pay lip service to the liberal ideals of environmental friendliness, respect for workers and appreciation of diversity. And all the commercial television news organizations--even Fox News--make a show of claiming objectivity, knowing they could never market themselves successfully as openly partisan news sources. Kinsley's unapologetic "whatcha gonna do about it?" attitude towards liberal press bias is thus a clear anomaly--an expression of arrogance usually heard only from, say, representatives of powerful monopolies with no accountability.
But then, that (as Kinsley more or less admits) is exactly what liberal journalism has been--until recently, that is. The irony of Kinsley's "bias now, bias forever" stance is that it's rapidly losing its viability even as he expresses it. CNN, hemorrhaging viewership to Fox, has already begun an effort to improve the balance of its news coverage. Liberal newspapers and opinion journals no longer overwhelmingly dominate political debate, having yielded ground to their explicitly conservative counterparts as well as to more accessible, and hence more ideologically diverse, alternative fora (such as talk radio and the Internet). Kinsley's pugnaciously unrepentant admission may soon look, in retrospect, like the last gasp of a dying breed--journalism's ruling liberal establishment. I sure hope they all have their retirement funds in order.
Wednesday, January 09, 2002
According to the Washington Post, it is an old custom for Catalonians to hide a figurine of a defecating man in their otherwise-traditional Nativity scenes. Nobody seems to have raised the burning question: would a Catalan-style creche on US government property violate the Establishment Clause? After all, the presence of "non-religious" seasonal displays can, according to some Supreme Court rulings, render a partially religious Christmas scene First Amendment-safe. So we must ask--do the Catalan figurines achieve separation of church and state? And if so, which is which?
Tuesday, January 08, 2002
By far the least interesting aspect of the Great Cornel West Debate is the racial angle, i.e., "what if a high-ranking white Harvard professor were dressed down for too many extracurricular activities and too many 'A' students?". A far more interesting question is, "what if a high-ranking professor--of any race--at Harvard Medical School were out recording music and running political campaigns instead of producing research, and handing out A's to students as if they were candy?" Does anybody doubt that, far from merely complaining (and then backing off), university president Lawrence Summers (and the whole Med School faculty) would have done everything in their power to purge the faculty of such an irresponsible incompetent?
The thousand-year history of academia exhibits an unmistakable pattern: the university finds an important societal role, and for a while it flourishes; eventually its role fades in importance, or is taken over by more specialized institutions, and the university decays into irrelevance as a refuge for obscurantism and a playground for wealthy youth--until it finds its next social purpose. Educating the clergy for the Church, training lawyers, supplying royal courts with civil servants, developing and populating the discipline of engineering--all were functions that once saved academia from decrepitude, then were gradually eliminated or diverted to specialized schools, leaving the university as a whole once again adrift without a purpose.
It is not often remembered today that the Ivy League was once a collection of mostly recreational finishing schools for the scions of America's ruling class. The postwar explosion of the liberal arts-based American academy that propelled colleges like Harvard into unprecedented positions of social leadership was a result of the university's newly recognized role in educating managers to run the industries powering the complex, modern postwar economy. In time, however, those would-be managers began to segregate themselves in business schools and economics departments, and liberal arts faculties were again left without a meaningful purpose. There followed the traditional decline into obscurantism, irrelevance and frivolity; today few students attend Harvard College (as opposed to Harvard Medical School, or Harvard Business School, or MIT) for any reason other than to burnish their resumes and network with other elite students. That's why Cornel West can get away with his flagrant indulgence of self and students; nobody really thinks it's doing anybody any harm. And he can count on the support of thousands of liberal arts professors of all races who recognize that they are every bit as naked as Emperor West, and must at all costs stop little Larry Summers from exposing them all.
Monday, January 07, 2002
Slate's Anne Applebaum may well be correct when she points out that the introduction of the Euro as of New Year's Day was a non-event because, for better or worse, European integration--economic, legal, political--has been an established fact for quite a while now. Nevertheless, something very important did happen around January 1st, 2002: Argentina defaulted on its debt, making it official that its policy of tying its own currency tightly to the US dollar was a colossal failure. Of course, Euroskeptics have long warned that without adjustable per-nation exchange rates, parts of Europe may suddenly find themselves unable to address massive economic dislocations using the traditional monetary levers available to countries with separate currencies. And the pro-integrationists have always simply ignored the skeptics, suspecting (probably correctly in many cases) that their economic objection was merely a cover for sentimental nationalism or conservative fear of a left-leaning continent-wide "nanny state". But one might have thought that Argentina's recent spectacular refutation of the practice of forcing together the currencies of widely divergent countries would give pause to at least some of the passengers on the pro-integration bandwagon, no?
Christopher E. Babbitt, a student at Harvard Law, writes in The New Republic Online that rather than pass new legislation (like the Digital Millennium Copyright Act), the legislative branches of government should simply allow the courts to decide how to regulate new technologies, setting precedents by drawing appropriate analogies from more established fields. This is a popular stance in certain technological-activist circles; at the academic conference where the famous SDMI-hacking paper was presented (the one the Recording Industry Association of America apparently attempted to suppress by means of a nasty "lawyer letter"), an invited panel of attorneys argued vehemently that copyright infringement should be handled by judges on a case-by-case basis, in the hope that (and this is, I believe, a fair representation of what was said) well-intentioned people--say, journalists violating copyright in order to reveal corporate malfeasance--could be granted a "fair use" exemption denied to, say, for-profit music pirates.
Now, I'm no fan of DMCA, and I certainly don't approve of criminalizing the publication of information about security vulnerabilities in computer systems. But if anything could make me sympathetic towards what is, after all, a democratically enacted piece of legislation, it would be the threat of intellectual property protection falling--like so many other governmental functions--under the dictatorial control of (former) Harvard Law students. What Babbitt and colleagues are effectively proposing is that any given instance of a common, everyday activity like quoting copyrighted material be considered either legal or illegal, depending solely on whether or not a judge, upon staring deeply into the eyes of the plaintiff and defendant, decides he takes enough of a shine to the latter to apply the "fair" in "fair use" to the case at hand. The legislators who control the DMCA may (or may not) be in the pockets of corporate lobbyists, raving rednecks, whining interest groups, or other sorts who dare to interfere with the basic right of Harvard Law graduates to run the country, but they can at least be voted out of office if their particular tastes in content duplication are flaky enough. There's still something to be said for democracy--however unpopular it may be in the nation's law schools.
Sunday, January 06, 2002
From SFGate:
"FBI agents and British anti-terrorist officials, meanwhile, have concluded that the shoe-bomb plot originated with the ideas of [Ramzi] Yousef, an early al Qaeda operative who suggested flying passenger jets into buildings......Having obtained Boeing blueprints, Yousef calculated the most devastating place to detonate his bombs was in a seat above the center fuel tank, adjacent to the wing. The bomb alone would not destroy the plane, but it would detonate the fuel, ripping the jet apart. Reid, perhaps following Yousef's example, chose a window seat close to the Boeing 767-300's fuel tank."
Boy, I sure hope somebody is checking the passenger list for the famous TWA flight 800....
"FBI agents and British anti-terrorist officials, meanwhile, have concluded that the shoe-bomb plot originated with the ideas of [Ramzi] Yousef, an early al Qaeda operative who suggested flying passenger jets into buildings......Having obtained Boeing blueprints, Yousef calculated the most devastating place to detonate his bombs was in a seat above the center fuel tank, adjacent to the wing. The bomb alone would not destroy the plane, but it would detonate the fuel, ripping the jet apart. Reid, perhaps following Yousef's example, chose a window seat close to the Boeing 767-300's fuel tank."
Boy, I sure hope somebody is checking the passenger list for the famous TWA flight 800....