Tuesday, October 21, 2014

A while back, I noted the strikingly different cinematic treatment accorded two types of illicit romance:  the gay extramarital affair in Brokeback Mountain and the dalliance between a tennis pro and his best friend's fiancee in Match Point.  Now, a real-life episode raises a similar issue:  the rabbi at a major Washington D.C. synagogue has apparently discovered himself to be gay, and has taken the rather unusual step of publicly declaring the end of his marriage on those grounds, to a generally celebratory reception from the press.

Now, let us put aside the halachic question--we will assume that the rabbi in question has determined his "coming out" to be in accord with Jewish law as he understands it.  (And as a Conservative rabbi, he would most likely have plenty of company within his denomination.)  More interesting to me is his public declaration that his recent self-discovery has made it necessary to end his marriage of twenty years.  Although he never explicitly gives a reason for this decision, we are left to assume that, having realized that he can no longer pretend to be romantically attracted to his wife, he has no choice but to end the charade and live life as a (presumably non-celibate) gay man. 

Which leads me to wonder:  what if, instead of discovering that he is attracted exclusively to men, he had in fact discovered himself to be attracted to some other group that does not include his wife--say, younger, prettier women?  Would an announcement that he has chosen to be honest with himself and the world, and live life as a straight man attracted to twenty-something hotties, have been greeted with such warmth and understanding?  And if not, why not?

The issue of social acceptance of gay and lesbian pairings is often treated as a matter of simple equality:  people who happen to be sexually attracted to members of the same sex shouldn't be treated differently from people who happen to be attracted to members of the opposite sex.  At other times, it's treated as a matter of personal freedom--everyone should be allowed the freedom to follow his or her sexual desires wherever they may lead, as long as all participants are consenting adults.  But if a middle-aged rabbi's attraction to men is different from his hypothetical attraction to younger (adult) women--if one is publicly celebrated, while the other never would be--then neither freedom nor equality quite captures the principle being demonstrated here.

A more consistent interpretation would be that we are in the process of establishing an entirely new set of sexual mores, quite different from traditional ones, but not necessarily any less prescriptive.  (In another early blog post, I referred to it as the "college consensus"--that is, the set of beliefs and standards of behavior that the college-age cohort estimates will maximize their social attractiveness and desirability among their peers.)  Like the more traditional set, this new set of standards will have winners and losers--the already-fortunate being disproportionately winners, as always, and the relatively unfortunate disproportionately losers--and will evolve as times and circumstances change.  (In yet another earlier post, I suggested that economic and technological progress were the trigger for the widespread abandonment of "traditional values" in the sexual arena.)

And, just as with the pre-1960s set of conventions, adherents of the new conventions will act as though their conformist moral judgments are a matter of basic common sense and decency, and never think to consider the contradictions and contingencies embedded in their worldview.      

Thursday, October 16, 2014

Next:  "football".  (Followed by, "Republican"...)

Sunday, July 27, 2014

A few random comments on the current events in the Gaza Strip...
  • I told you so...
  • On the other hand, I probably jumped the gun a bit on this one--witness this recent tweet from the Guardian's Gaza correspondent, in which he echoes Hamas' sentiment that a life-saving ceasefire is worthless unless it preserves Hamas' capacity to invade Israel and murder civilians via its tunnel infrastructure.  (Then again, it's the Guardian, which was also happy to publish this recent op-ed...)  Certainly, the polarization process I outlined eleven years ago--and thought had already peaked--has in fact continued unabated since then, to the point where the two sides now effectively view each other as satanically evil.  It will be interesting to see whether it can go any further, or whether the nakedly pro-terrorist (and by now routinely anti-Semitic) positions adopted by the anti-Israel side these days have reached a level of extremism that will begin to drive away supporters en masse.
  • I expect that as soon as Israel is done with its current cleanup operation, it will redirect its efforts towards the massive project of finding and eliminating the no doubt horrifyingly large number of Hezbollah-dug tunnels lurking under the soil of Israel's northern border with Lebanon.
  • The Gaza Strip isn't the only territory ruled by adherents of an insane radical ideology that has left its destitute subjects dependent on foreign aid for their survival, while compelling them to wage an endless (and hopeless) campaign of failed conquest against its wealthier, freer American-allied neighbor, replete with tunnel-digging, senseless acts of brutal violence, and propaganda composed of almost comically absurd flourishes of over-the-top invective.  Why, then, is one leadership an object of endless ridicule in Western pop culture (the odd washed-up basketballer notwithstanding), while the other typically gets respectful or even fawning coverage?  

Thursday, June 12, 2014

I find it strange that Google's self-driving car project appears to be focused on small passenger vehicles, because the technology appears to me to have much more near-term potential in the long-haul trucking market.  Self-driving passenger cars, after all, offer buyers the relatively minor convenience of not having to pay attention to the road while riding between urban locations.  They also likely improve safety slightly--city driving not being all that dangerous to begin with--at some cost in speed (due to a less aggressive driving style). 

Self-driving semis, on the other hand, offer substantial cost savings by eliminating the driver altogether.  They likely also improve safety significantly--especially at night, when infrared sensors would surely do much better than sleep-deprived humans--as well as saving considerable time by eliminating rest and meal stops.  (For really long trips, truck stops would no doubt be happy to offer filling service for driverless trucks at a tiny fraction of the cost of a full-time driver.) 

One can easily imagine companies converting their entire fleet of trucks to driverless models, eliminating their driving staff entirely, and replacing their team of dispatchers with a few driverless truck programmers.  Putting aside the technological hurdles--which exist in both markets--it's hard to believe that the commercial trucking market wouldn't be far easier to sell on this concept than even the geekiest first-adopter consumers. 
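The cost argument above is easy to check on the back of an envelope.  Here's a minimal sketch--every per-mile figure is a hypothetical assumption chosen for illustration, not real industry data:

```python
# Back-of-envelope comparison of per-mile trucking costs with and
# without a driver.  All dollar figures are invented for illustration.

def cost_per_mile(fuel=0.60, maintenance=0.15, driver_wages=0.45,
                  driver_overhead=0.10, automation_premium=0.0):
    """Sum of assumed per-mile cost components, in dollars."""
    return fuel + maintenance + driver_wages + driver_overhead + automation_premium

human_driven = cost_per_mile()
# Driverless: no wages or benefits, but some added per-mile cost for
# sensors, software and remote monitoring (also an assumption).
driverless = cost_per_mile(driver_wages=0.0, driver_overhead=0.0,
                           automation_premium=0.12)

savings = 1 - driverless / human_driven
print(f"human-driven: ${human_driven:.2f}/mi, driverless: ${driverless:.2f}/mi")
print(f"assumed savings: {savings:.0%}")
```

Under these made-up numbers the driver accounts for roughly forty percent of the per-mile cost, which is the point: for a fleet operator the technology pays for itself directly, whereas for a car buyer it's merely a convenience.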

Sunday, May 18, 2014

Controversy over the concept of "privilege" is apparently sweeping college campuses.  The latest round of debate began when a Princeton student, Tal Fortgang, published an essay on the subject, complaining that despite his immigrant Jewish heritage, he was routinely identified as benefiting from "white privilege", and even told to "check [his] privilege" when expressing unpopular opinions, as if he'd been the beneficiary of generations of upper-class status.  His argument has since been discussed in the New York Times, the Atlantic Monthly, and numerous other outlets, particularly in the partisan blogosphere.  But none, I think, have given a full explanation of the concept and its purpose.

To begin with, the very concept of "privilege", defined as a benefit ascribed to those who are claimed not to suffer from racial or ethnic discrimination, is completely redundant.  We already have the much clearer concept of discrimination itself, and even the somewhat-more-vague concept of pervasive or systemic discrimination.  Turning these around and ascribing "privilege" to those who are not victims of discrimination does nothing to make our understanding of the phenomenon clearer. 

On the contrary, it clouds understanding by implicitly recasting a quantitative phenomenon as a qualitative one.  Even its proponents admit that as defined, everyone has some degree of "privilege".  (Black people, for instance, would appear to benefit from the "privilege" of not being the victims of anti-Hispanic discrimination, and vice versa.)  But nobody ever asks a speaker to "check how much privilege you have compared to some other people".  Both its definition and its common use encourage its misguided interpretation as an all-or-nothing property. 

Even worse, this qualitative property is then attributed purely based on racial or gender category.  Thus all white people are lumped together as benefiting from "white privilege", irrespective of their personal or family attributes and experiences.  "White privilege", then, is best described as a property attributed uniformly to all members of a particular racial group.  There's a word for such attributions:  they're called racial stereotypes.

Defenders of the term argue that "white privilege" is nevertheless different from pernicious stereotypes such as "black criminality" or "Jewish greed" in that all white people really do derive some net benefit from systemic discrimination against non-white people--that is, from "privilege".  There's a subtle sleight-of-hand in this argument:  the absence of a particular disadvantage is used to imply the presence of a benefit.  It's probably true, after all, that all white people are less negatively affected by discrimination against blacks than blacks themselves are.  But that does not imply that all whites derive a net benefit from discrimination against blacks.  On the contrary, since (as defenders of the concept of "privilege"--again--concede) discrimination is generally negative-sum, hurting non-victims as well as victims, it's likely that a great many--perhaps even most--whites suffer net harm from discrimination against blacks, rather than a net benefit, compared with a world without such discrimination.  Widespread discrimination against blacks in hiring, for instance, would have broad negative economic effects that, for most whites, would dwarf any hiring advantage gained as a result.  Consider a long-term unemployed white person:  his unrealized hypothetical hiring advantage has yielded precious little benefit, while his employment problems may well have been exacerbated by the economic side effects of discrimination.
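The distinction between "harmed less" and "net benefit" can be made concrete with a toy two-group model.  All the numbers below are invented purely for illustration; the only structural assumption is the one conceded above--that discrimination shrinks total output:

```python
# Toy model: "harmed less" does not imply "net benefit".
# World A: no discrimination -- everyone earns the same baseline.
baseline = 100

# World B: discrimination is negative-sum.  Assume (for illustration)
# it destroys 10% of total output, while the favored group captures a
# small relative edge over the disfavored group.
total_loss = 0.10      # assumed economy-wide output loss
favored_edge = 0.04    # assumed relative advantage of the favored group

favored = baseline * (1 - total_loss) * (1 + favored_edge)     # 93.6
disfavored = baseline * (1 - total_loss) * (1 - favored_edge)  # 86.4

harmed_less = favored > disfavored   # True: the favored group suffers less
net_benefit = favored > baseline     # False: still worse off than World A
print(favored, disfavored, harmed_less, net_benefit)
```

In this sketch the favored group does better than the disfavored group in the discriminatory world, yet worse than it would in the non-discriminatory one--exactly the gap the "privilege" argument's sleight-of-hand papers over.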

This simple observation renders the entire concept of "privilege" completely nonsensical.  It would be absurd, for instance, to claim that Americans "benefited" from "privilege" as a result of the 2011 Japanese tsunami, even though they obviously suffered less, on average, than the direct victims did.  Likewise, given that racial or ethnic discrimination against minority groups is bad for society in general, it makes no sense to claim that all whites "benefit" from it simply because they are generally hurt less than minority group members.  Hence "white privilege" is no more accurate than "black criminality" or "Jewish greed"--they're all false and unfair generalizations based on race or ethnicity. 

There is, however, one real and significant difference:  "white privilege" wasn't generated spontaneously by a bigoted culture, but instead was deliberately invented in the American academic community and then vigorously promoted to both students and the public at large.  Why would anyone--let alone American academics, who typically declare themselves passionately devoted to racial equality--do something so perverse?

The answer is actually quite obvious:  to justify their own brand of racial discrimination.  "Affirmative action"--that is, legally sanctioned discrimination against certain racial and ethnic groups in academic and vocational contexts--is under fire in the political sphere, since it is hugely unpopular, violates important, deeply held American values, and has largely failed to close the achievement gaps it was meant to address.  Its supporters needed a rationalizing justification for its continuation, and turned to the tried-and-true method of racists throughout history:  the negative racial stereotype.  Thus do the parallels between "affirmative action" and the racist discrimination it was supposedly enacted to redress continue to grow ever closer and more compelling.

Friday, March 21, 2014

This is easily the most compelling explanation I've seen so far for the strange behavior of missing Malaysia Airlines Flight 370.  Note that the suggested path flies very close to Iran, which suggests in turn that perhaps the two Iranian "smugglers" on the passenger list deserve some additional attention.  The remaining question, of course, is what (or more likely who) on the plane was so amazingly valuable as to be worth grabbing in such an elaborate and risky operation...

(UPDATE March 25, 2014:  The company that operates the satellite-based tracking system that received automated signals from the aircraft's engines during its flight has apparently performed a sophisticated analysis on those signals, and concluded that it actually flew south towards the Indian Ocean, rather than north towards Central Asia.  We are thus pointedly reminded--once again--that human behavior doesn't always have the most compelling explanation.)

Wednesday, January 22, 2014

An anonymous resignation letter from a Swiss graduate student has made a bit of a splash on the Internet lately, with many noting the uncanny accuracy of his unflattering portrayal of modern academic research.  I recommend a full reading, but the gist of his description is that academic researchers today have no interest in producing what an external observer would describe as "good research"--that is, research that significantly increases human knowledge and benefits humankind.  Instead, they devote their efforts to advancing their own petty academic-political interests: producing a large volume of narrow, conformist, incremental publications that improve their publication statistics and citation numbers; playing political games to advance their standing in the "research community"; and exploiting their graduate students for the benefit of their own careers, at the expense of the students'--and the field's--best interests.

It's easy to compare today's research world unfavorably with its counterpart of fifty years ago or more, since what remains of the latter by this point is little beyond its legendary successes, and certainly not its day-to-day workings.  But the truth is that academic research has always had more than its share of mediocrity drenched in politics, obscurantism and conformity.  Consider, for example, this little memoir, written in 1988 by the late great Middle East scholar Elie Kedourie (and recently pointed out by his fellow Mideast scholar Martin Kramer).  It turns out that Kedourie, too, abandoned his doctoral studies, and for reasons not at all dissimilar to those of the Swiss graduate student.  But Kedourie was not a modern Swiss scientific researcher; he was a history student at Oxford in 1953.

Unfortunately, the problems that have plagued researchers from Kedourie in 1953 to many young scientific researchers today are in fact fundamental weaknesses in the basic structure of academic research.  Researchers have always formed their own tiny expert communities, and thereafter demanded to be evaluated solely on the basis of peer review--that is, on the assessment of that tiny community, with no outsider being considered sufficiently expert to pass judgment on their work.  Academic newcomers are then forbidden entry until they first complete a multi-year program whose primary requirements are slavish conformity to the methods and practices of their graduate advisors' community, and voluminous difficult work exercising those methods and practices to the exacting standards of the graduate advisor and a small, hand-picked selection of his or her community peers.  And tenure has always guaranteed that these mini-communities will continue their hair-splitting line of research long after the last shred of value to outsiders has been painfully squeezed out of it.  It's hardly surprising, then, that academic research has historically been dominated by stifling conformity, petty politics and small-minded obscurantism.

If today's community is different from those of, say, a hundred years ago, the difference is primarily that research today is a "day job" in a way that it wasn't back then.  In the era when virtually every professor was a poorly-but-steadily-paid college teacher indulging his scientific, literary or historical obsessions in the gaps between classes and perhaps publishing the odd fusty monograph every few years for an audience numbered in the dozens, it scarcely mattered whether those academic obsessions were with the great unsolved problems in science or, say, the mating habits of a particular unremarkable species of butterfly.  Today, on the other hand, the average professor is fairly well-paid, with a light teaching load, either full tenure or a near-term expectation of it, and the promise of a multi-thousand-dollar research budget to spend on conference travel, graduate student assistants and other perks--if only he or she can generate the requisite volume of peer-blessed publications. 

What was once an eccentric but harmless academic idiosyncrasy--the practice of publishing technical expositions of one's own research that at most only a tiny audience of peers will ever read--has thus become an enormously expensive and wasteful boondoggle.  Abilities such as "selling" one's papers (writing them in a way that impresses one's peers), forming and leading networks of mutually logrolling researchers (much like the "alliances" in the reality TV show "Survivor"), and crafting CVs and proposals that coax grants out of funding agencies, are now core academic skills much more important to career success than, say, deep scientific insight or vast erudition, much less teaching ability.  The resulting research bears all the signs of this change:  most of it is shallow and irrelevant, much is sloppy and error-ridden, and very little of it has a shelf life longer than the few months it takes to get it published and tacked onto a personal publications list.

It's not clear how to solve this problem, but a few obvious (though sadly unrealistic) mitigations come to mind.  First of all, since 99 percent of all research is worthless, we should start by vastly shrinking the pool of researchers.  The rule that virtually all full-time college professors must generate research as part of their employment is an absurd result of tiny colleges trying to increase their stature by emulating the top universities.  It ends up not only generating a flood of pointless, abysmally low-quality "research", but also undermining the higher-quality research communities, forcing them to compete with the mediocre majority for funds, recognition and adherents.

Second, abolishing tenure certainly won't solve the entire problem--after all, most of the worst research is produced by workaholic untenured researchers, frantically churning out publications with which to establish their case for tenure.  But the protections of tenure certainly contribute to the insularity with which academic researchers indulge their worst instincts without fear of adverse consequences.  And given that academic training and peer review are guaranteed to squeeze every drop of non-conformist independent-mindedness from the peer-obsessed researcher, the likelihood of a researcher using the grant of tenure to break free from the constraints of conformity and rebel against the herd is negligible.  Tenure has thus failed in its only justifying purpose, and never comes close to paying back its enormous costs.

Third--and most importantly--the evaluation of academic research needs to be opened up to a much wider range of assessors.  As long as researchers can rely on log-rolling among peers to protect them from external accountability, they will continue to ignore any measures of the value of their research other than their own.  The only way to force them to take into account criteria such as economic value, societal impact and the public's priorities, is to make them accountable to commercial, political and popular representatives, not just fellow academics.  The howls of outraged disgust that invariably greet such suggestions reflect not only the academic dogma that asserts that anyone outside the holy circle of researchers is an ignorant yahoo incapable of grasping even the basics of evaluating scholarship, but also the rather baser fear that external evaluators might not be quite so indulgent towards researchers as they are toward themselves and each other.  It's high time that dogma were dispensed with, and that fear realized. 

Tuesday, January 07, 2014

And now for the 2013/2014 edition of ICBW's grandest (okay, only) tradition--the annual predictions post...

We begin with a review of our predictions for 2013, for which the theme is, "spectacularly accurate in the minor details, spectacularly off-base on the big picture":
  • The US economy will continue to grow at a sluggish pace, constrained by declining government stimulus and interest rates inching up with the growing concern over public debt, but buoyed slightly by lower prices for fossil fuels.  Inflation will remain tame, and the stock markets will weaken slightly.  Real estate will continue its very gentle upward trend.  
Spot-on in most respects (GDP growth 1.7 percent, bond prices down (i.e., yields up) slightly, annual inflation at 1.2 percent), but way off in a couple (Brent crude virtually unchanged, and especially the S&P 500 up over 30 percent)...
  • The newly-restabilized Eurozone will destabilize again, when domestic political pressure forces one of the bailed-out governments (most likely Greece) to balk at the required austerity measures, and/or one of the bailing-out governments (i.e., Germany) to balk at its required contributions.
Well, there was plenty of regret and foreboding, but in the end, everyone agreed to spend a fortune keeping Greece ruinously tied to the Euro...
  • Any US government spending reduction of any kind, let alone entitlement reform, that occurs this year will be purely cosmetic.  The debt will continue to expand until the threat of rising interest rates forces politicians' hands.
This was a gimme--no credit deserved for stating the obvious...
  • Damascus will fall to Syrian rebels, and chaos will ensue as Alawi Assad regime loyalists retreat to defend their stronghold in the northwest, the Kurds carve out an enclave in the northeast, and Sunni Islamists set about slaughtering minorities elsewhere.  The unrest will spread to Lebanon, where Sunnis (including Islamists) emboldened by events in Syria will begin challenging Hezbollah's dominance in earnest.  Meanwhile, in Egypt, the Muslim Brotherhood will consolidate its iron grip on power, but will be too busy dealing with economic crisis and the resulting unrest to make trouble elsewhere.  The Palestinian Authority will bring some kind of case against Israel to the International Criminal Court, where it will sit for some number of years, but the West Bank will be largely quiescent, and no "third intifada" will break out, a few occasional minor disturbances notwithstanding.  And despite increasing hostility from the Obama administration, overall global anti-Israel agitation will decline, as a result of Iranian setbacks and the increased respect that typically accrues to a potential future energy supplier to Europe.
Well, a lot of the little details were remarkably accurate--except for that bit about Assad falling, and the Muslim Brotherhood consolidating its hold on power, and the PA going to the ICC...
  • Binyamin Netanyahu's party will win the January election, and form a center-right coalition largely similar to the current one, but without Ehud Barak.  There will therefore be more settlement activity, but no military strike on Iran's nuclear program, which will remain in its current alarmingly-close-to-nuclear-weapons-but-somehow-not-actually-building-them-yet state.  In fact, following the fall of Assad and the resulting decline of Hezbollah, Iran will begin to look--to all its adversaries, including both Israel and the Gulf states--like a much more manageable regional threat.  Official and unofficial expressions of anti-Iranian hostility in the Sunni world will thus become much bolder and more open.
Again, perfect--except for those bits about Netanyahu's coalition being similar to the previous one (it's completely different--the religious parties have been kicked out and replaced with secular ones).  Also, Iran looks like a much more managed regional threat--lots of agreements and diplomacy going on--but is hardly more manageable, unless you're foolish enough to think that anyone takes those agreements and diplomatic initiatives at all seriously.
  • At least one of the autocrats from 2012's list (Hugo Chavez, Ali Khamenei, King Abdullah of Saudi Arabia, Robert Mugabe, Raul Castro) will not make it through 2013. 
Thanks for bailing me out, Hugo!
  • The next hit cable TV series will break new ground by revolving around a fascinating, complex male character who's not involved in violent crime. 
Not sure if this counts...

And now for my 2014 predictions, for what they're worth...
  • The US economy will strengthen moderately in 2014.  The stock market will decline slightly from its current heady heights, but interest rates and inflation will rise slightly (although not enough to divert the Fed from its current oh-so-accommodative course).  Oil and other commodity prices will decline, but real estate will continue its recovery.  The Eurozone will recover as well, but far more sluggishly, with continuing unrest (but no major upheaval) over the severely distressed PIIGS economies.
  • Barack Obama's approval ratings will continue to decline, weighed down primarily by Obamacare, which will continue to accumulate angry "losers" (people whose health insurance has become narrower, more expensive or both).  Numerous other minor "scandals" will pop up over the course of the year, but none will gain significant traction with the press, and the November elections will see only small shifts in Congress, with the Republicans gaining a mere handful of House seats, and the Democrats (just barely) retaining control of the Senate.  Until then, the Republicans will content themselves by blocking various White House legislative initiatives, the administration will respond by doubling down on various expansions of executive power, and the Republicans will counter by initiating various legal actions (mostly unsuccessful) against them.
  • At least one Supreme Court justice will resign or die, and Senate Democrats will abolish the filibuster completely to prevent Republican obstruction of the resulting nominee's confirmation.
  • The Israelis and Palestinians will sign a "framework agreement" modeled after the Iranian-American accord.  Like its predecessor, it will say absolutely nothing concrete and definitive, and will be interpreted by all sides as perfectly aligned with their own official position on every issue.  It will therefore accomplish absolutely nothing, apart from allowing both sides to maintain the status quo while asserting at the same time that they've made progress toward their strategic goals.  Meanwhile, redoubts of anti-Israel animus--academia, the press, Europe--will respond to the process by doubling down on their anti-Israel campaigns, including more American Studies Association-style boycotts.  However, violence will be confined to sporadic incidents, and Israel's economy and trade will continue their stellar trajectory.
  • Elsewhere in the Middle East, the civil war in Syria will drag on with no end in sight.  The related unrest in Lebanon will increase substantially, led by relatively new radical Sunni elements rebelling against Hezbollah's dominance.  Muslim Brotherhood violence in Egypt will continue at a low level, but the military will tighten its overall grip on power.  Sectarian violence in Iraq will escalate, and the Erdogan regime in Turkey will shed all pretense of democratic rule, formally instituting structural changes that will in effect establish an AKP dictatorship, with a bit of democratic window dressing. 
  • Legalization or quasi-legalization of marijuana will spread to additional states beyond Colorado and Washington, and the next big trend in snobbish consumption will be "gourmet weed".
Feel free to add your own via comments...

Saturday, November 30, 2013

Pop quiz:  What do journalists Robert Fisk and Matthew Yglesias, and activist Daniel Seidemann, have in common?  (The links provide the answer.  A hint:  One might also, in a stretch, include Timothy Treadwell...)

Wednesday, November 27, 2013

The recent deal between Iran and the P5+1 countries negotiated by the Obama administration over the former's nuclear weapons program has been lambasted, with good reason, by all but the most sycophantically pro-Obama commentators.  But while it's certainly a foolish giveaway, with lots of drawbacks and no significant redeeming features, its overall long-term effects are widely misunderstood.  Here are a few of the most prominent myths:
  • The deal profoundly imperils Israel's security.  As I've argued before, Iran's offensive nuclear capability--and it will almost certainly have one, sooner rather than later, irrespective of this deal--will be effectively deterred by both the Israeli and American capabilities, just as the Soviet Union's was.  Iran's very likely possession of nuclear weapons is thus not a major direct threat to Israel, let alone an existential one.
  • The deal is intended first and foremost to weaken Israel.  As I've also argued before, the Obama administration's guiding foreign policy principle is the desirability of diminishing American power and influence around the world.  This deal contributes substantially to that goal, and while it also harms Israeli interests, it's American interests that by far suffer the greatest harm, more than enough to amply justify it under the president's worldview.  (Of course, the two effects are directly correlated:  given that Israel is a strong ally and supporter of the US, a blow to Israeli strategic interests is highly likely to damage US strategic interests as well--and vice versa.)
  • The primary effect of the deal is to clear the way for the Iranian nuclear weapons program. In fact, the Iranian nuclear weapons program has been moving forward at full speed for many years now, and this deal scarcely affects it.  Rather, the primary (and wholly negative) effect of the deal is to undermine the sanctions regime against Iran.  Once the sanctions are lifted--and given this deal, that lifting is now inevitable--Iran will have more cash to spend on conventional mischief in the region, such as propping up its puppets in Syria and Lebanon, extending its influence in Iraq, fomenting unrest in the Gulf monarchies, and sponsoring terrorist plots around the world. 
  • Israel is now more likely to launch its own pre-emptive strike against Iran's nuclear facilities.  I don't believe that such a strike was ever remotely plausible.  The risks--of failure, of casualties, of a diplomatic backlash, of Iranian retaliation--are huge, and the likelihood of delivering a substantial setback to the Iranian program is negligible.  It's possible that the Netanyahu government sees things differently, but my guess is that if they did, they'd have launched a strike years ago.  More likely, the entire "do something or I'll be forced to act on my own!" charade was simply a ruse to get Western governments to impose sanctions.  If so, then it worked brilliantly--there's absolutely no way any sanctions, let alone the fairly substantial ones that were in effect until now, would have been imposed without this threat.  Unfortunately, the Obama administration has from the beginning hungered for reconciliation with Iran as part of its overall strategy of snubbing friends and courting enemies, the more effectively to undermine American power and influence abroad.  It was therefore something of a miracle that Netanyahu was ever able to huff and puff his way to a global sanctions regime, and probably inevitable that Obama would eventually find a way to dismantle it.
  • The deal dramatically improves Iran's strategic position.  In the short term, no doubt, the extra revenue that will accompany a lifting of sanctions will expand the Iranian regime's freedom of action.  But it still finds itself in dire straits in the medium term:  its most important satellite, Syria, is mired in an unbelievably bloody civil war that has already begun to spread to its second most important satellite, Lebanon.  The Saudis and their allies are certain to be more determined than ever in their efforts to support the rebel factions in Syria and Lebanon--not to mention domestic dissidents within Iran itself.  The Iranian economy will be helped but not saved by the lifting of sanctions--it's still a corrupt quasi-command economy dominated by the leadership's relatives and cronies in the clergy and the Revolutionary Guard.  And a global oil and gas production boom is very likely to lead to a decline in oil prices in the near future, with consequences for Iranian government revenues that could easily end up dwarfing the recent sanctions in their severity.
So while I join the critics in condemning a deeply misguided and counterproductive deal, I don't think its consequences will be nearly as disastrous as many seem to fear.  Rather than Munich in 1938, I would compare this agreement with Vienna in 1979, where Jimmy Carter signed the SALT II treaty with Leonid Brezhnev--also a reckless giveaway by a cluelessly naïve, American power-loathing president to a cunning, ruthless dictator with grand geopolitical ambitions and at the height of his power.  That deal, too, seemed to formalize and enhance the dictatorship's prestige, but in fact it coincided with the regime's high-water mark, followed by an astonishingly rapid descent that concluded with its complete dissolution within roughly a decade.  May this deal achieve the same result.
     

Saturday, September 28, 2013

In 1990, a 22-year-old student named Christopher McCandless graduated from Emory University with a degree in history and anthropology.  He was in possession of a substantial trust fund, which his family hoped he would use to attend law school.  Instead, he donated the fund to charity, ceremonially burned his remaining cash, cut off all communication with his friends and family, and became a drifter, hitchhiking across the US, taking odd jobs, and renaming himself "Alexander Supertramp".  By late April 1992, he had found his way to Alaska, where he decided to head into the wilderness to live off the land, despite being woefully ill-equipped and completely lacking in wilderness survival skills.  He was dead of starvation by summer's end.

Viewed as a case study, this simple outline strongly suggests a tragic but depressingly familiar pattern:  the early stages of severe mental illness, symptoms of which typically appear during young adulthood, and can include identity confusion and compulsive, antisocial behavior.  But when a writer named Jon Krakauer somehow got ahold of McCandless' story--including a diary disjointedly documenting his wanderings--he found McCandless' professions of anti-materialism and alienation from society more inspiring than disturbing.  In tribute, Krakauer penned a long article on McCandless' fatal journey, which he later turned into a book called Into the Wild, the latter also inspiring a film adaptation directed by Sean Penn.  In both the book and film versions, McCandless is portrayed as a brilliant, idealistic young man disgusted with conventional society and eager to "find himself" through renunciation of material comforts, rejection of social mores and obligations, and solitary communion with nature.  Even his most erratic behaviors--that is, the ones that most strongly suggest mental disturbance--are instead portrayed as examples of his determination and purity of purpose.

Particularly prominent in Krakauer's telling of McCandless' tale is his theory that McCandless died of starvation not due to an inability to fend for himself in the Alaskan wilderness, but rather because of the effects of some poisonous seeds he'd been eating, believing them to be safe.  After an earlier hypothesis about the particular plant and toxin responsible was proven incorrect, Krakauer recently proposed an alternative culprit, and he claims to have laboratory evidence supporting his new theory.

But Krakauer's obsession with the precise details of McCandless' death is misplaced.  It seems as if Krakauer believes that if he can only prove that McCandless died accidentally by ingesting the wrong seeds, then his entire thesis about McCandless being an inspirational hero rather than a deeply confused young man will be vindicated.  But let us suppose for a moment that Krakauer is correct, and McCandless did indeed succumb to the toxins in some seeds he had eaten.  What would have happened if by some good fortune he'd managed to avoid those seeds?  One possibility is that he'd have continued his quest to survive in the Alaskan wilderness, most likely perishing in the harsh Alaskan winter, which he was completely unprepared to survive.  If there's a meaningful difference between that outcome and the actual course of events, I'm afraid it's lost on me.

Krakauer, on the other hand, apparently believes that "he probably would have walked out of the wild in late August", and gone on to live a more or less normal life.  That's certainly possible--and would perhaps even have been probable, if McCandless had in fact been merely a naïve adventurer rather than a deeply troubled young man.  On the other hand, what would we make of McCandless' story--and Krakauer's reverent retelling of it--if it turned out, in the end, to be nothing but a rather reckless lark, a brief interlude of "wild man" survivalism in an otherwise unremarkable life story?

Krakauer's take on McCandless thus rests on a fundamental contradiction:  if McCandless' journey into the wild was a great and admirable quest, then we must acknowledge that it ended in abject failure, and draw the obvious conclusions.  On the other hand, if it was merely the tail end of a two-year spree of irresponsible youthful naivete, then why on earth would it merit Krakauer's heroic treatment?

Of course, it's not McCandless' death, accidental or otherwise, that truly inspires Krakauer's admiration.  Rather, it's McCandless' professed philosophy, which amounts to little more than a kind of obsessive worship of the self.  In his two years of wandering, McCandless shows no interest in any outward-directed higher purpose, such as helping humankind, increasing his understanding of the physical universe, or even connecting spiritually to a deity or deities.  His journey is a strictly internal voyage of self-discovery through isolation, contemplation, and rejection of all personal responsibility or obligation.  To most responsible adults, such a mission would not seem inspiring at all, but rather dull and self-indulgent (or, quite possibly, mentally unbalanced).

And that is why McCandless' alleged accidental death is so significant.  Had he succumbed to obvious reckless incompetence, or else lived to continue on his self-absorbed path, or even abandoned his life for a more conventional one, the pathological nature of his self-absorption would have been readily apparent.  By constructing a tragic accidental death for him, Krakauer was able to recast him as a heroic martyr to the religion of the self, where otherwise he would have been merely a living testimony to the falsity of its promise.

Sunday, July 28, 2013

In honor of the fiftieth anniversary of the Profumo affair, Mark Steyn has reposted his obituary of John Profumo from seven years ago.  Steyn adopts the conventional view that Profumo's sexual escapades and subsequent disgrace represented the decadent, dissipated face of the old British ruling class, while his acceptance of responsibility and personal penance (he spent the last forty years of his life as a volunteer for a charity) represented its more laudable side, reflecting its devotion to integrity and honor.

But there is a more cynical view of the affair:  that Profumo's fall, and subsequent refusal to even attempt to regain his former position, demonstrated first and foremost the British ruling class' utter enfeeblement, and foretold its complete surrender shortly thereafter to other contenders--organized labor, the bureaucracy, the intelligentsia, the professional class, the entrepreneurial/financial class--for domination of British society.  Sexual indulgence, after all, is hardly limited to aristocrats, and there has been no shortage of political sex scandals in the years since John Profumo's.  But members of a robust, confident elite don't simply lie down and accept disgrace, then wander off to clean toilets for a poorhouse for the rest of their lives, as Profumo did.  And indeed, numerous British public figures have survived greater or lesser embarrassments and lived on to contend in the corridors of power.  Profumo, however, was astute enough to recognize that his and his peers' (in both senses of the word) time had passed, and that any attempted comeback would be futile.

In America, where elites have long been more dynamic than in the Old World, we see a similar pattern:  the strongest (the Kennedys, to take the most obvious example) are never tainted by personal scandal, however egregious their behavior; the strong (the Clintons) brazen it out, and emerge largely intact; the weak (Eliot Spitzer, Anthony Weiner) fall from power and must endure at least a period of disgrace before being allowed to attempt a comeback; and the weakest (John Edwards, Larry Craig) succumb, never to recover.

I have long held the opinion that in the political world, a "scandal" is best described as a kind of political contest:  an accusation is made against a politician by his or her political enemies, the consequences of which are determined primarily by the relative political power of the target and his or her enemies, largely irrespective of the actual severity or validity of the accusation.  This is particularly true of sex scandals, where "severity" is a highly subjective judgment that can easily be swayed by political sympathies.  John Profumo's indiscretion was fairly minor, even by the standards of his day.  But as an old-school upper-class traditionalist in a rapidly changing Britain, he had the misfortune to be politically vulnerable, and in the Darwinian world of democratic politics, even the tiniest of cuts will draw the predators to a sufficiently weakened prey.

Wednesday, July 17, 2013

One of my favorite Middle East journalists is Jonathan Spyer, an intrepid, clearheaded Israeli who has not only ventured behind rebel lines in Syria, but has returned to write surprisingly non-breathless dispatches about it.  I was therefore a bit surprised--but only a bit--by his puzzlement over the empirical fact that liberal democratic movements in the Arab world are consistently incapable of competing for political power with the two dominant forces in Arab politics, militaries and Islamists:
“In the Middle East, it is the regimes or the Islamists; there is no third way.”...[B]ut I do not quite understand why. After all, the throngs of young people that we have witnessed in recent days in the streets of Egypt are not a mirage. No more were the young civil society activists who began the uprising in Syria, or the sophisticated liberals and reformers in Egypt. What are the factors which time and time again prevent the emergence of a muscular, representative, civilian and secular politics in the Arab world?
Spyer displays here a very common misunderstanding:  that "representative, civilian and secular politics" is naturally "muscular", unless suppressed by "factors" that "prevent" its "emergence".  In fact, it is not the absence of democracy that requires an explanation:  the very idea of democratic government--indeed, even the idea that governments ought to be accountable to their citizens, the principle on which elective, representative democracy is based--was simply unheard-of until a couple of centuries ago.  Until then--and even to this day, in many places--it was universally taken for granted that governments were, should be, and always would be selected and maintained by force of might or apparent divine sanction, and that their power to impose laws was unlimited, except perhaps by greater might or holier divine sanction.

Democracy is thus perhaps best thought of less as a political movement than as a kind of technology:  a collection of non-obvious principles, practices and processes that enable a society to impose accountability on its own government.  And like many technologies--say, the automobile--it needs both a broad supporting infrastructure and a widespread understanding of its use and maintenance in order to permeate a society.  One would never pause to wonder why automobiles weren't ubiquitous in those societies that haven't yet developed both the physical infrastructure to support them and the intellectual infrastructure to use and maintain them.  Likewise, one shouldn't be surprised at the lack of "muscular" democratic politics in countries--like those of the Arab world--that haven't yet created either the supporting institutions or the consensus understandings that make a working democracy possible. 

It is therefore heartening and necessary--though obviously far from sufficient--to see protestors rising up against their governments in places like Egypt, Tunisia and Syria.  Although such proto-democratic actions are far from authentic democracy--no doubt many of the protestors in those countries actually have little understanding of it, and even less sympathy for it--they nevertheless represent a gradually broadening public embrace of a weak version of the important democratic principle of popular sovereignty.  And while full-fledged democracy may be unlikely to blossom any time soon in those countries--the path from the French Revolution to representative democracy took nearly a century, after all--every such step brings them closer to the day when democratic ideas are as natural and obvious to their populations as they are to the citizens of Western democracies.

Sunday, March 17, 2013

Everything I know about life I learned from video games
  1. If you have a choice of playing life at the easy, moderate or difficult level, always play at the easiest level.
  2. Save often. That way, if you are killed or badly injured, you can always restart your life.
  3. If you see a switch anywhere, flick it; if you find a button, press it.
  4. If people are trying to kill you but you don't know where they are shooting from, just run out into the open and look around as they shoot at you. Then when you restart (see #2 above), you will know where they are.
  5. Pick up everything you can and keep it with you.
  6. Read every book you can open.
  7. Just for fun, try shooting your friend in the head.  Probably nothing will happen, but in the worst case you can always restart (see #2 above).
  8. Always look around for ammunition, weapons, or health supplies that someone has dropped. Don't forget to look in the toilets.
  9. Don't forget to search every dead body you come across (see #8 above).
  10. If you happen to come across a strange machine that aliens left here eons ago, it shouldn't be too hard to figure out what it does and how it works (see #3 above).
  11. If you kill someone, hide his body. That way, he will never be missed.
update:
Based on recent problems at the nuclear weapons plant Y-12, it appears that some of our security professionals haven't played enough video games.
 One [security problem] was relying on "pan-tilt-zoom" cameras that sweep back and forth, because a sophisticated adversary could learn their pattern and time an entry to avoid detection.
Actually, it seems that these Security Professionals haven't played any video games. So although the following points aren't really very interesting or funny, let me add a couple that may actually be useful to these people.
  12. If you have to get by a sweeping camera or laser beam, wait till it sweeps out of the way and then run.
  13. If #12 above doesn't work, try shooting out the camera.
Of course, in some games shooting out the camera sets off an alarm and the bad guys start attacking you.  In the Y-12 game, however,
Some sites repair broken sensors and cameras within 24 hours; Y-12 set a window of 5 to 10 days, but that was only a goal, not a rule, the report said.
 There was no word on whether or not Y-12 stores its ammunition and first-aid equipment in the toilets.

update:
Everything's okay now.