Monday, December 24, 2007
After all, even in officially atheist Soviet contexts, there seems to have been room for a little seasonal whoopee amongst the striving for a more revolutionary (or at least a more futuristic) future.
(And while we're at it, here's a nice MarX-mas postcard from the International Institute for Social History.)
We're sorry for the light posting over the last few days, but we've been focused on the kind of writing that has some relevance in our professional lives.
Thus, we've been neglecting you. For that, we apologise.
And promise to improve.
But until then, we wish all our readers and friends a happy and healthy holiday season, whatever that holiday might be.
Tuesday, December 18, 2007
Whatever we want to call them, they can be unpleasant. (An “unpleasant” of post-structuralists?)
At a conference (about the concept of “veiling”, both in its literal and metaphorical senses), I found myself under relentless attack for my humble suggestion that any debate about this or related issues is meaningless if we don’t link cultural phenomena (or “discourses”) back to some material reality. More specifically, I suggested that we need to confront the question of the extent to which our bodies shape our senses of self long before culture kicks in.
This was triggered by the claim, made by a speaker who had spent twenty minutes jumping back and forth from Freud to Lacan to Derrida – with a juicy bit of violent Japanese porn thrown in for good measure – that “the wound”, too, is a type of veil which, when torn, reveals underneath it the epistemic void of which some of my colleagues seem to be so enamoured.
No, I said somewhat irately during the discussion, thinking of all the wounds that we get to see on a daily basis in the media and the reality that they quite clearly denote: if you tear open a wound there’s likely to be metres of intestine, pulsing and bleeding flesh and flabby fat underneath it. The body is not a text. It’s bloody, smelly and alive – and it hurts if you do that kind of thing to it.
Perhaps I was naïve in thinking that I could get away unscathed – not wounded, that is – when using the dreaded and indeed heretical b-word (biology) to support my point. Almost immediately I was set upon by a waif-like art historian in apparent anorectic denial of her body, who exclaimed:
“Show me the biological body!”
(The line comes across better in the original German, which combines the stilted politeness of the formal “you” with the exasperated condescension of the unacknowledged avant-garde while introducing an undeniable (albeit probably unwanted) innuendo: “Dann zeigen Sie mir doch den biologischen Körper!” – roughly, “Then show me the biological body, why don’t you!”)
My healthy instincts of status-preservation prevented me from accepting her invitation in a literal sense. Though I must say I was tempted.
This exchange was followed by a kindly (or rather: friendly-aggressive) intervention by another member of the auditorium, who patiently explained a “theoretical” position to me that I’d been aware of for at least fifteen years and to which, for a very brief period of my life, I had also passionately adhered:
“What we mean is that the body can only be accessed through the cultural codes that simultaneously describe and determine it.”
Or something along those lines, which I must have read umpteen times, in variations, over many years and might in fact have written myself at one point in my life – but whose heuristic significance veers towards zero.
Hearing these kinds of statements only affirms my sneaking suspicion, shared by a number of sane colleagues, that the intellectual format of the humanities at present is deeply parochial and its presumably open-minded representatives intolerant at heart.
I mean: if all we can do is identify the famous codes that determine us down to our toenails, what’s the point of our work? Why waste our time at conferences slinging high-falutin’ jargon at each other to assert our status as alpha-theorists, when all this is just a discursive entanglement?
And, if that is the case, why was my discourse at this conference such an intolerable provocation?
But even more fundamentally: who makes those darn codes? Do they just hover around us like a nasty miasmic mist? Is there some gigantic computer programming them? Who programmes the computer? Who watches the Watchmen?
Open the pod-bay door, Hal!
Or might it be some intelligent designer, after all?
Still, I must say that, apart from being intellectually offended by such statements, I’m also personally hurt by being treated like a benighted moron requiring initiation into allegedly arcane realms of high theory that I’m entirely familiar with – especially if the people who feel they need to initiate me are intellectual fashionistas wearing the rags from three seasons ago.
(Comic credit: here)
Amongst other things, this breakaway sect asserts that mainstream Mormons were wrong to abandon the 'plural marriage' that its founder, Joseph Smith, had advocated. Krakauer's book is full of accounts of what that innocuous-sounding term conceals: in short, systematised oppression and child abuse spiced with healthy doses of religious insanity.
More recently, a woman who escaped from both her 'plural marriage' and the FLDS has published her insider's account of the cult. An excerpt from Escape by Carolyn Jessop appeared a few days ago in the Guardian, and it makes for enlightening -- and chilling -- reading.
It also contains a few elements of near-comical absurdity, particularly when Jessop explains the circumstances of how the sect's 'Prophet' had arranged her marriage at age 18 to a man more than 30 years her senior:
I later discovered that Merril had married into my family only to stop my father suing him over a business deal that had gone sour. More humiliating still, he hadn't meant to marry me, but my younger and prettier sister, Annette. When he asked the Prophet to arrange the marriage, Merril got our names mixed up.

So much for divine omniscience.
There is much that is worrisome in Jessop's narrative -- such as, for instance, the apparent fact that the local police are cult members themselves. But I think the most disturbing element in the story -- and this was something I felt about Krakauer's book too -- was the key role played by women in maintaining the cult's twisted ideology. Jessop describes being indoctrinated by her grandmother:
I had been blessed, Grandma taught me, to come into a family where generations of women had sacrificed their feelings to preserve the work of God. My sole purpose on earth, she explained, was to have as many children as possible. I would not fall in love and choose my husband like gentile women did; instead, God would reveal him through the leader of our community, the man we called the Prophet. [...]

Because I loved her so much and this was presented to me as absolute truth, it would be years before I would flee my so-called destiny.

And the reality of relationships among the 'sister-wives' is, unsurprisingly, a ruthless one:

Men were supposed to treat their wives equally, but everyone knew that a woman who was in sexual favour with her husband had a higher value than the others. Because she had his ear, she would be treated with respect by his other wives and her stepchildren. She might be exempted from physical labour or other family responsibilities. She could make sure that the wives she disliked were assigned the worst jobs. A woman who no longer satisfied her husband, meanwhile, was on dangerous ground. No wonder that when a new wife entered a family, her priority was usually to establish power with her husband sexually.
Merril tried to keep all of his wives pregnant because it suggested he had an equal relationship with each of us. But he was a polygamist in body, a monogamist in soul. Barbara was the only woman he ever loved. She took full advantage of it to dictate every detail of her sister wives' lives, right down to our diet.
Barbara and many of the other women in Jessop's and Krakauer's books seem to have little doubt that their severely restricted lives are divinely ordained. This belief not only ennobles their very apparent suffering but also ensures that their energy is channelled primarily into struggles with (and attempts to dominate) other women.
This aspect of the story reminded me of something Catherine Bennett wrote at the beginning of November, on the occasion of the Saudi king's official visit to Britain.
With the advance of young British veil wearers, proudly declaring their right to be invisible and their love of extreme modesty, this and many other forms of faith-related female subjugation have become complicated areas for liberal protest. If, as we're often told, many British Muslim women love their jilbabs, how can we be sure Saudi women do not also rejoice in their coverings, accepting, in the same dutiful spirit, total exclusion from civic life and physical chastisement by their devout partners? How can we be sure their would-be liberators are not - like women who adorn themselves and women who cut their hair short - just a few more Women Who Will Go to Hell?

I have no doubt that many (perhaps most) of the women remaining among the FLDS view Carolyn Jessop as just such a Woman Who Will Go to Hell. Is this surprising? Not really: as in practices such as female genital cutting or 'honour killing', women very often appear to be complicit in the oppression of other women (and themselves).
I can almost see Bennett's point about how this makes protest 'complicated' for liberals.
Because, surely, it doesn't make it all that complicated, does it?
Monday, December 17, 2007
There are various arguments in favour of allowing the civilian possession of firearms (some of which are rather good ones).
However, in the American case, I've always found the constitutional one to be the weakest. It is based upon the US Constitution's Second Amendment:
A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.
Whether taken literally or placed within its historical context, I have always thought that the Amendment deals not with the possession of weapons for any reason, but only in terms of supporting a well regulated militia.
This is not to say that weapon possession for other reasons should necessarily be prohibited, but it does suggest that any 'right' derived from the Second Amendment is limited to a specific purpose.
(An extensive argument against this position can be found here.)
In any case, the Second Amendment's meaning is being revisited by the U.S. Supreme Court as it considers a lower court's decision to strike down Washington D.C.'s strict gun law.
And, apparently, one of the key issues in this case will be...comma placement.
In 'Clause and Effect', Adam Freedman considers the issue, pointing out the argument of the judge who sought to overturn the D.C. ban:
The decision ... cites the second comma (the one after “state”) as proof that the Second Amendment does not merely protect the “collective” right of states to maintain their militias, but endows each citizen with an “individual” right to carry a gun, regardless of membership in the local militia.

How does a mere comma do that? According to the court, the second comma divides the amendment into two clauses: one “prefatory” and the other “operative.” On this reading, the bit about a well-regulated militia is just preliminary throat clearing; the framers don’t really get down to business until they start talking about “the right of the people ... shall not be infringed.”

A brief prepared by a group that agrees with the judge's decision has made a similar point:

Nelson Lund, a professor of law at George Mason University, argues that everything before the second comma is an “absolute phrase” and, therefore, does not modify anything in the main clause. Professor Lund states that the Second Amendment “has exactly the same meaning that it would have if the preamble had been omitted.”

In his essay, Freedman looks at the issue of 18th century comma usage and, by way of a little Latin, reaches the following -- fully sensible -- conclusion:

The best way to make sense of the Second Amendment is to take away all the commas (which, I know, means that only outlaws will have commas). Without the distracting commas, one can focus on the grammar of the sentence. Professor Lund is correct that the clause about a well-regulated militia is “absolute,” but only in the sense that it is grammatically independent of the main clause, not that it is logically unrelated. To the contrary, absolute clauses typically provide a causal or temporal context for the main clause.

The founders — most of whom were classically educated — would have recognized this rhetorical device as the “ablative absolute” of Latin prose. To take an example from Horace likely to have been familiar to them: “Caesar, being in command of the earth, I fear neither civil war nor death by violence” (ego nec tumultum nec mori per vim metuam, tenente Caesare terras). The main clause flows logically from the absolute clause: “Because Caesar commands the earth, I fear neither civil war nor death by violence.”

Likewise, when the justices finish diagramming the Second Amendment, they should end up with something that expresses a causal link, like: “Because a well regulated militia is necessary to the security of a free state, the right of the people to keep and bear Arms shall not be infringed.” In other words, the amendment is really about protecting militias, notwithstanding the originalist arguments to the contrary.

Not that arguments about punctuation are going to convince anyone who already has a strong view on this matter. But still, it's a well-written (and punctuated) essay.
Personally, my favourite collision between politics and punctuation must be Whig MP Richard Brinsley Sheridan's 'apology' to one of his colleagues in the House of Commons:
"Mr. Speaker, I said the honourable member was a liar it is true and I am sorry for it. The honourable gentleman may place the punctuation where he pleases."
(Courtesy: Eigen's Political & Historical Quotations.)
Meanwhile, the 'word of the year' in Germany (as chosen by the Society for the German Language) was 'climate catastrophe' (Klimakatastrophe).
And, in the German-speaking part of Switzerland, it was 'death tourism' (Sterbetourismus).
As I've written, it may be that the Germans have a real problem with happiness, at least in its more w00tful varieties. Of course, this year's favourite words might just be a fluke.
Interesting runner-up in Switzerland: 'Taschenmunition' (literally 'pocket ammunition', or, the ammunition that each Swiss soldier must keep at home).
Bizarre runner-up on the Webster list: 'sputum'.
(Via Atlantic Review)
Saturday, December 15, 2007
This struck me first on Tuesday, while I was perusing a copy of the Times that someone had abandoned in the Eurostar. Strikingly titled 'Why the human race is growing apart', it quotes one of the researchers:
“Human races are evolving away from each other,” said Henry Harpending, Professor of Anthropology at the University of Utah, who led the study.
“Genes are evolving fast in Europe, Asia and Africa, but almost all of these are unique to their continent of origin. We are getting less alike, not merging into a single, mixed humanity.
“Our study denies the widely held assumption that modern humans appeared 40,000 years ago, have not changed since and that we are all pretty much the same. We aren’t the same as people even 1,000 or 2,000 years ago.”
Hmm, I thought.
Having spent a fair amount of effort trying to come to grips with the connections among evolution, psychology and history, I found this gave me much to think about. This is particularly so as I have been largely convinced by the argument of evolutionary psychologists that human nature -- while not completely unchanged in the last dozen millennia -- remains shared enough to speak of the 'psychic unity' of Homo sapiens.
(I discussed this in an article published earlier this year. There were two responses -- by Martin J. Wiener and Barbara H. Rosenwein -- to that essay, and they, along with my response-to-the-responses, have just appeared in Cultural and Social History.)
In any case, over the next couple of days, I received a few helpful e-mails from friends who know about my interest in such things, pointing me to other stories on the study. One of them came from a fellow historian who has become quite enthusiastic about the notion of 'recent' biological change influencing behaviour. He sent me a link to the Los Angeles Times report on the study, which opens:
The pace of human evolution has been increasing at a stunning rate since our ancestors began spreading through Europe, Asia and Africa 40,000 years ago, quickening to 100 times historical levels after agriculture became widespread, according to a study published today.

Hmm, I thought again, sitting in a London internet café and having relatively little time to do any follow-up.
I finally had a chance today to examine another story on the study, in the New York Times.
It opens with this...
The finding contradicts a widely held assumption that human evolution came to a halt 10,000 years ago or even 50,000 years ago. Some evolutionary psychologists, for example, assume that the mind has not evolved since the Ice Age ended 10,000 years ago.

...which is a rather curious statement, since I can't think of any reputable biologically-aware researcher in any field who thought that 'human evolution came to a halt' at some point in the past.
The argument evolutionary psychologists make tends to be that -- while there has undoubtedly been 'recent' genetic change (lactose tolerance and disease resistance being prime examples) -- the relative influence of these changes compared to those during the much longer Pleistocene is probably minimal.
The breathless quality of the reporting on this is also somewhat odd, as the 'acceleration' doesn't seem all that surprising: since human populations became significantly larger, there will be more mutations on which natural selection can operate; as human populations were also inhabiting more diverse environments, this could have subjected some populations to new selective pressures. As the LA Times piece notes,
the research team was able to conclude that infectious diseases and the introduction of new foods were the primary reasons that some genes swept through populations with such speed.

I might be missing something, but even my non-expert understanding of evolution would lead me to expect exactly this result.
So, in some way, I'm not so sure what the fuss is about.
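The 'larger populations mean more mutations' point above can even be put as a back-of-the-envelope calculation. The figures below are purely illustrative assumptions on my part (a rough per-person rate of new mutations per generation, and rough pre- and post-agriculture population sizes), not numbers from the study itself:

```python
# A toy illustration of mutation supply: the expected number of
# brand-new mutations entering a population each generation is just
# the population size times the per-person mutation rate. All
# figures here are assumed for illustration, not taken from the study.

MU = 100                   # assumed new mutations per person per generation
N_PLEISTOCENE = 100_000    # assumed pre-agricultural population size
N_HOLOCENE = 10_000_000    # assumed population after agriculture spread

def mutation_supply(n: int, mu: int = MU) -> int:
    """Expected new mutations entering a population of size n per generation."""
    return n * mu

before = mutation_supply(N_PLEISTOCENE)
after = mutation_supply(N_HOLOCENE)
print(before)          # 10,000,000 new mutations per generation
print(after // before) # a 100-fold larger supply of raw material for selection
```

Under these (assumed) numbers, a hundred-fold increase in population size means a hundred-fold increase in the raw material on which selection can act -- which is why an 'acceleration' after agriculture seems less like a bombshell than the headlines suggest.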
But while it is clear that different environments' variations (in plant-life, sunlight, disease prevalence) might lead quite directly to 'rapid' change (remember, we're still talking vast amounts of time here), it is not clear to me why psychological change would be quite so rapid. The key 'environment' in terms of psychologically-relevant genetic change would be other people, and, by and large, it seems that on this point, the differences among different regional populations would not necessarily push human development consistently enough in one direction or another.
Most of the articles I've seen on the 'accelerating evolution' issue haven't really discussed a psychological angle on this, but it certainly is lurking there, particularly in the wake of James Watson's comments regarding intelligence and Africa. Harpending, moreover, was also co-author of a study claiming a recently-acquired genetic basis for high levels of intelligence among Ashkenazi Jews.
The New York Times raises cautions that some other articles missed in their apparent enthusiasm to proclaim significant genetic change in recent historical eras:
David Reich, a population geneticist at the Harvard Medical School, said the new report was “a very interesting and exciting hypothesis” but that the authors had not ruled out other explanations of the data. The power of their test for selected genes falls off in looking both at more ancient and more recent events, he said, so the overall picture might not be correct.

Similar reservations were expressed by Jonathan Pritchard, a population geneticist at the University of Chicago.

“My feeling is that they haven’t been cautious enough,” he said. “This paper will probably stimulate others to study this question.”

As it should. But as the NYT piece also points out, the methodology used cannot firmly establish what happened in the last 10,000 years or so:

The high rate of selection has probably continued to the present day, Dr. Moyzis said, but current data are not adequate to pick up recent selection.
(This point is also made by the graph included with the article.) Which makes all of the speculation about changes within the last thousand years or so a bit more...well, speculative than it sounded in the other articles.
This whole discussion inspired me to revisit Gregory Clark's argument in A Farewell to Alms, which I commented upon at length a while ago. (Part one. Part two.)
One of Clark's key arguments (simplifying somewhat) suggests (though rather vaguely) that rapid economic development in England in the early modern period was significantly influenced by genetic predispositions toward bourgeois values that were transmitted through English society by the fact that the wealthy out-bred the poor.
Via Clark's website, I found a couple of very readable critiques of his work. (There are a wide range of opinions on his work on offer there, and Clark deserves some credit for bringing them together. Of course, no publicity is bad publicity...)
In 'The Son Also Rises' (pdf) at Evolutionary Psychology Laura Betzig points out the weakness in Clark's argument that enhanced reproductive success by wealthy English led to an economic advantage due to the spread of middle-class values. First, the English rich were far from unique in this regard:
Clark knows that civilization began thousands of years ago, somewhere around Babylon; and he devotes a full chapter to the question, “Why England? Why not China, India, or Japan?” Why weren’t the Near East and Far East the best candidates for the natural selection of a hard-working middle-class? Because, he says, civilization in and around Babylon was more “unstable” than in Britain; and because in China and Japan—it pains me even to type these words—“the demographic system in both these societies gave less reproductive advantage to the wealthy than in England.” Clark cites evidence that Qing emperors fathered only as many children as average Englishmen living at around the same time (pp. 89, 209, 271, Figure 13.4). But of course for Qing emperors, as for any other emperors, legitimate fertility was low: Chinese emperors, like Assyrian emperors, like all other emperors got heirs on just one empress at a time, their legitimate wives; but they got bastards on scores, or hundreds, of consorts. Who should have transmitted, if not the high ethical standards of their bastards’ fathers, at least their hard-working genes. So much for the evidence in A Farewell to Alms.

Wealth, status and reproductive success may -- unsurprisingly -- correlate, but the extent to which the 'values' associated with that success can be passed on genetically is another matter, as Betzig points out:
There are other gaps in the logic. I am aware of dozens of studies that show a relationship between reproductive success and wealth or rank...; but I’m aware of no study that shows a correlation between reproductive success and the “middle-class values” of patience, nonviolence, literacy, thoughtfulness, or hard work.

And, it goes without saying, for those values to provide any benefit, they have to be possessed in a society that rewards them, turning us to various 'institutional' factors (from cultural assumptions, religious beliefs, social organisation, political structures, etc.) that Clark so blithely dismisses.
It is also worth pointing out that even in relatively recent history, those 'values' have not been the only routes to success. As I commented before on an earlier paper by Clark, concerning the period 1250-1800 with regard to the available means of getting ahead in life:
One might be granted a peerage, for instance, for reasons that had little to do with capitalist success and rather more to do with simply being on the right side of a political squabble, making a good marriage or having success in war. (Just as losing one's wealth might have had to do with contingencies related to the above factors.)

'Wealth' and 'success' were being amassed in England in a variety of ways: slave trading, tobacco planting, empire building, monarch-bribing, textile weaving and goods trading (not to mention, at least in the earlier period, being good with a sword).

Which gene is it, precisely, that is going to promote success in all these different ways of getting ahead in life?

In her detailed critique of A Farewell to Alms, Deirdre McCloskey, I see, makes a similar observation. She divides Clark's argument into various elements and 'links', and as part of that discussion notes:

In light of Clark’s methodological convictions...the most embarrassing broken link is A, between “Rich breed more” and “Rich people’s values spread.” Nowhere in the book does Clark calculate what higher breeding rates could have accomplished by way of rhetorical change. It could easily be done, at any rate under his mechanical assumption about how the social construction of values works. Clark assumes that the children of rich people are by that fact carriers of the sort of bourgeois values that make for an Industrial Revolution.

To be sure, this is an odd characterization of the medieval or early modern relatively rich. A rich bourgeois of London in 1400 devoted most of his effort to arranging special protection for his wool-trading monopoly. His younger sons might well have taken away the lesson, repeated again and again down to Elizabethan England and Lou Dobbs, that it’s a good idea to regulate everything you can, and quite a bad thing to let people freely make the deals they wish to make. And a Brave Sir Botany who had stolen his riches, say, or was a successful courtier who had received them from Henry VIII dissolving monasteries, say, would not automatically, one would think, transmit sober bourgeois values to younger sons. A society that extravagantly admired aristocratic or Christian virtues could corrupt even a Medici banker into thinking of himself as quite the lord and yet also a godly son of the Church. In a similar way nowadays an extravagant admiration for the neo-aristocratic values of the clerisy corrupts the bourgeois daughter into scorning her father’s bourgeois occupation.

McCloskey's critique is worth reading, even if it's flawed. (She doesn't seem to know what a 'meme' is, for example: she sees Clark as promoting a 'meme' theory, but memetics actually tends to separate biological and cultural development. Also, she curiously lumps Clark in with Steven Pinker, whose arguments tend to be both much better and much different than Clark's.)
To be honest, I'm not really sure what to make of all this. It's not my field, and I'm in the position of anyone who wants to make use of the insights from another field: you can educate yourself on it to the extent that your time and interest allows, but in the end, you depend on particular experts who are able to translate those findings in a language that is generally comprehensible. Unfortunately, those experts always seem to be at war with other experts in their fields.
But language and the history of human behaviour are two things I know a little bit about, which is perhaps why I don't much like the leap from the molecular level to vast statements about how 'human races' are 'evolving away from one another'. (Particularly when such statements get picked up in the press, where they are presented far less carefully than they should be.)
What does this mean? How relevant are these differences to determining what makes us human?
I have the sense that the overwhelming amount of what we share is being drowned out by the emphasis on difference, as interesting as those differences might be. The two perspectives are not, of course, inherently mutually exclusive: evolutionary psychology's emphasis on a common human nature would seem to admit some variation, and the differences found by population genetics are only meaningful in relation to that commonality.
At a certain level of specificity, the genetic differences seem vast, but step back a bit and they're relatively less so. People in different regions might tend to have different levels of disease resistance, say, but their immune systems all work the same way. Which of these observations is more important in understanding what makes us human? I would say the latter (while admitting that understanding the former is important for many reasons).
When Harpending (quoted above) says that 'we' are 'not the same' as people even 1,000 years ago, what does that mean? I am, for instance, not the same as you (whoever you -- dear reader -- might be). But that appears to be a pretty meaningless statement on the face of it. Were people 1,000 years ago different in any fundamental way than today? If so, what? It seems rather premature to make claims like that based only on counting changes in particular alleles without actually knowing -- in nearly all those cases -- what effect those changes have (on, for instance, behaviour).
But perhaps more significantly, it's unfortunate to see what could be a subtle and complex synthesis of different perspectives and levels of analysis of our humanness (shared across many fields) being shunted aside by a set of competing one-dimensional, either-or answers to why things work the way they do: institutions vs. genes, materialism vs. culturalism, similarity vs. difference, universality vs. particularity.
Maybe this is because academic work is partly driven by the very normal (indeed, human) cravings for attention and status (and maybe by our innate tendency toward tribalism), which seem to affect every other human enterprise.
I just don't think that's necessarily a good thing.
A draft of the original paper (which is very technical) is available here.
Discussion and criticism of the article (also quite technical) can be found at Gene Expression.
A long article from the Economist on the study is also freely available.
Monday, December 10, 2007
First, a photo series from Der Spiegel, featuring a calendar made by doctors from a cancer research institute in Naples. (Click on the block of images about halfway down the article to see the photos.)
The interesting feature: the doctors appear in their underwear. It's far from being the sexiest calendar you could imagine (as the following photo suggests), but it certainly has a charm all its own. And is for a good cause.
Second, the first film showing the long-eared jerboa living in the wild (in fact, in the Gobi Desert) has made for mirthful viewing. Of course, there is more than a twinge of sadness involved, as they are also endangered (though rather less so than Italian doctors...though about as furry).
Still...if you're looking for a last-minute Christmas gift, I can think of a lot worse.
The site from which this adorable little picture comes (I know, I know, we can't just save the cute animals, but if you're looking for a poster child you're not going to choose something scaly or worm-like, are you?) is that of Edge (no, not that Edge, this one stands for Evolutionarily Distinct, Globally Endangered).
They seem like good people.
Till the weekend...
Friday, December 07, 2007
Not least since 'Ballard' was translated as 'Ball pool of broadcasting corporations' ('ARD' being the name for German state television), and bizarrely poetic text was generated such as:
Were not the dictators 20 in vain. Century fascinates in such a way of it.

and
The modernism lets dark impulses come upward, those into us schlummern. It gives no area to the unexplainable one, the Mysterioesen - and produces straight therefore a return to the Barbarei.What my version lacks in poetry, is hopefully made up for in clarity.
(Thanks Simon. Thanks Alexander.)
'Bringing the Past to Heel: History, Identity and Violence in Ian McEwan's Black Dogs', Literature and History, 16, no. 2 (Autumn 2007): pp. 43-56.
We plan on it not being the last.
What's it about? Well, according to the abstract, it goes something like this:
Ian McEwan's 1992 novel employs postmodern understandings of history while also critiquing these same perspectives. In particular, by depicting the efforts of its protagonist, Jeremy, to write a memoir of his parents-in-law, it draws attention to the subjectivity of historical writing. While this quality has led some critics to condemn the novel for its escapism and amorality, the authors of the essay argue that Black Dogs is a statement about the necessity of history rather than its futility. Indeed, they read the text as a dramatization of humanity's ability to bear rather than escape the often troubling burden of the past and an endorsement of the writing of history despite the awareness that historiography, while serving deep-seated human needs, is always problematic.

This essay has been some time in the making (and in the publishing...), but since we worked on it, I have to say that my appreciation for the novel has only increased. It is, I think, one of McEwan's lesser-known ones.
It shouldn't be.
Tuesday, December 04, 2007
In it, Obscene Desserts idol Robyn Hitchcock (beloved not least since he sings songs about trilobites and various other, as Leviticus puts it, 'creeping thing[s] that creepeth upon the earth') comments on his own Led Zep connection.
Which is an intriguing one: apparently, John Paul Jones has been touring with Robyn (at least in Italy, Norway and, um, Dorset) and playing the mandolin.
'These days,' Robyn observes, 'his musical compass points to bluegrass.' Who'd have guessed?
And he offers a nice anecdote:
Like true Englishmen, we eyed each other suspiciously at first, but, after about five years, rang and arranged to meet for coffee. Hot drinks being the catalyst they are, within another two years we were flying, economy class, to Italy to perform as an acoustic duo.

Alitalia having missed our connections and lost our gear, John looked surprisingly unwistful when a fan asked him to sign a photo of Led Zeppelin standing in front of their private jet.

A lesson in true humility if ever there were one.
For the somewhat brightened outlook, I have also to thank Ario, who offered a lovely Nina Simone song in response to my apocalyptic musings.
And then, last but not least, there's the American intelligence community, whose new National Intelligence Estimate -- while not entirely reassuring -- appears to be somewhat less immediately distressing than recent statements from the US government. As the New York Times reports:
Iran is continuing to produce enriched uranium, a program that the Tehran government has said is intended for civilian purposes. The new estimate says that the enrichment program could still provide Iran with enough raw material to produce a nuclear weapon sometime by the middle of next decade, a timetable essentially unchanged from previous estimates.

But the new report essentially disavows a judgment that the intelligence agencies issued in 2005, which concluded that Iran had an active secret arms program intended to transform the raw material into a nuclear weapon. The new estimate declares instead with “high confidence” that the military-run program was shut in 2003, and it concludes with “moderate confidence” that the program remains frozen. The report judges that the halt was imposed by Iran “primarily in response to increasing international scrutiny and pressure.”

The report, no doubt, will be bickered over in the usual noisy way, and what it all adds up to is, as ever, a bit murky. (And are these not the agencies -- at least some of them -- who dropped the ball on predicting the collapse of the Soviet Union, preventing September 11th and proving the presence -- or lack thereof -- of Iraqi WMD?)
Now, it's far from prophesying an imminent Age of Aquarius, but the fact that 'international scrutiny and pressure' has had an effect is encouraging. (And a good argument for more of the same.)
Apocalypse delayed, then.
'Tis the season to be jolly.
Sunday, December 02, 2007
These are indeed good times for bad times, and the sheer variety and seriousness of crises (whether emergent or already hatched) seem -- rather like extreme weather, CO2 emissions or nuclear proliferation -- to be accelerating at a dizzying rate.
As is the amount of discussion about them.
This is hardly an, ahem, earth-shattering observation; however, I thought I'd take a moment on this rather gloomy Sunday (here where we are) to point out a few worthwhile stops on the apocalyptic reading express.
If the recent IPCC report wasn't intimidating enough, Spiegel points to more recent research that, if anything, suggests it may have been overly optimistic in its conclusions.
From a somewhat different but most likely not unrelated angle, Tom Englehardt, in 'As the World Burns', raises some very good questions and suggests that 'peak water' might be just as serious a problem as peak oil.
Not only do we learn the intriguing fact that last month the governor of an American state led a crowd gathered in one of that nation's leading cities to pray for rain (there's faith-based policy making taken to its illogical conclusion), but Englehardt provides a lot of very useful links to some of the rare serious reporting on the big picture of water scarcity.
The problems in question, as he points out, are not only confined to developing countries (though these, of course, will most likely suffer more, having fewer opportunities for reacting to shortages), but are also a serious (and possibly urgent) issue in more economically advanced countries.
"Resource wars" are things that happen elsewhere. We don't usually think of our country as water poor or imagine that "resource wars" might be applied as a description to various state and local governments in the southwest, southeast, or upper Midwest now fighting tooth and nail for previously shared water. And yet, "war" may not be a bad metaphor for what's on the horizon. According to the National Climate Data Center, federal officials have declared 43% of the contiguous U.S. to be in "moderate to extreme drought." Already, Sonny Perdue of Georgia is embroiled in an ever more bitter conflict - a "water war," as the headlines say - with the governors of Florida and Alabama, as well as the Army Corps of Engineers, over the flow of water into and out of the Atlanta area.
In my own admittedly limited search of the mainstream, I found only one vivid, thoughtful recent piece on this subject: "The Future Is Drying Up," by Jon Gertner, written for the New York Times Magazine. It focused on the southwestern drought and began to explore some of the "and thens," as in this brief passage on Colorado in which Gertner quotes Roger Pulwarty, a "highly regarded climatologist" at the National Oceanic and Atmospheric Administration:
"The worst outcome?. would be mass migrations out of the region, along with bitter interstate court battles over the dwindling water supplies. But well before that, if too much water is siphoned from agriculture,farm towns and ranch towns will wither. Meanwhile, Colorado's largest industry, tourism, might collapse if river flows became a trickle during summertime."
Mass migrations, exfiltrations? Stop a sec and take in that possibility and what exactly it might mean. After all, we do have some small idea, having, in recent years, lost one American city, New Orleans, at least temporarily.
Yes, and that 'small idea' is hardly encouraging, is it?
But the thought that has been preoccupying me a bit more recently is that we may actually be lucky if the multifarious results of global warming are the worst thing with which we have to grapple.
Ron Rosenbaum, in 'Talkin' World War III', has done a nice job of rekindling all those nuclear war nightmares that used to keep me up at night as a teenager in the 1980s. There is part of me that thinks that at least a portion of the fear of the 'Islamic bomb' is sabre-rattling hype aimed at creating the kind of environment in which armchair strategists can enthuse publicly about the invasion of Iran -- or the US occupation of Islamabad -- and be seen as thoughtful policymakers.
Then there is the other part that realises that this issue -- while one that can be used for all kinds of wrong reasons -- is not something merely cooked up by the Global Neo-Con Conspiracy but is also a genuine threat. I don't have all that much trust in the wise use of the American bomb (or the British and French ones for that matter) let alone those belonging to Russia and China. But, as Rosenbaum points out, things are in many ways worse than in the bad old Cold War days (which were bad enough):
Let's pause here for a bit of comparative nightmare-ology. Not to diminish the horror of a "next 9/11," but 3,000 died that day. At the height of the Cold War, the estimate for the number of killed in a U.S.-USSR nuclear war ranged from a low of 200 million to a high of everyone, the death of the human species from an Earth made uninhabitable by nuclear winter. Or, as one nuclear strategist once memorably put it, "the death of consciousness."

I can't say whether Rosenbaum's evaluation is overly pessimistic or not. Nonetheless, I can't really see that it's anything but fundamentally correct in identifying the problem.
It didn't happen back then, in part, we now know, because of blind luck (misleading radar warnings on both sides that could have been, but weren't, taken as signals for launch). And because back then, despite the madness of Mutually Assured Destruction deterrence doctrine, there were only two main players, both semirational monoliths with an interest in their own survival.
Now, there are at least eight nuclear nations and who knows how many "nonstate actors," as the euphemism for terrorist groups goes. And some of these nonstate actors have adopted an ideology of suicidal martyrdom, even when it comes to nukes, and thus can't be deterred by the reciprocal threat of death.
The solution, of course, is something else. And, much as with global warming, I'm rather sceptical that there is a realistic one that doesn't have serious downsides.
Unless you think that the world is going to display the kind of cooperation and self-denial that, at least to my eyes, has been less than common in its history.
This is shaping up to be a very fun century.