Monday, July 10, 2023

On Reading History: Hannah Arendt vs. The 1619 Project


I've been reading a bit about the 1619 Project. Slavery and its centrality to American colonial life do get short shrift in American history as usually taught, so it's cool to have something that focuses attention on that central factor.

I'm interested in the perspective with which we read stuff like this, though. A LOT of people seem to see the 1619 Project as some sort of essential corrective to an American history taught as just a hero story.

But I went to Philadelphia Public Schools in the 70s and 80s--hardly a hotbed of cutting-edge pedagogy-- and we weren't taught the straight-up hero story. Remember the Aztecs and Incas and the conquistadors? Remember the triangle trade? Remember the struggles over slavery and how to count slaves at the Constitutional Convention? Remember the Trail of Tears? Remember this all leads up to the Civil War? (All, by the way, included in my middle-school-and-earlier history classes.)

Clearly not, in many cases. But not because these things were never taught, I bet, but because you weren't paying attention.

The 1619 Project is not so much an antidote to previous attempts to teach you history as to your imperviousness to them. When history has a whiff of radical chic, NOW you pay attention.

It is interesting to me, too, HOW we seem to absorb the 1619 Project. It's all about our judging the past by our present-day criteria. And that, certainly, is part of what history should be about.

But when that's all we do, it really devalues history itself. We can look for exemplars and moral tales and things to be proud of and guilty about everywhere. We can make them up (that's called fiction, with exemplary honesty).

If we let go of the absolute primacy of ourselves and our present-day moral and ontological crises for a bit, we can reap other, I'd say greater rewards from history. One of them being, to a degree, appreciating someone else's, radically different, perspective.

1619 doesn't arise out of 2023. People living and acting in 1619 had absolutely no notion of us or the perspective we bring to their stories. What they knew was 1619 and before.


To us, 1619 is important because of the lessons it teaches about who we are, where we come from and how we failed to live up to our ideals from the word go. To those in 1619, we don't figure into the equation at all. And there was absolutely nothing new about failure to live up to ideals. We'd always been doing that. All of us.

What are ideals for but to fail at them?

And the barbarity of colonization was nothing new either. To anyone. Very few people welcomed the conquering Aztecs to town. The Mongols weren't particularly nice imperial masters. The crusaders were hardly justice warriors so much as savage pirates. The Barbary Corsairs were, literally, pirates and slavers. Rome wasn't a good neighbor to have. Or China. Or the Songhai empire.

Taking whatever you could wrest from someone else; slaughtering people whose presence was inconvenient to your purposes or who just happened to be there when you were in a mood for slaughter; rape; wanton destruction and enslavement were not everyday things in 1619, but certainly they defined the horizon of awful things that could happen to you. And they actually happened with a fair degree of regularity.


None of that--theft, rape, murder, slavery, vandalism, genocide--was new to America in 1619. What was innovatory was the scale on which these things could now be done with a slowly industrializing Europe on one side of the Atlantic and three newly-within-easy-reach continents ripe for ruthless exploitation. *That* situation was relatively new.

The importance of looking at history like America's history with slavery and, later, racial exploitation and hatred is not just for us to observe how far we fell from our present-day standards. Or even how far we fell from contemporary standards. But also to see the degree to which these failures were really just another day at the office for humankind. These failures are a big part of what we do and have always done. Except perhaps in 1619 we had upgraded from the mimeograph to the Xerox machine.

Years ago, reading Hannah Arendt's The Human Condition, I was struck by how matter-of-factly she asserted that in classical times everything that we would call "civilization"--philosophical contemplation, great speeches, debate about matters of public interest, great public deeds, the writing of history--were all built on the foundation of excess wealth created through the ruthless exploitation of slaves and women. 

Civilization requires economic excess and the only real place to get it before modernity was by exploiting the labor of other people. And the greater footprint the civilization left, either in terms of archaeological or intellectual remains, the greater the exploitation.

The development, dissemination and mass inculcation of the very ideals that we still so fabulously fail to live up to were all done through usually ruthless exploitation.

Now I say this not to let us all off the hook of trying to live well through our post-colonial legacy in the West. We still have to do that, as best we can. But one of the essential contexts to see 1619 in is pre-1619. 1619 makes a convenient beginning for a story we are still living through today, but it was also the continuation of a story that goes back much further than that, a story at least as ugly and one that implicates us even more than the one beginning in 1619. A story that we also have to find ways to internalize.

Thursday, May 29, 2014

The Enduring Stupidity of American Foreign Policy Discussion

President Obama gave what was hyped as a "major foreign policy statement" before the graduating class at West Point this past week. Given the strong realist streak in Obama's overseas maneuvering to date, I thought that this would either be a huge redefinition of terms or a complete and utter non-event. It turned out to be the latter.

This has given a lot of folks who are otherwise critical of Obama, both left and right, a chance to be further critical of him. And it gives me the chance to point out that it was pretty stupid to set himself up for this fall.

Unless somehow his message of extreme moderation slips through the chorus of mostly nonsensical caterwauling and reaches the public. It is possible.

What amazes me about foreign policy discussion in this country is the extent to which it is dominated by people who say little that isn't complete & utter bullshit. I've written before (mostly here and here) about the essentially theatrical attitude many purportedly expert commentators take toward foreign policy, in which foreign policy is all about cutting a figure, creating an image that is pleasing to the American public (and, allegedly, one that is terrifying to our enemies), whatever the reality behind it.

This is essentially a cynical attitude: Americans know little about the rest of the world and don't care to know very much. The *reality* of our involvement vis-a-vis the rest of the world is not something they are likely to become cognizant of, at least in the short term. The image of a forthright and defiant American President, gazing off into the middle distance with a light wind rippling his hair. That's something that'll reach them. Albeit briefly.

Obama's hairstyle is not the only thing that makes him ill-suited for this role. He had his chance to rework this image after his own predilections once he was elected. He had, after all, done a bang-up job of reworking our image of the presidential candidate over the prior year. But there was a pesky financial crisis and ensuing recession to deal with, and he always seemed a bit . . . mmm . . . ill at ease with the purely theatrical elements of politics. His middle-distance stare always had an element of "I wish I were anywhere but here" to it, rather than the "I really LOVE me, and so should you!" or "They really LOVE me, and so should you!" that we see in leaders who have really mastered the art.

At heart, Obama is a tinkerer and improver. A realist. A man who hates stupidity and waste and who has had plenty to do just unwinding the commitments left to him by the prior administration, which came to be defined by stupidity and waste.

Translating what he really feels about foreign policy--that he doesn't hold with any religious mission at the core of American foreign policy, that he doesn't hold with fomenting paranoia (and the ensuing overseas commitments) as a civic distraction, that he realizes America is not omnipotent and intends to behave accordingly--telling that truth would require confronting a whole host of contradictory feelings America has about itself. That we can do anything! That we're terribly and unjustly overburdened. That we're the best! That we've been in decline throughout living memory! That everyone needs to do what we say! That we don't feel like making any sacrifices.

The usual talking heads we see complaining about the Obama foreign policy are mostly would-be directors of a much better political/theatrical presentation than the one currently on offer. They are interested in *exploiting* these contradictions in our attitudes.

Someone we don't like ruling Syria? Well how can Obama possibly let that happen? He is weak!

But this is a very easy complaint to make. The suggestion being that a "strong" President never has to live for long with anything he doesn't like. But is this true? Of course not. And when asked particularly about what they are suggesting ought to be done, the essential actions are always either in the past or nebulous in the extreme. Apparently Obama didn't sacrifice the right chicken to the right god or something, and if he had, Bashar al-Assad would have been smothered in his crib. Or something. I can never quite figure out what is being suggested.

Which is nothing new, except for the fact that the journalists who cover this sort of thing seldom push the questioning to the point where real answers must be forthcoming. In fact, they are complicit in mystifying foreign policy, because that gives their sources (and themselves) an aura of privileged knowledge (however nebulously presented) for access to which we are dependent on them.

When in fact, foreign policy is largely about learning to lose well. It's about acknowledging where you can't win or can't win without undue risk. It's about scooping up an easy win here or there and putting your effort and taking your risks where they really matter. It's about playing a high stakes game. It isn't primarily about principles and it isn't theater. There are disagreements about how that high-stakes game should be played, about whose bluffs should be called when; about when risks are justified.

But those whose image of foreign policy seems to be Wagnerian bellowing from the "World Stage"--well, there's a reason they aren't at the table, even in their childish imaginations!

Saturday, May 24, 2014

The Irish Pub: Drinking and the Irish

OK: More cliches.

The Irish are a bunch of drunkards.

Well, not really, but the Irish (I'm mostly of Irish extraction, btw. Obviously.) have had an interesting and complex relationship with booze. Alcohol is very much woven into a lot of Irish sociality. Whiskey is customary at get-togethers in Ireland much like vodka is in Russia. Just recently in Ireland I've seen the whiskey broken out on a few occasions when cousins or old friends meet around a kitchen table. And neighborhoods and villages have traded gossip and lamented and shared news and talked politics over pints in pubs.

There have, of course, been problems. The Irish journalist John Waters has written that “Drinking in Ireland is not simply a convivial pastime, it is a ritualistic alternative to real life, a spiritual placebo, a fumble for eternity, a longing for heaven, a thirst for return to the embrace of the Almighty.”

The drunken Irishman, like the Irish pub, is a cliche, but not just a cliche. There's a truth behind it. During the long struggle for independence, Irish patriots often lamented their bibulous countrymen. And the Irish middle class, even when they were not slavishly emulating the English middle class, went to great lengths to separate themselves and their customs from the taint of alcohol. A whole host of problems seemed to go hand in hand with Irish drinking, both in Ireland and abroad: unemployment and underemployment; gambling; domestic violence and other crime; prostitution; disease.

These aren't just sepia-toned 19th century problems, either: Irish drinking is on the rise in the 21st century, and many of these associated problems are still felt quite acutely. There's actually a bit of a moral panic going on about it in Ireland right now. (Perhaps symptomatic, but very well done, really: The Irish Times' Sobriety Diaries.)

Given all these historical and contemporary problems, you'll see in certain quarters a resentment that Irish social life should be so closely associated with the pub. Or that Irishness itself should be so associated with it. You'll see people who are saying "good riddance" to the pub.

I am not among these. While I fully recognize the problems inherent in tying social life and personal life so closely with dispensaries of alcohol, I also love and long for the sort of "third space" that pubs have always been at their best. And I also highly value the pleasure and the dis-inhibiting influence of communal drinking. These things are not to be discounted, I believe, even in the face of all the obvious problems that drinking leads to for many among us.

More later.

The Irish Pub

http://upload.wikimedia.org/wikipedia/commons/5/56/Paddy_Foley%27s_Irish_Pub_Dresden_2.JPG
Paddy Foley's Irish Pub. Schandauer Straße 55, 01277 Dresden, Germany

I'm just back from a trip to Ireland, which was wonderful in a lot of ways and a bit worrisome in some others. One worrisome thing was the state of the Irish pub.

The Irish pub is something of a cliche here in America. Literally thousands of corporate and not-so-corporate bars play to the image of the Irish pub as the beating heart of the community; the center of boozy sociality and fun. I once worked for a man who made a fortune supplying bric-a-brac for these and similarly themed bars.

But there was a truth behind the stupid cliches: pubs in Ireland were often vital social centers for the communities in which they resided. Being in and participating in the din of conversation in a smoky, too-warm pub with a pint in front of you was always one of the highlights of any trip to Ireland.

But Ireland and its pubs have undergone many changes since I was there last in the mid-nineties. Ireland is, decidedly, part of Europe now. The economic boom I saw happening then has busted, and there has been a further boom and bust in the real estate market. Ireland is no longer a country remarkable for being mostly empty and is no longer remarkable for the number of its younger people working abroad. Those younger people stayed home in the nineties to take advantage of the economic good times, and after that prospects elsewhere were hardly sunnier than prospects at home.

The population, which had stayed at about three million for decades, is now steadily closing in on five million. The change is most obvious in Dublin, which has become a truly expensive cosmopolis, but every county in the Republic is seeing substantial rises in population.

I've lived in and visited many bustling towns--New York City, Philadelphia, London--but today's Dublin may well take the cake as the most harried-feeling of them. This is indicative of some great changes in Irish culture, but also of some lingering elements of the old Ireland.

One reason why Dublin is so harried is that the people seem to want to live up to their self-image of Dublin as a modern, bustling "world city." But another reason seems to be that the Irish feel that their current economic difficulties and austerity are their just deserts for having so enthusiastically embraced Mammon. Guilt is a far stronger current in Ireland's response to the economic travails of the last 15 years or so than America's. To be harried and hurried is a sort of ritual penance for having enjoyed wealth, for having central heat, for having borrowed money, for having used a credit card, for having eaten pate.

This guilt, played upon pretty nakedly by the government's "get working" campaign, shows how, in some ways at least, the new Ireland isn't so different from the self-hating Ireland of the past. (Irish self-hatred, while not a big part of the cliche-Ireland we see so often, is an absolute fixture of Irish literature and cinema.)

In the pubs, though, change is obvious. The older locals--the former beer-soaked, urine-smelling centers of neighborhood sociality--are now beer-soaked, urine-smelling centers of not much. Young folk don't frequent the locals as their elders did before them. This is partially because there are so many young folk around (the 15-30 cohort is significantly larger than, say, the 35-50 cohort) that they have developed a pretty strong generational consciousness. This big cohort also coincided with a number of great changes in Ireland--the growing realization that Ireland could be part of Europe, and not just a former part of the decaying British Empire; and the growing realization that Ireland need not be poor and backward, that Ireland could be a place where others came seeking opportunity, not the place that supplied cheap labor to construction crews in London and Irish-themed pubs in America.

Forgoing the traditional pub is, apparently, part and parcel of becoming the new Ireland. But the new Ireland is a bit soul-less, and, in many ways seems to partake a little too strongly of the guilt and self-hatred of the old Ireland. Slavishly becoming more European or more American upper-middle-class is really little better than slavishly becoming more English. In fact, in some respects worse.

The old Irish pub that I knew--with Guinness and strong lager as the near-exclusive beverages, with damn-near everyone smoking, with windows shut with a universal and absolute determination against fresh air (known in Ireland as a "draft")--was hardly a paradise. And it wasn't really "traditional," if what you mean by that is unchanging. The lager of the 80s and 90s was a change made in response to changing consumer preferences, as were the video game machines, the televisions, the formal music events, etc., etc.

The pub as vital social institution always meant that the pub changed with the society around it. And while Irish society pre-1990 changed more slowly than it seems to now, it was not unchanging. But the rate of change does now seem to be overwhelming the old institution, and some of the changes happening inside of pubs seem to militate against its social role in a way that, say, the introduction of lager did not.

Can a genuine & distinctive entity known as an Irish pub survive when the TV is omnipresent and always on? Can it survive when loud music makes conversation impossible? Can it survive when there is so little distinctive or communal about it? Can it be reinvented to fulfill more or less the same social role in a new social setting?

I've ordered a (seemingly pretty pessimistic) book called A Pint of Plain by Bill Barich on the history & decline of the Irish pub to help me along toward answering some of these questions and I'll be posting more here as soon as it arrives.

Monday, March 17, 2014

Political Theater III

Picking up on a theme I wrote about (wow!) ten years ago . . . the press and talking heads still seem to be obsessed with "signalling" and "messages" and "perceived weakness," as if global politics were Kabuki theater and not a pursuit of real, material advantages.

Part of this is because theater is something we all understand. We all watch TV and we all are familiar with the images of power and gamesmanship we've seen on shows like The West Wing. Very few of us are familiar with the real stakes in international politics and the cards in the hands of the multitudinous players. So it is far easier for us and far easier for our hired storytellers to tell the story of international conflict as a continuation of West Side Story, where there are no stakes to speak of, just posturing, pointless risks and revenge. Conflict boiled down to pure image is something we can all understand.

And there is another side to this as well. International politics, like poker, is about maturely accepting the right losses. For a long time the US had such strong hands that it seldom had to do this--we could chase most pots, because we had the good cards. Today, the good cards are spread around the table. We're still getting good hands, but we have to be a bit more careful how we play them.

This conflicts pretty seriously with our self-image, though. We usually think of ourselves not as a player at the table, but as above the game--as an officiator or policeman who governs the game. Well, we're not. We cannot control how other people play their hands. Which means that we will lose some pots. We have to accept this.

We also have to accept that the behavior of others is NOT a mere reflection of our behavior, of our "perceived weakness" or "perceived strength." The other players have hands that they can play and multiple interest groups and audiences to whom they are playing.

Did Putin invade Ukraine because he thought the US was weak? The way you answer that question isn't to ask "Well, do I think we look weak?" The question is about Putin, about Putin's interests and Putin's perceptions. So, assuming Putin wanted to act to keep Ukraine within his sphere of influence, what had he to fear from the United States? Chest thumping from the President? I am skeptical that Putin actually fears that. Armed intervention? Well, that'd be something to fear, but clearly that has never been in the cards anyhow. Not under Bush, not under Obama. Sanctions? Well, only a moron would have expected these not to come, and they will. And as the Europeans begin to really consider what a belligerent and expansionist Russia means to their interests, we will see those sanctions get pretty tough, particularly if Russia pushes its advantage in the region (they're already there, we & the Europeans aren't) too far.

So why now? Because we look weak now? No: because up until now, Russia had a compliant regime in place in the Ukraine. They had been using all sorts of covert means to ensure that was the case. Why didn't they intervene when the last pro-Western regime was in power? They did--refusing to sell gas to the Ukraine. But during the last pro-Western regime nothing of lasting change was made, as the pro-Western forces spent a great deal of time fighting each other.

Now the Ukraine–European Union Association Agreement is on the table. That could turn into a lasting change of orientation for Ukraine. That changes things. That makes Putin willing to seize the pretext of constitutional crisis in Ukraine to take what he can, in spite of the risk of sanctions. Taking this opportunity has shored up his power base at home and may get him a naval base and, perhaps, in the end, a satellite country to his west. The cost for all of this will be more or less the same cost he'd have paid under Bush--sanctions and a more hostile posture from the US and many of its allies.

Under either President, the maintenance of that hostile posture would have been difficult--the Europeans have complicated interests and fears in regard to Russia. Under Bush that would have been compounded by the fact that they despised him and his chest-thumping. (See, for instance, the amount of cooperation he attained for his Iraq invasion).

Bush's approach would probably have been to loudly declare that any Europeans not going along with the strictest possible sanctions regime were miserable appeasers, and we'd no doubt have seen plenty of tiresome clips of Neville Chamberlain and "Peace in Our Time." We'd have looked like "leaders." Administration officials would have taken every opportunity to bask in their Churchill moment, but such an approach would have distracted the Europeans: their hate and resentment of us coming to the forefront just as we'd like them to be focused on their fear and loathing of Putin. Our image: tough-guy defenders of freedom. Their image: vacillators. End result: Russia goes largely unpunished. So if image is your be-all and end-all, we win. But if the point is to discourage people like Putin from misbehaving . . . we haven't accomplished the goal, we've only covered over our failure with a lot of posturing.

Obama takes a quieter approach, and in this case that's all for the good. By NOT being the leader, by NOT calling attention to our brave opposition stance, by NOT calling any who disagree miserable appeasers, by NOT becoming the issue, he lets the Europeans concentrate on Russia and come to their own conclusions. And they are: Putin is scary.

Probable US image: lots of hand-wringing over not "leading." Probable European image: determined. Probable result: some pretty stiff sanctions and significant cooperation between the US & the rest of NATO to stem the threat posed by Russia.

If you are concerned only about image, this is bad. We aren't leading (to a failed outcome), we're merely influencing (to a desirable outcome).


Always playing to our self-image is like going all-in on every hand in poker. It's a stupid way to play. Image is only one factor in winning at this game. Over the long haul you win by advancing your interest with reasonable expenditure of resources. That means walking away from some pots; that means letting the bad guys win sometimes; that means letting YOUR calculation of interest determine your behavior, and not being a predictable slave to some notion of credibility that is easily manipulated by the other players.

All of the hand-wringing over our "image" and how, apparently, that's all that matters in foreign policy is hand-wringing by cynics (who know better, but who also know that most people DO NOT know better) and the many naifs who can only understand cheap melodrama, not actual politics.

Saturday, November 02, 2013

The problem with The Trolley Problem

Opened the latest Atlantic and finally read the piece I'd been putting off for some time--Robert Wright's essay on innate morality ("Why We Fight and Can We Stop" in the print version). I put it off because I often find Wright to be . . . umm, rather credulous, I guess is the phrase I want. He seems to believe strongly that information about the origins of a human trait is always immediately useful in guiding us as to how to deal with that trait. I think that's a hopelessly naive attitude, so I often find Wright's work to be more than a little patience-trying. On the other hand, he usually deals with interesting topics, and it's often pretty productive in making me try to figure out why I think Wright (or his subjects) has got something wrong.

For instance, in this piece Wright prominently features the work of Joshua Greene, who has helped make the Trolley Problem--an ethical thought experiment--famous. Here it is as conveyed by Wikipedia:
There is a runaway trolley barrelling down the railway tracks. Ahead, on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. Unfortunately, you notice that there is one person on the side track. You have two options: (1) Do nothing, and the trolley kills the five people on the main track. (2) Pull the lever, diverting the trolley onto the side track where it will kill one person. Which is the correct choice?
There is a variation:
As before, a trolley is hurtling down a track towards five people. You are on a bridge under which it will pass, and you can stop it by dropping a heavy weight in front of it. As it happens, there is a very fat man next to you – your only way to stop the trolley is to push him over the bridge and onto the track, killing him to save five. Should you proceed?
Many people opt to pull the lever but not push the fat man, in spite of the fact that the end results are the same. Some use this result to argue that human moral reasoning is essentially irrational, or that our moral reasoning has less to do with outcomes than it has to do with keeping ourselves above moral reproach:
. . . people who obey their moral intuitions and refrain from pushing the man to his death are just choosing to cause five deaths they won’t be blamed for rather than one death they would be blamed for. Not a profile in moral courage! 
But there is a big problem with this sort of thought experiment: it depends on the subject feeling a sense of certainty about outcomes (the five people on the track are definitely heading to certain death; the fat man's fall will definitely stop the trolley; the diverted trolley will definitely hit the one person on the track you divert to). Our life experience--the experience our brains have evolved to cope with--is all about dealing with unexpected contingencies. We very seldom face situations where we know for certain what the consequences of our actions will be, and our natural suspicion when faced with the fat man situation is not that we'll have one death blamed on us--it's that we'll have six. That's how we think, even when told not to. We are beings who have evolved and grown up to deal with unexpected contingencies--we actually come to expect them in a way, and we tend to act modestly as a result. That's what the Trolley Problem really points out. That's why we like Captain Kirk's solution to the Kobayashi Maru dilemma (he cheated): in life there are lots more contingencies, uncertainties and opportunities than there are in tests and experiments (experiments being designed to absolutely minimize all of these). By attempting to test real-world judgements with a controlled experiment, all the Trolley Problem does is reiterate the difference between experience and experiment.

Wednesday, September 25, 2013

The Two Cultures Again . . . again

Now this whole thread of writings and counter-writings around the idea of the relationship between the disciplines--science & the humanities, specifically--comes out of a Steven Pinker article in the New Republic. The article is a rambling advocacy piece for science as against the humanities.

Pinker has long been a cheerleader for science and has long had an axe to grind with the humanities. And, in fact, most of his points against the humanities go back twenty years or so. Some of them aren't really very current anymore. But since Pinker has a profitable sideline in attacking things like postmodernism, philosophical anti-science and political correctness, he has no interest in learning that none of these things is really very important anymore.

Is there some hostility to science out there still? Sure. But you'll find most of it attached to environmentalism, not postmodernism. Some streams of environmentalism are ambivalent about science because science has given man the capability to do a great deal more damage to the Earth than he otherwise could have wreaked. For those who, seemingly, value the ecosystem above all, this is a black mark against science--it has given dangerous tools to the baby (us), with predictable results.

And yes, there is ambivalence in academia and everywhere about science that one doesn't find, say, with history. Why? because history is not powerful. Even if one is inclined to think history is bunk, one doesn't fear it. Science, though, is power, and a lot of people in the humanities think that power needs to be constantly checked and curbed.

This current push for science to, essentially, colonize the humanities is but a case in point. Years ago Pinker wrote a book called The Blank Slate, which as a piece of argumentation is an utter piece of crap. Pinker, or more likely his graduate students, assiduously mined social science and humanities texts for evidence that human nature was ignored. Pinker then inserted the quotes, in many cases baldly misinterpreted them, and then railed against the notion that human nature was unimportant, past and especially present.

The story that Pinker doesn't tell you is that the concept "human nature" was a huge impediment to the advancement of the social sciences because it was an empty signifier--it meant whatever the speaker would like it to mean at the moment he (inevitably) spoke it. By dispensing with it, observers were able to move on to observing how social interactions, social customs and social institutions actually worked. And they could ask questions about the nature (or possible nature) of social groups as apart from individual human nature. No one ever believed that human beings were infinitely malleable, because that notion is absurd on its very face.

But at some point a radical feminist must have taken Pinker cruelly to task for his "essentialism" and he's been taking his revenge (on us) ever since. Frankly it's long since gotten beyond tiresome. Yes, every social science was said to be essentially an expression of "human nature." That's what thinkers in the 18th Century said pretty much reflexively. And emptily.

Today science has some solid things to say about human nature. But human nature is a big & complex thing, and most of science's observations are narrow and limited. And they often seem to contradict one another as far as their take-home message about "human nature" goes. And there seems to be some very sloppy thinking going on in science around this idea--sloppier and more dangerous than simply bracketing the term--agreeing that we can't agree on what the term means, so we can't use it to explain things--which is what the humanities have largely done. (On science's sloppiness in making determinations about human nature, see this piece from Jaak and Jules Panksepp and its sequel.)

Using "human nature" as a means to "settle" disputes in the humanities is frankly folly. While science has certainly learned some things about human nature since the 18th century, it is a long way from defining the term. A long, long way. The picture is incomplete. But what Pinker and other advocates of this idea are looking for is to use and expand the authority of science, even where science doesn't really have answers to offer. In fact, so far we've seen little new insight created in these fields by science. That's bound to change, but that's where it stands at the moment.

Mostly because, all heuristics and posturing aside, the humanistic tradition has never really abandoned the concept of human nature, and the observations of humanists about human nature are far better developed and more nuanced than science's are at the moment. The notion of human nature is, of course, still deeply fraught with conflicts, but science isn't about to settle those.

And special pleading by those who don't know the humanistic tradition--the very best repository of our experience and thinking and knowledge on the question--is no help.

The Two Cultures. Again.

Been reading a fair amount of Jerry Coyne lately. He writes the whyevolutionistrue blog. I am in full agreement with his two basic points: evolution is, indeed, true. And God, I'm afraid, is a myth.

And Coyne is a great scientist. His work on speciation is a big deal (haven't read it myself, but I'll trust other folks' opinion on that).


But (here's the but) he reads and writes like a bad college freshman. He doesn't recognize changes in voice, he can't understand subtle arguments or distinctions, he hunts through the works of others for selective quotes to condemn and mock regardless of the passage's intended meaning. In short, as a blogger, he's like a malignant growth on the positions he supposedly represents.


And now he figures that he should share the blessings of his blinkered and deficient worldview with the rest of the world . . . with the humanities in particular. Here he is arguing against someone proposing that the path of influence between the two cultures ought to be a two-way street:



I take issue with that on two grounds: scientists are so pressed for time that we can barely get our own work done and, more important, the potential benefit of philosophy to the conduct of science seems less to me than the potential benefits of infusing humanities with science—benefits described out by Steve [Pinker] in his New Republic piece.
I am not saying that philosophy or the humanities are without value. Far from it. What I am saying is that the marginal benefit of adding more science to the humanities is greater than vice versa. I personally absorb tons of what could be considered “humanities,” including literature, nonfiction, art, and philosophy. They’ve enriched my life immensely—but I can’t say with confidence that they’ve made my science better, or different.  I’d still have published the same work on speciation if I’d never read philosophy, although I wouldn’t be writing this website.  My benefits are personal, not scientific.

Though Coyne's benefits from the humanities may be "immense," I have to question how good a judge he is of his own case. For one thing, he can't read properly.


For another, we have to ask, aren't Coyne's humanistic deficiencies important to his blog, where he sets himself up as a spokesperson for science and atheism? And doesn't that blog trade on the authority he's gained in his scientific area of specialty in order to advance views that are more directly related to his rather incomplete humanist education than to his scientific expertise?


Just a small for instance (you can find these practically anywhere Coyne reads unsympathetically). . . here Coyne tries to refute Gary Gutting, who is advocating for more philosophy in science as well as more science in the humanities: 



Gutting: And to tell the truth, rather than speaking about the theory of evolution, it is more accurate to speak of the theories of evolution. The use of the plural is required here—in part because of the diversity of explanations regarding the mechanism of evolution, and in part because of the diversity of philosophies involved. There are materialist and reductionist theories, as well as spiritualist theories. Here the final judgment is within the competence of philosophy and, beyond that, of theology. . .
Coyne: What are those materialist and reductionist theories, much less the spiritual ones? I am aware of only one going theory of evolution, which, while it has its controversial parts, does not deal with “materialism vs. reductionism” much less “spiritualism.”
How does Coyne go from what Gutting wrote to "materialism VS reductionism?" Gutting says AND, clearly meaning to pair the two rather than oppose them. Why can't Coyne tell the difference between what people say and the stupid things he'd like them to have said? Because however life-changing his experience with the humanities has been, he is essentially a philistine. He appreciates the humanities, but he is no judge of their importance, either in his own life or in society, any more than I'm a judge of differential calculus. His very limiting of the impact of the humanities to the "personal" marks him as such.

Stephen Jay Gould had a bit to say (a lot, actually) about scientific philistinism, which he saw not just as happenstance, but as a pervasive and actively cultivated part of the culture:
Virtually every empirical scientist has a touch of the Philistine. Scientists tend to ignore academic philosophy as an empty pursuit. Surely, any intelligent person can think straight by intuition. . . . Although I will try to refute Bethell [an opponent of evolution generally], I also deplore the unwillingness of scientists to explore seriously the logical structure of arguments. Much of what passes for evolutionary theory is as vacuous as Bethell claims. Many great theories are held together by chains of dubious metaphor and analogy.  (Darwin’s Untimely Burial, 1976)
The humanities are about reading carefully, considering fully and expressing accurately. They probably play a small role at the workbench, but work isn't done at the bench for no reason--it's done to advance human interests. And when it comes to analyzing, evaluating, synthesizing and applying those bench results we often see scientists failing badly. Because they are "too busy" to have had a proper humanistic education.

Coyne's problems are indeed general. Steven Pinker, whose New Republic article he cites, is a repeat offender in the misreading, misinterpreting, misconstruing and miscontextualizing department. For which Coyne's former student and co-author H. Allen Orr (not a philistine) has repeatedly taken him down in print (see here and here). Pinker and Coyne--trading off of real or supposed scientific expertise--have become prominent spokespeople for science. Neither of them could hope for the sort of prominence they have on the basis of their at-best-shaky scholarship, injudicious writing and narrow-mindedness. But their science background, real or imagined, gives them authority in areas where they do not deserve it. And that is the danger of the one-way street of influence between science and the humanities. Scientists are, typically, deficient in some very important areas of human discussion and decision making. But they are completely without compunction in trading on their scientific authority in venturing into every other field that kindles their mild or passing interest. Even if they have little idea what that field actually does, how it works or what it's done. Like the humanities.


The potential benefits of more science in the humanities are definitely worth considering. But there are potential pitfalls as well, which Coyne doesn't see or doesn't care to see. And those pitfalls are already well on display in the work of Coyne and Pinker, who I fear are probably better representatives of the *best* than of the worst science has to offer at present.


More later.


Sunday, December 11, 2011

The Two Cultures

My title here is a phrase strongly associated with a controversy stirred up by CP Snow back in the Space Age, when many in the West feared we were falling behind the Soviets in science and technology. For the English and the American upper class, part of this worry was related to the bias of their higher education systems toward non-technical subjects. What Snow observed was that as our culture matured, the most highly educated people in the humanities tended to know scandalously little about science; and likewise, those with the greatest technical knowledge tended to be rather uncultured.

Snow was well aware that there were always and would always be exceptions to this general tendency. In fact, Snow himself was one of those exceptions, being both a respected novelist and a student of physics (he earned a doctorate in the subject). So the many takedowns of the Two Cultures idea you may find online that, say, point to JBS Haldane--look, a mathematically minded scientist who could write! who knew Greek! who wrote history!--as proof there aren't two cultures are completely missing the point. Haldane and Snow were exceptional. Snow didn't argue that it was impossible to inhabit both cultures--being who he was, how could he?--only that it was not the norm.

In speaking of Snow, we should also acknowledge how much his piece is of its time and place: Part of Snow's thesis, an important part, was his particular attack on the English school system as a source for the split between technical and cultural knowledge. (He actually holds up the American university system as a positive counterexample.)

However, the basic conflict he is observing goes much farther back, to the very beginnings of the modern world and the conflict between the ancients and the moderns in the 17th century. That conflict was essentially a conflict between ancient wisdom and modern, mostly scientific knowledge. Swift's Battle of the Books was a satiric look at this conflict which came down, as we might expect from a literary man who hated math, pretty heavily on the side of ancient wisdom. Over the long haul, though, science has mostly won this battle--shaping and changing our world to an extent that even Swift could hardly have imagined.

And, worse still from Swift's point of view, science is increasingly the arbiter of truth in our society.

But not the only one. Anyone taking a look at say, the controversies over global warming can see that science often has an uphill battle against "common sense" when its truths are inconvenient. Science is, no doubt, still the servant of our desires; though a servant upon whom we are as dependent as Wooster is upon Jeeves.

And yet our universities are--still!--filled with people who smilingly admit to incompetence in basic mathematics; who don't know anything about science aside from the fact that its advances sometimes harm the environment; people who criticize science but who cannot distinguish the real thing from ridiculous parody. If Bertie Wooster were a doctrinaire ingrate as well as an ignorant fool, he'd be the model for many of our present day humanities professors. But such a character could never win even the provisional sympathies of any reader.

Such a response to science does little honor to the tradition that Swift defended. As was the case in Snow's day, the worst offenders in the two-cultures business are on the side of the humanities. One is far more likely to find an articulate and cultured scientist than a scientifically knowledgeable humanities type. There are more Goulds and Lewontins and Orrs out there than there are George Levines.

Which is a shame. Because there are also scientists out there who have very little knowledge of or respect for the Western tradition who now want to explain it all for us. Who don't seem to appreciate that you cannot explain "it all" without a conception of what "it all" is. Who don't seem to realize that their simplistic explanations have been around for a long, long time, and have been roundly and soundly rejected by those to whom the phenomenon in question is most familiar.

One of the things I hope to see in future is a generation of humanities and social science people who embrace science but are not in thrall to it.

Sadly, I haven't seen much that looks much like that, though.

Sunday, December 04, 2011

Our Runaway Economy




I was intrigued by the dueling opinion pieces by Paul Krugman and David Brooks in this past Friday's NYT (02.12.2011). Down the right edge, Paul Krugman makes his Keynesian, technocratic case for government intervention; in the fat bottom piece, David Brooks extols the Germans for defending "values," "effort," "self-control," "merit," and "enterprise" in resisting that same intervention.

While Krugman's knowing, arrogant tone has long since worn thin on me, he's still a) an actual economist who b) has so far been consistently right on how this crisis would play out, as opposed to, say, folks like Niall Ferguson, who have been just as consistently wrong.

Brooks is of course right when he says that there is a political cost to the "value blindness" inherent in the purely technocratic calls for crisis intervention right now. When irresponsible governments get "bailed out" it does call the legitimacy of the status quo into question (thus, the tea party movement).

But, as Brooks acknowledges over and over again, we are in a crisis, and crises like these demand decisive actions that may or may not comport with your general morality.  What crises demand is action, not actions contingent upon something else happening. If, as agreed, we have a crisis, then we need action to stave it off. It's as if I saw a car beginning to roll freely down a hill: my first action is not to seek out the driver to make him or her promise to use the handbrake in future. My first action is to see if this person was also irresponsible enough to leave the door unlocked.

Is it fair that I should have to do this? No. It's what you do to prevent a calamity.

And that gets to one of the funny little things about living in a complex society--questions like "Is this fair?" or, more generally, "Does this comport with how I conduct myself or my family life?"--are often the wrong sorts of questions to be asking. Why? Because the point of the system isn't to be fair. The system wasn't made to retell the story of Pilgrim's Progress or Horatio Alger. And this has always been the case with how the managers of that system have made their decisions.

Capitalism over the last 100 years or so has gone through some interesting developments--its interaction with the more inclusive democratic political system has become both more fruitful (witness the postwar economic boom in the West) and more fraught (witness the post-1980 shift in reward structure, and the Occupy Wall Street movement). For people like Brooks, the story of our economic system--the deserving are rewarded and the undeserving punished--is more important than tending to the technical function of the system, because of that now very strong interaction between democracy and capitalism.

But the fact is, that story is a lie and always has been a lie. Are there strategies that you can find out about that are likely to lead to success in our system? Yes, certainly, particularly when it is working well. Do those strategies necessarily have something to do with deservingness by some other yardstick (moral, utilitarian)? No.

A crisis is not a time to try to shore up tired old lies: it is a time when we ought to be a bit more honest with ourselves. If "virtue rewarded and vice punished" is what you are looking for from your economic system, capitalism is not your baby--we can condition the competition and power plays of capitalism so that we reach this outcome more often, but we have to do it. It won't do it itself. It isn't designed to do so.

And there are absolutely no economic rewards for virtue in and of itself. If your virtue turns out to be economically non-viable, you don't get an economic reward. The moral/political realm and the economic realm are separate. They interact constantly, but we should really stop encouraging people--as Brooks is urging--to think they are the same thing.

This is really part of a collective growing-up we've got to do, akin to discovering that your parents were not the paragons of the virtues they so strongly urged on you. A crisis is a time for a bit more truth. Let's acknowledge that preventing the calamity and the virtue of the people rescued from it are two separate issues to be dealt with on their own proper occasions. So let's see if that driver's side door is open, shall we?


Thursday, November 10, 2011

First, Let's Get Rid of the Economic Lawyers


I was amused by an exchange between Ron Paul and Ben Bernanke that happened this past summer but I only came across recently:

Paul: Do you think gold is money?
Bernanke: (pregnant pause) No.
Paul: It's not money? It's a precious metal. Even if it has been money for 6,000 years, somebody reversed that and eliminated that economic law?
Bernanke: Well, it's an asset. Would you say Treasury bills are money? I don't think they're money either, but they're a financial asset.
Paul: Why do central banks hold it?
Bernanke: Well, it's a form of reserves.
Paul: Why don't they hold diamonds?
Bernanke: Well, it's tradition -- long-term tradition.
Paul: Well, some people still think it's money.
Bernanke: In the U.S. anyway, those people are wrong.
From Daniel Indiviglio's blog at The Atlantic
The economic law that gold is money? I must have missed that. Is it in the Bible or something?

Amusingly, Ron Paul's fanboys think that this exchange somehow displays Paul's fantastic grasp of basic monetary issues and Bernanke's inability to deal with him. Anyone watching the video of their exchange will see Bernanke is clearly bored during Paul's long run-up to this bit of questioning.

What all of this boils down to is that Paul believes gold has transcendent value--as established by God, no doubt--and that value is to be measured only in gold. Or should we say Gold? This is merely arbitrary: an alternative fiat to fiat currency, and a bad alternative, for reasons I'll explain below.

And Paul thinks that value itself has to be transcendent. He offers no reason why this should be so.

Economic value is contingent. It always is. It is less contingent in a developed economy such as ours, but it is contingent. A smallholder brings in 100 bushels of wheat one year and lives comfortably off the proceeds; the next year he brings in 150 and barely scratches out a living. How could this be? 150 is bigger, and therefore more valuable, than 100, no? But having 150 bushels of wheat in a year when there is too much wheat is worse than having 100 when it is scarce.

The one economic law we should all know well, the law of supply & demand, the basic observation behind capitalism, says "value is contingent." And that law applies to everything, Gold included.
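The smallholder's wheat example can be sketched in a few lines of code. The demand curve and all the numbers below are invented purely for illustration; the point is only that income depends on price as well as quantity, and price falls when everyone's harvest is big:

```python
# Toy illustration of "value is contingent." The linear demand curve
# and its parameters are hypothetical, chosen only to make the point.

def market_price(harvest, intercept=20.0, slope=0.1):
    """Price per bushel under a made-up linear demand curve:
    the bigger the total harvest, the lower the price."""
    return max(intercept - slope * harvest, 0.0)

def revenue(harvest):
    """The smallholder's income, assuming his harvest rises and
    falls along with the whole market's."""
    return harvest * market_price(harvest)

print(revenue(100))  # scarce year:   100 * (20 - 10) = 1000.0
print(revenue(150))  # abundant year: 150 * (20 - 15) = 750.0
```

Fifty percent more wheat, twenty-five percent less income: the "bigger is more valuable" intuition fails as soon as the price is allowed to respond to supply.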

Which means that the wild fluctuations we've seen in the price of gold in USD over the past 40 years or so are NOT just a reflection of the value we put in the dollar, but also a reflection of the contingent value of gold as a commodity. In a flight to quality, gold goes way up and US bond rates way down because there is additional demand for those reputed safe harbor investments, this over and above any implied valuation of the currencies involved.

Now why does gold have a reputation as a "quality" investment? Because the supply is relatively stable and predictable.

So why not tie the value of your currency to gold? Because one of the big reasons you have a currency in the first place is to enable you to manipulate its supply to deal with economic contingency. If you tie the value of your currency to gold, then you've taken this possibility off the table.

And why would you want to manipulate the supply of currency? Because economic value is not measured in Gold, it is measured in human well-being, in people eating, in their having a warm & dry place to stay, in having employment and enjoyment, in having some security in the future. That's "value." And while this definition may sound a little bit nebulous, the fact is that this gets to the real gist of the matter, where merely posited abstract, supposed absolutes, like Gold, just defer the question infinitely--isn't Gold shiny enough to make you stop asking questions?

Value is rooted in perceived human good. Full stop.

Money is an exchange medium which creates a common index to measure value of very different things, because it--money--can be exchanged for many different things--a weight of gold, some barley, a night with a prostitute, a slave, a jug of wine, a pair of shoes, the services of a porter for a day. However, because all of these things are subject to the law of supply & demand, including money itself, the value, as measured in this common index, fluctuates constantly.

Money is a representation of value. Value fluctuates constantly. And the value of the representation itself is subject to change. If the representation is easily counterfeited, its value will inevitably fall as people do just that.

Hence gold--it is difficult to counterfeit and supply was pretty stable and it thus became a great medium for early exchange. But gold currency had its problems, too. Gold was very scarce and it was difficult to make coinage small enough to represent the value of many many trades. (Most people in the year 1000, say, would never have used a gold coin for any transaction for their entire lives.)

And since weighing coins and assaying for purity was an inconvenient process to undertake at each exchange, the clipping and debasing of coinage could be quite profitable enterprises for both governmental and non-governmental entities.

And the supply wasn't always stable. A major strike could wreak havoc on the value of gold as measured in (at the time) more stable commodities. For instance, after the discovery of America, or the California discoveries circa 1850, gold and silver were suddenly much easier to come by in Europe, causing inflation.

And so money was ever a problem, not just in household economics, but also in any exchange. Coinage was non-standard in every way. The solution to this problem was representational currency. The currency would be essentially valueless in itself, but it would represent and be exchangeable for gold, which represented actual value in a hard-to-counterfeit way.

But that didn't solve all the problems with Gold. Tying your currency to gold meant long-term price stability, but relatively violent fluctuations over the short term. And since we've all got to put food on the table and pay next month's rent in the short term, this wasn't so good for regular folk. For people who were stashing away large sums of money off of which their grandchildren would live idly, the sub-1% long-term inflation rate was great. But swings of as much as 60 points (+40/-20) in the annual inflation rate were hardly the stuff that made wage-earners smile and love the gold standard. And they didn't.

In short, a government that cared about wage earners and smallholders needed to be able to act to create short-term economic stability, even if that was at the expense of some erosion of long-term price stability. This isn't so bad a trade, and the proponents of tying money supply to the gold supply ought to be a bit more honest about what the gold standard brought us--lots of volatility, big increases and decreases in prices that tended to cancel each other out in the long run.
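The trade-off can be made concrete with a stylized sketch (these are made-up numbers, not historical data): one price history with wild yearly swings that cancel out in the long run, and one with steady moderate inflation that never cancels but never lurches:

```python
# Stylized comparison: long-run stability with violent short-term
# swings, versus short-term stability with long-run drift.
# All figures are illustrative, not historical.

def price_path(annual_changes, start=100.0):
    """Cumulative price level given a list of annual inflation rates."""
    level = start
    path = [level]
    for change in annual_changes:
        level *= 1 + change
        path.append(level)
    return path

# "Gold standard"-flavored: +25% one year, -20% the next, for 20 years.
volatile = [0.25, -0.20] * 10
# "Managed currency"-flavored: a steady 2% a year for 20 years.
steady = [0.02] * 20

print(round(price_path(volatile)[-1], 2))  # ~100: swings cancel long-run
print(round(price_path(steady)[-1], 2))    # ~148.59: gentle upward drift
```

The volatile path ends where it began, which looks great to the long-horizon saver, while anyone paying next month's rent lives through 25-point lurches every single year; the steady path erodes the currency over decades but never surprises a wage-earner.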

If what you long for is more short-term stability, for your dollar to be worth the same this year as last or two years ago, then Gold is not the answer. In fact, it is precisely the opposite of the answer.

Contingency can't be escaped. Gold is not God--He's said as much. The Gold standard does not bring transcendent truth to our financial transactions, it is only mental children who think so, or even hope for it. Financial transactions depend on other people. There is no guarantor for your wealth or goods against any and all contingencies.

Speaking of children, watching the video of the Bernanke/Paul "showdown" really impressed me with how childlike Ron Paul can be. Paul querying (and wearying) Bernanke is just like watching a toddler trying to find the man inside the TV.

Well folks, there is no man in the TV, there is no ultimate truth behind a dollar bill. Just our collective promise and the likely prospect that others will accept it just as you did. Paul thinks that kind of social trust is a Ponzi scheme. Actually it is life as fully cognizant adults live it.

There is no "law" that says only gold is money, just as there is no "law" that says your parents will always be there for you, or that you will always be the center of the universe. Your parents will die, you are one person among billions, and you will have to face and deal with life contingencies one way or another.

But, hopefully not with tiresome delusions like Gold.

Monday, November 07, 2011

The Education Bubble

http://newssun.suntimes.com/news/8655507-418/analysis-is-student-loan-education-bubble-next.html

The link above is to a relatively thoughtful and thought-provoking article from the AP on the current expansion in higher education & higher ed financing during this financial crisis. Much of this is only to be expected when people lose high-paying manufacturing jobs that aren't coming back and look to tool up for something comparably paid but quite different in the future. But there are, as the article points out, some reasons for a little bit of worry about this new wave of college attendees and the debt they are accruing.

Perhaps the least interesting part of the article is Peter Thiel's fellowships. The $100,000 awards handed out to young people *not* to attend college are great headline fodder, but they're pretty meaningless to the typical high school student. Did someone look at your business plan and hand you 100K? Then you might not need to go to college to succeed. In fact, since college can be rather time consuming, it might be better to concentrate your attention on a big idea that has already inspired people to lay down some big money. Otherwise, you might want to remember the single biggest indicator of class divide in most of the country: a college degree or lack thereof.

And while you are at college you might even, like Mark Zuckerberg, stumble on that big new idea.

Peter Thiel's big complaint with higher education is that it gets in the way of entrepreneurship. But the vast majority of 17-year-olds have little or no entrepreneurship to get in the way of. They are not destined to be 22-year-old Internet millionaires. In fact, even in some dreamed of future of constant innovation, most people are going to work for someone else, doing something that is not insanely great or startlingly innovative. Basing education policy on fostering future Peter Thiels is like basing it on fostering future Powerball winners. The winners are going to be few and far between and your education policy will have little to do with fostering them or with increasing their number.

What education policy *can* do is provide those entrepreneurs with employees who have some basic knowledge of how business finance works, or how to market their product, or how to do basic accounting, or how to hire and fire without getting sued, or how to read a report, understand what it is saying and ask some pertinent and challenging questions of its authors, or how to write the code that actually does the insanely great things would-be entrepreneurs dream up.

That's what education and education policy is for: to create competency and to build the groundwork for excellence. Education and education policy do not exist to foster the very, very small percentage of young people who actually have salable ideas. We rely on those folks to fend for themselves, as they have already demonstrated they are able to do by coming up with these ideas in the first place.

Which brings me to a second point about the education bubble: there is one, but not for the reason Thiel thinks. Thiel's line of thinking tends toward the notion that higher education is irrelevant in some "new world" of constant innovation and entrepreneurial thinking. Actually, it isn't a world, it is simply a small class of young people who have opportunities that outweigh the advantages of higher education. But this kind of millenarian thinking (everything has changed! We need to change our approach to everything to comport with the new reality!) goes on everywhere and seems particularly resonant these days. Actually it was a big contributor to several of our recent asset bubbles. (Who can forget the 1990s?)

But the real reason there may be an education bubble is that, like entrepreneurship, higher education is relevant, but only if you are in a position to take advantage of it. The expansion in college attendance and higher ed financing largely represents a new cohort entering the groves of academia. Their parents didn't complete a college degree and, crucially, they are themselves poorly prepared--academically, socially & emotionally--to succeed at college.

As I was suggesting earlier, if Thiel were to succeed in getting society to foster a significantly larger cohort of young entrepreneurs to develop their ideas, we would quickly start reaching a population of people who didn't have the ideas, the maturity or the personal traits to succeed as entrepreneurs. Traditional higher ed is already at that point: the expansion we're seeing now is reaching a lot of folks who can't succeed in reaping the true benefits of higher education. Largely because our primary and secondary educational systems have served them so badly, but also because their parents have not prepared them to succeed on their own. At anything.

The financial risks here are not really harrowing, and, who knows, there may be enough good bets in this wave of new college attendees to outweigh those who are wasting their (and our) time & money. But, from what I've seen, I'm dubious. And I say this as someone who, like this new crop, had parents who did not graduate from high school and who had to borrow heavily to attend college.

Wednesday, October 26, 2011

Vague-iography: Steve Jobs

I was halfway through Walter Isaacson’s new biography of Steve Jobs when I suddenly went searching through my bookshelf for the book he wrote about Benjamin Franklin. I had read the latter biography when it came out in 2003, and I remembered it fondly. I was trying to figure out why “Steve Jobs,” despite being full of new information about the most compelling businessman of the modern era, was leaving me cold.
Joe Nocera, perhaps, is too kindly a man to suggest the reason for his disappointment. It's not the author (still the same dispenser of "dutiful, lumbering American news-mag journalese," in Sam Leith's phrase, that he ever was). It's not the difficulties of contemporary biography itself. It's the subject.

Steve Jobs is quite simply not a man who merits an immediate biography. Not a man whose accomplishments are "enormous" and whose significance must be digested immediately. Not a genius.

What he is is a very successful creator of consumer products, not one of which would have been impossible without him. And he is the object of a personality cult. A personality cult that, quite frankly, does not speak very well of us. Not because Jobs was particularly undeserving of a personality cult--I think we can confidently say that even taking his many faults into account, he looks quite good next to people like Hitler, Mussolini, Mao, Stalin and Castro. No, the reason the Jobs cult makes us look bad is that the cults for people like Mussolini and Stalin were motivated by Utopian dreams--dreams about peace, social unity, the betterment of society at large--which were perverted by and through these cults. (Or, if you will, the Utopian dreams were carried to their natural dystopian conclusions by these opportunists.) The cult for Jobs is an extension of our obsession with toys.


Now it has so long and so often been repeated that Jobs was a "genius"--that he fomented (at least) three different technological revolutions; that he was an "unparalleled innovator"--that by now this sort of statement is taken for granted, and it seems perversely contrarian to deny it. But I would point out that proponents of Jobs's genius seldom go into any detail as to what his actual contribution to the technological revolutions was. They are treated more or less as miracles that took place in his presence and, presumably, due to his presence.

I am going to try to be more specific: I am going to take a closer look at the miracles that have led to Jobs's canonization--the personal computing revolution, the GUI-based operating system, iTunes and the iPhone. We'll leave the tablet computer to the side for the moment, since, as far as I can see, it is a technology still in its infancy and its ultimate impact is still pretty unclear.

Before venturing in, some background: I've been using computers since the third grade--1977. I used and programmed Apple IIs and compatibles from shortly after their introduction. I've used and owned both Apple and Microsoft-based computers since 1990. My early Apple II experience no doubt makes me more of a Woz guy than a Jobs guy.

The Apple II

The Apple II has become emblematic of the personal computing revolution. Deservedly so. This was the first computer that was truly practical for the non-hobbyist to own and use, or for schools, where the Apple II became the computer many children first encountered face-to-face, so to speak.

But the Apple II was not the work of Steve Jobs. It was the work of his partner Steve Wozniak. As has been documented many times, Steve Jobs was not a particularly good electronics or programming man. Wozniak was. And his hacks, innovations and shortcuts are what made the Apple II such a unique machine in the late 1970s. And it was his respect for other people's ability to further hack and adapt the machine that endeared him to many in the early days of home computing. The Apple II had eight expansion slots for cards to be added to expand its capabilities in various ways; the architecture of the machine was open, and software and hardware were freely developed for it. This made the Apple II the beloved of the hacker and open-source communities, which can trace their lineage straight back to Steve Wozniak.

Jobs's contributions to the Apple II itself were small and in some part unwelcome (e.g., Jobs was hostile to cooling fans, so the Apple II had persistent problems with overheating). Jobs's contributions to Apple the business, on the other hand, were considerable--Wozniak and the engineers had very little interest in the business side of personal computers. To a large extent the marketing success of the machine is to Jobs's credit. Jobs, for instance, found the first big financial backer for Apple, Mike Markkula. Successfully marketing a revolutionary new product that someone else made is hard work, but not the stuff of miracles.

And Wozniak's innovations would have had considerable impact even if the Apple II had been a failure as a product. In other words, *someone* would have used them to try to push computers into the home and school market.

The Macintosh

In 1979, Steve Jobs and Apple engineer Jef Raskin visited Xerox PARC, Xerox's technology research center. There they encountered the graphical user interface (GUI)--a way of interacting with the computer that did not involve typing code on a command line. Potentially, such an interface could be made into an entirely intuitive experience that would open up computing to a whole new audience.

Jobs was wowed. This, he thought, was the future of computing. And thus was born the Macintosh.

Or so the story goes. But the fact is that Jobs was not the first person to think the GUI was the way to go. Many people thought that, including Raskin, who wrote his dissertation on the topic and had been clamoring at Apple meetings for a GUI-based "everyman's computer" months before the PARC visit. Steve Jobs did not invent the GUI, nor was he the first to advocate for it even within Apple.

Everyone who thought computers had a future as a consumer product (and with the success of the Apple II, that was most intelligent observers) knew that the GUI was the key to that future. Where Jobs differed from many others (but not Raskin) was in his determination to make that future happen soon.

The resulting products, though--the $10,000 upmarket Lisa and the $2,500 mid-market Mac--were failures. The Lisa primarily because businesses wouldn't make the jump to a wholly new kind of computer with very few programs written for it. The Mac because the only thing it did well was demonstrate the concept of the GUI. Its measly 128K of memory made it little more than an intriguing toy. And it had not provided for expansion--a motherboard replacement was the only way to expand. The next version of the Mac had 512K, quadrupling the memory and making it all the more obvious that the initial release was an unconscionable compromise. And again, Jobs's hostility to fans had its effect: once again there were persistent problems with overheating.

Far from being a revolution, the Macintosh was a small player in the PC market, and Apple continued to derive most of its sales and revenue from the Apple II series. One wonders how the computer would have turned out if Raskin's initial concept had carried the day, rather than Jobs's Lisa/Macintosh composite.

The Mac was a big step forward for the GUI. But it was not a revolution. If it were, I'd be typing this on a Mac, and the Mac would account for more than a few percent of the PCs in use worldwide.

The iPod & iTunes

Just for reference: MP3s were already being traded online before the iPod. MP3 players existed before the iPod--some fairly good ones (the Diamond Rio, for instance). The basic idea goes back to 1996, with the introduction of Audio Highway's Listen Up, which was never produced en masse but had all the basic design and functionality elements we associate with MP3 players. When introduced, the iPod raised the bar in terms of design, mainly through a partnership with Toshiba, which was pioneering ultra-small hard drives. Competitors quickly caught up. This hardly seems the stuff of secular sainthood.

iTunes has been touted as "the turnkey solution" in the field. But the solution for whom, we might ask? It certainly has not been a turnkey solution for the music industry, which makes too little money off iTunes to staunch the bleeding from illegal downloading. It certainly isn't the solution for knowledgeable users, for whom iTunes and Apple's obsessive attempts to control every aspect of the consumer experience (and to get a cut of every expenditure) are an encumbrance, not a liberation. It IS a turnkey solution for Apple, which has made a great deal of money from it. Here again, from a business perspective, this is perhaps to be admired--as Bill Gates is to be admired for the empire he built with crappy software like Windows. But you don't get sainthood in my book for making a big pile of money.

Design

Aside from making loads of money, the other thing anyone must acknowledge is Jobs's prioritization of design in computers and electronic products.

The first Mac was thoughtfully designed to be "welcoming." Even the Apple II looked distinct from, say, an IBM Selectric typewriter. But at the risk of sounding like a philistine: so what? Even if we acknowledge that they were somehow more welcoming with their softer corners (of which I'm doubtful), these machines are still ugly. And that welcoming face is completely subjective. A welcoming Apple II did nothing to help someone who didn't know BASIC or basic DOS commands. A welcoming Mac did nothing for a person who couldn't figure out a damned thing to do with it. And "welcoming" wasn't much comfort when the damned thing overheated.

The iMac was Jobs's first foray into truly extravagant design, and it was a pretty big success for Apple--note the "for Apple," meaning for a company that had not had a successful product in years. But the Mac still represents less than 10 percent of PC market share even today, after years of success. HP's share is more than 18 percent.

I had a couple of these machines around the office until recently, and let me tell you, the jellybean-inspired designs do not age well. They are hideous, bulky and awkward. As is the lamp iMac of a few years later, though the later flatscreen iMacs at least do not offend.

The basic concept Jobs is pushing is the computer as "appliance." But the problem is that the computer is not essentially an appliance--the level of its functionality is microscopic (electrons getting shuttled here and there), so its human-scale form can never match its function, and every gesture toward form-meets-function becomes recognizable as annoyingly dated whimsy after a few weeks.

The all-in-one flatscreen iMacs finally overcome this problem, though, by eliminating the box rather than pointlessly trying to aestheticize it. This minimalism, in the hands of Jobs and the eyes of his beholders, becomes an aesthetic in itself. But the resulting computers are pretty consistently mediocre in terms of performance. Good enough for casual users. Not so good for anything requiring heavy lifting.

The flair for design may be better applied to electronic devices such as phones and MP3 players. But if these designs are so sublime, one wonders, why do they have to change every 18 months or so? Are these vaunted designs any more important than the curvaceous fenders on a 1977 Matador? And is a "passion" for one any less silly than a passion for the other?

And it is here that we find the real heart of Jobs's insight: how to play into our fetishism for shiny new objects--sleek, minimalistic, but with significant changes in motif . . . a change from, say, a look suggesting obsidian to one more suggestive of brushed metal and a futuristic, industrial aesthetic, then perhaps back to the computer-meets-comestible look. The constant changes fuel sales from novelty-addicted clients, for whom these toys provide some semblance of greater meaning--some promise of a future where technology will somehow intervene to solve all our personal problems. And these toys have to look magical, even if only temporarily, in order to support the heavy, if unacknowledged, symbolism.

Reading too much in? How else to explain the irrational vehemence of the Apple acolyte? The I-shrines dedicated to a man--an admitted asshole--who, truly, did very little to become a saint and very little if anything of a truly revolutionary nature? Why else does anyone care so much? Why else do writers grasp after the grandiose but insistently non-specific when they lionize him?

Our Utopia has changed from being a place, to being a kind of social order, to, finally, being a magical item. Perhaps this is a utopianism without the dangers of Hitler or Stalin, but it is also without the promise of the Enlightenment or, closer to home, the civil rights movement. Sad to see, really.

Sunday, October 03, 2010

The New Romans

Thomas Friedman has an editorial in today's New York Times about the decline of the two-party system and the possible rise of a new third party in the next round of presidential elections. A third party that would say
“These two parties are lying to you. They can’t tell you the truth because they are each trapped in decades of special interests. I am not going to tell you what you want to hear. I am going to tell you what you need to hear if we want to be the world’s leaders, not the new Romans.”
This refers back to a passage he quotes from Lewis Mumford:
“Everyone aimed at security: no one accepted responsibility. What was plainly lacking, long before the barbarian invasions had done their work, long before economic dislocations became serious, was an inner go. Rome’s life was now an imitation of life: a mere holding on. Security was the watchword — as if life knew any other stability than through constant change, or any form of security except through a constant willingness to take risks.”
There are a number of big problems with Friedman's analysis of our situation. For one thing, it comes from Thomas Friedman, who is perhaps second only to Bill Kristol in being consistently wrong about things (think Enron & Global Crossing, web stocks, the Iraq War . . .).

Second, this analysis comes out of more time spent by Friedman hanging out in Silicon Valley. One fundamental mistake Friedman makes is consistently misreading the nation's problems with Washington as being the same as those of Silicon Valley executives. Actually, the two sets of problems are quite different. Whatever they might say, the business people just want government to work: they want it to keep the people quiescent, to see to the education of kids and the protection of property rights, and to keep the business environment relatively stable.

They don't want anything revolutionary or anything that will open the doors to sweeping changes. What business wants is a return to the bipartisan consensus building that we've been running on for more than 60 years.

Now, IF a third party were to emerge out of the entrepreneurial/established high-tech business interests that have been picking up Friedman's bar tab lately, it would only accuse the two established parties of lying as part of an elaborate sham to establish a new party that would then proceed to go straight back to business as usual.

Trouble is, that's precisely what Obama tried to do: sell himself as a change agent and then just run things more or less as they've always been run. What went wrong? Well, the economy, stupid, for one thing. People are discontented and scared and looking for reasons why. Another is the brinkmanship the Republican party has now embraced--the willingness--eagerness--to make the entire country fail if that failure can be blamed on the other side. A third is the now far more active fecklessness and stupidity of the American electorate--who elect a man on a platform of change, quail at any large changes that might be proposed, and then blame that man for not effecting change. This is a pathological and deeply irresponsible electorate. The ungovernable-ness that grips California--the inability to support either stasis or change, let alone negotiate between the two--now grips the country as a whole.

We ARE the new Romans, only we aren't the Romans of AD 200 or so, losing our "inner go" to fend off those barbarians and keep rolling back the frontier. We are more like the Romans of a few hundred years earlier, who lost their "inner go" to make tough decisions for themselves, or even to put on the semblance of doing so. We are the Romans on the road to civil war.

This is not a problem that will be solved by a presidential candidate from a new party. The crisis we averted a year or so ago was, indeed, caused by an elite of reckless high-stakes gamblers. But in the end, that elite showed it could, in the last instance, pull its fat from the fire. That sort of problem might perhaps be better addressed by a new party.

But our crisis now is different. Our problem is that the populace has gotten extremely comfortable with a life of economic and political parasitism and they really have no comprehension whatsoever of the system they've been living off of. They only comprehend that they always want to collect winnings at the tables, to enjoy an always growing economy and to celebrate military victories won by someone else's children . . . and they want to feel morally righteous doing so.

This is not a crisis that calls for truth, this is a crisis that calls for better lies. And I don't think Thomas Friedman or Silicon Valley has got them.