My title here is a phrase strongly associated with a controversy stirred up by CP Snow back in the Space Age, when many in the West feared we were falling behind the Soviets in science and technology. For the English and the American upper class, part of this worry was related to the bias of their higher education systems toward non-technical subjects. What Snow observed was that as our culture matured, the most highly educated people in the humanities tended to know scandalously little about science; and likewise, those with the greatest technical knowledge tended to be rather uncultured.
Snow was well aware that there were always and would always be exceptions to this general tendency. In fact, Snow himself was one of those exceptions, being both a respected novelist and a student of physics (he earned a doctorate in the subject). So the many takedowns of the Two Cultures idea you may find online that, say, point to JBS Haldane--look, a mathematically minded scientist who could write! who knew Greek! who wrote history!--as proof there aren't two cultures, are completely missing the point. Haldane and Snow were exceptional. Snow didn't argue that it was impossible to inhabit both cultures--being who he was, how could he?--only that it was not the norm.
In speaking of Snow, we should also acknowledge how much his piece is of its time and place: Part of Snow's thesis, an important part, was his particular attack on the English school system as a source for the split between technical and cultural knowledge. (He actually holds up the American university system as a positive counterexample.)
However, the basic conflict he is observing goes much farther back, to the very beginnings of the modern world and the conflict between the ancients and the moderns in the 17th century. That conflict was essentially a conflict between ancient wisdom and modern, mostly scientific knowledge. Swift's Battle of the Books was a satiric look at this conflict which came down, as we might expect from a literary man who hated math, pretty heavily on the side of ancient wisdom. Over the long haul, though, science has mostly won this battle--shaping and changing our world to an extent that even Swift could hardly have imagined.
And, worse still from Swift's point of view, science is increasingly the arbiter of truth in our society.
But not the only one. Anyone taking a look at, say, the controversies over global warming can see that science often has an uphill battle against "common sense" when its truths are inconvenient. Science is, no doubt, still the servant of our desires; though a servant upon whom we are as dependent as Wooster is upon Jeeves.
And yet our universities are--still!--filled with people who smilingly admit to incompetence in basic mathematics; who don't know anything about science aside from the fact that its advances sometimes harm the environment; people who criticize science but who cannot distinguish the real thing from ridiculous parody. If Bertie Wooster were a doctrinaire ingrate as well as an ignorant fool, he'd be the model for many of our present-day humanities professors. But such a character could never win even the provisional sympathies of any reader.
Such a response to science does little honor to the tradition that Swift defended. As was the case in Snow's day, the worst offenders in the two-cultures business are on the side of the humanities. One is far more likely to find an articulate and cultured scientist than a scientifically knowledgeable humanities type. There are more Goulds and Lewontins and Orrs out there than there are George Levines.
Which is a shame. Because there are also scientists out there who have very little knowledge of or respect for the Western tradition who now want to explain it all for us. Who don't seem to appreciate that you cannot explain "it all" without a conception of what "it all" is. Who don't seem to realize that their simplistic explanations have been around for a long, long time, and have been roundly and soundly rejected by those to whom the phenomenon in question is most familiar.
One of the things I hope to see in the future is a generation of humanities and social science people who embrace science but are not in thrall to it.
Sadly, I haven't seen much that looks like that, though.
Sunday, December 04, 2011
Our Runaway Economy
I was intrigued by the dueling opinion pieces by Paul Krugman and David Brooks in this past Friday's NYT (02.12.2011). Down the right edge, Paul Krugman makes his Keynesian, technocratic case for government intervention; in the fat bottom piece, David Brooks extols the Germans for defending "values," "effort," "self-control," "merit," and "enterprise" in resisting that same intervention.
While Krugman's knowing, arrogant tone has long since worn thin on me, he's still a) an actual economist and b) someone who has so far been consistently right about how this crisis would play out, as opposed to, say, folks like Niall Ferguson, who have been just as consistently wrong.
Brooks is of course right when he says that there is a political cost to the "value blindness" inherent in the purely technocratic calls for crisis intervention right now. When irresponsible governments get "bailed out" it does call the legitimacy of the status quo into question (thus, the tea party movement).
But, as Brooks acknowledges over and over again, we are in a crisis, and crises like these demand decisive actions that may or may not comport with your general morality. What crises demand is action, not action contingent upon something else happening first. If, as agreed, we have a crisis, then we need action to stave it off. It's as if I saw a car beginning to roll freely down a hill: my first move is not to seek out the driver and make him or her promise to use the handbrake in future. My first move is to see whether the driver was also irresponsible enough to leave the door unlocked, so that I can jump in and stop the car myself.
Is it fair that I should have to do this? No. It's what you do to prevent a calamity.
And that gets to one of the funny little things about living in a complex society: questions like "Is this fair?" or, more generally, "Does this comport with how I conduct myself or my family life?" are often the wrong sorts of questions to be asking. Why? Because the point of the system isn't to be fair. The system wasn't made to retell the story of Pilgrim's Progress or Horatio Alger. And this has always been the case with how the managers of that system have made their decisions.
Capitalism over the last 100 years or so has gone through some interesting developments--its interaction with the more inclusive democratic political system has become both more fruitful (witness the postwar economic boom in the West) and more fraught (witness the post-1980 shift in reward structure, the Occupy Wall Street movement). For people like Brooks, the story of our economic system--the deserving are rewarded and the undeserving punished--is more important than tending to the technical function of the system, precisely because of that now very strong interaction between democracy and capitalism.
But the fact is, that story is a lie and always has been a lie. Are there learnable strategies that are likely to lead to success in our system? Yes, certainly, particularly when it is working well. Do those strategies necessarily have anything to do with deservingness by some other yardstick (moral, utilitarian)? No.
A crisis is not a time to try to shore up tired old lies: it is a time when we ought to be a bit more honest with ourselves. If "virtue rewarded and vice punished" is what you are looking for from your economic system, capitalism is not your baby--we can condition the competition and power plays of capitalism so that we reach this outcome more often, but we have to do it. It won't do it itself. It isn't designed to do so.
And there are absolutely no economic rewards for virtue in and of itself. If your virtue turns out to be economically non-viable, you don't get an economic reward. The moral/political realm and the economic realm are separate. They interact constantly, but we should really stop encouraging people--as Brooks is urging--to think they are the same thing.
This is really part of a collective growing-up we've got to do, akin to discovering that your parents were not the paragons of the virtues they so strongly urged on you. A crisis is a time for a bit more truth. Let's acknowledge that preventing the calamity and the virtue of the people rescued from it are two separate issues to be dealt with on their own proper occasions. So let's see if that driver's side door is open, shall we?
Thursday, November 10, 2011
First, Let's Get Rid of the Economic Lawyers
I was amused by an exchange between Ron Paul and Ben Bernanke that happened this past summer but that I only came across recently:
Paul: Do you think gold is money?

Bernanke: (pregnant pause) No.

Paul: It's not money?

Bernanke: It's a precious metal.

Paul: Even if it has been money for 6,000 years, somebody reversed that and eliminated that economic law?

Bernanke: Well, it's an asset. Would you say Treasury bills are money? I don't think they're money either, but they're a financial asset.

Paul: Why do central banks hold it?

Bernanke: Well, it's a form of reserves.

Paul: Why don't they hold diamonds?

Bernanke: Well, it's tradition -- long-term tradition.

Paul: Well, some people still think it's money.

"In the U.S. anyway, those people are wrong." (From Daniel Indiviglio's blog at The Atlantic.)

The economic law that gold is money? I must have missed that. Is it in the Bible or something?
Amusingly, Ron Paul's fanboys think that this exchange somehow displays Paul's fantastic grasp of basic monetary issues and Bernanke's inability to deal with him. Anyone watching the video of their exchange will see Bernanke is clearly bored during Paul's long run-up to this bit of questioning.
What all of this boils down to is that Paul believes gold has transcendent value--as established by God, no doubt--and that value is to be measured only in gold. Or should we say Gold? But this is merely arbitrary: an alternative fiat to fiat currency, and a bad alternative for reasons I'll explain below.
And Paul thinks that value itself has to be transcendent. He offers no reason why this should be so.
Economic value is contingent. It always is. It is less contingent in a developed economy such as ours, but it is contingent. A smallholder brings in 100 bushels of wheat one year and lives comfortably off the proceeds; the next year he brings in 150 and barely scratches out a living. How could this be? 150 is bigger, and therefore more valuable, than 100, no? But having 150 bushels of wheat in a year when there is too much wheat is worse than having 100 when it is scarce.
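To make the arithmetic concrete, here is a minimal sketch of the smallholder's predicament. The numbers are invented for illustration, and it assumes, hypothetically, a constant-elasticity demand curve and that everyone's harvest rises together; it is not a model of any actual wheat market.

```python
# A toy illustration, not a real market model: staple crops face
# inelastic demand, so when the whole harvest rises, the price
# falls proportionally faster than the quantity rose.

def market_price(quantity, base_qty=100, base_price=10.0, elasticity=0.4):
    """Price implied by a constant-elasticity demand curve, q = k * p**(-e)."""
    return base_price * (base_qty / quantity) ** (1 / elasticity)

for harvest in (100, 150):
    price = market_price(harvest)
    print(f"{harvest} bushels -> price {price:.2f}, revenue {harvest * price:.2f}")

# 100 bushels -> price 10.00, revenue 1000.00
# 150 bushels -> price 3.63, revenue 544.33
```

The abundant year yields roughly half the income of the lean one: the value is in the relation between supply and demand, not in the pile of wheat itself.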
The one economic law we should all know well, the law of supply & demand, the basic observation behind capitalism, says "value is contingent." And that law applies to everything, Gold included.
Which means that the wild fluctuations we've seen in the price of gold in USD over the past 40 years or so are NOT just a reflection of the value we put in the dollar, but also a reflection of the contingent value of gold as a commodity. In a flight to quality, gold goes way up and US bond rates way down because there is additional demand for those reputed safe harbor investments, this over and above any implied valuation of the currencies involved.
Now why does gold have a reputation as a "quality" investment? Because the supply is relatively stable and predictable.
So why not tie the value of your currency to gold? Because one of the big reasons you have a currency in the first place is to enable you to manipulate its supply to deal with economic contingency. If you tie the value of your currency to gold, then you've taken this possibility off the table.
And why would you want to manipulate the supply of currency? Because economic value is not measured in Gold; it is measured in human well-being, in people eating, in their having a warm & dry place to stay, in having employment and enjoyment, in having some security in the future. That's "value." And while this definition may sound a little bit nebulous, the fact is that it gets to the real gist of the matter, whereas merely posited abstract, supposed absolutes, like Gold, just defer the question infinitely--isn't Gold shiny enough to make you stop asking questions?
Value is rooted in perceived human good. Full stop.
Money is an exchange medium which creates a common index to measure the value of very different things, because it--money--can be exchanged for many different things: a weight of gold, some barley, a night with a prostitute, a slave, a jug of wine, a pair of shoes, the services of a porter for a day. However, because all of these things are subject to the law of supply & demand, money itself included, the value, as measured in this common index, fluctuates constantly.
Money is a representation of value. Value fluctuates constantly. And the value of the representation itself is subject to change. If the representation is easily counterfeited, its value will inevitably fall as people do just that.
Hence gold: it is difficult to counterfeit, its supply was pretty stable, and it thus became a great medium for early exchange. But gold currency had its problems, too. Gold was very scarce, and it was difficult to make coinage small enough to represent the value of many, many trades. (Most people in the year 1000, say, would never have used a gold coin for any transaction in their entire lives.)
And since weighing coins and testing for purity was an inconvenient process to undertake at each exchange, the clipping and debasing of coinage could be quite profitable enterprises for both governmental and non-governmental entities.
And the supply wasn't always stable. A major strike could wreak havoc on the value of gold as measured in (at the time) more stable commodities. For instance, after the discovery of America, or the California discoveries circa 1850, gold and silver were suddenly much easier to come by in Europe, causing inflation.
And so money was ever a problem, not just in household economics but in any exchange. Coinage was non-standard in every way. The solution to this problem was representational currency: the currency would be essentially valueless in itself, but it would represent, and be exchangeable for, gold, which represented actual value in a hard-to-counterfeit way.
But that didn't solve all the problems with Gold. Tying your currency to gold meant long-term price stability but relatively violent fluctuations over the short term. And since we've all got to put food on the table and pay next month's rent in the short term, this wasn't so good for regular folk. For people who were stashing away large sums of money off of which their grandchildren would live idly, the sub-1% long-term inflation rate was great. But annual inflation rates swinging across a 60-point range (from +40% to -20%) were hardly the stuff that made wage-earners smile and love the gold standard. And they didn't.
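A minimal simulation makes the wage-earner's complaint vivid. The inflation draws below are made up, not historical data; they are only meant to mimic the pattern just described--a long-run average near zero with violent year-to-year swings.

```python
# A toy simulation, assuming (hypothetically) that annual inflation
# under a gold standard is a random draw centered near zero with a
# large spread. Not calibrated to any historical series.
import random

random.seed(42)
rates = [random.gauss(0.005, 0.12) for _ in range(100)]  # mean 0.5%, sd 12%

price_level = 1.0
for r in rates:
    price_level *= 1 + r

long_run = price_level ** (1 / len(rates)) - 1
print(f"worst year: {min(rates):+.1%}, best year: {max(rates):+.1%}")
print(f"compound long-run rate: {long_run:+.2%} per year")

# Typical output: individual years can swing past +/-25% while the
# compound long-run rate stays within a percent or so of zero.
```

The rentier sees only the last line; the wage-earner lives through the first.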
In short, a government that cared about wage earners and smallholders needed to be able to act to create short-term economic stability, even if that came at the expense of some erosion of long-term price stability. This isn't so bad a trade, and the proponents of tying the money supply to the gold supply ought to be a bit more honest about what the gold standard brought us--lots of volatility, big increases and decreases in prices that tended to cancel each other out in the long run.
If what you long for is more short-term stability, for your dollar to be worth the same this year as last or two years ago, then Gold is not the answer. In fact, it is precisely the opposite of the answer.
Contingency can't be escaped. Gold is not God--He's said as much. The Gold standard does not bring transcendent truth to our financial transactions; only mental children think it does, or even hope for it. Financial transactions depend on other people. There is no guarantor for your wealth or goods against any and all contingencies.
Speaking of children, watching the video of the Bernanke/Paul "showdown" really impressed me with how childlike Ron Paul can be. Paul querying (and wearying) Bernanke is just like watching a toddler trying to find the man inside the TV.
Well folks, there is no man in the TV, there is no ultimate truth behind a dollar bill. Just our collective promise and the likely prospect that others will accept it just as you did. Paul thinks that kind of social trust is a Ponzi scheme. Actually it is life as fully cognizant adults live it.
There is no "law" that says only gold is money, just as there is no "law" that says your parents will always be there for you, or that you will always be the center of the universe. Your parents will die, you are one person among billions, and you will have to face and deal with life contingencies one way or another.
But, hopefully not with tiresome delusions like Gold.
Monday, November 07, 2011
The Education Bubble
http://newssun.suntimes.com/news/8655507-418/analysis-is-student-loan-education-bubble-next.html
The link above is to a relatively thoughtful and thought-provoking article from the AP on the current expansion in higher education & higher ed financing during this financial crisis. Much of this is only to be expected when people lose high-paying manufacturing jobs that aren't coming back and look to tool up for something comparably paid but quite different in the future. But there are, as the article points out, some signs that warrant a little bit of worry about this new wave of college attendees and the debt they are accruing.
Perhaps the least interesting part of the article is Peter Thiel's fellowships. The $100,000 awards handed out to young people *not* to attend college are great headline fodder, but they're pretty meaningless to the typical high school student. Did someone look at your business plan and hand you 100K? Then you might not need to go to college to succeed. In fact, since college can be rather time consuming, it might be better to concentrate your attention on a big idea that has already inspired people to lay down some big money. Otherwise, you might want to remember the single biggest indicator of class divide in most of the country: a college degree or lack thereof.
And while you are at college you might even, like Mark Zuckerberg, stumble on that big new idea.
Peter Thiel's big complaint with higher education is that it gets in the way of entrepreneurship. But the vast majority of 17-year-olds have little or no entrepreneurship to get in the way of. They are not destined to be 22-year-old Internet millionaires. In fact, even in some dreamed-of future of constant innovation, most people are going to work for someone else, doing something that is not insanely great or startlingly innovative. Basing education policy on fostering future Peter Thiels is like basing it on fostering future Powerball winners. The winners are going to be few and far between, and your education policy will have little to do with fostering them or with increasing their number.
What education policy *can* do is provide those entrepreneurs with employees who have some basic knowledge of how business finance works, or how to market their product, or how to do basic accounting, or how to hire and fire without getting sued, or how to read a report, understand what it is saying, and ask some pertinent and challenging questions of its authors, or how to write the code that actually does the insanely great things would-be entrepreneurs dream up.
That's what education and education policy is for: to create competency and to build the groundwork for excellence. Education and education policy do not exist to foster the very, very small percentage of young people who actually have salable ideas. We rely on those folks to fend for themselves, as they have already demonstrated they are able to do by coming up with these ideas in the first place.
Which brings me to a second point about the education bubble: there is one, but not for the reason Thiel thinks. Thiel's line of thinking tends toward the notion that higher education is irrelevant in some "new world" of constant innovation and entrepreneurial thinking. Actually, it isn't a world; it is simply a small class of young people who have opportunities that outweigh the advantages of higher education. But this kind of millenarian thinking (everything has changed! We need to change our approach to everything to comport with the new reality!) goes on everywhere and seems particularly resonant these days. In fact, it was a big contributor to several of our recent asset bubbles. (Who can forget the 1990s?)
But the real reason there may be an education bubble is that, like entrepreneurship, higher education is relevant only if you are in a position to take advantage of it. The expansion in college attendance and higher ed financing largely represents a cohort new to the groves of academia. Their parents didn't complete a college degree and, crucially, they are themselves poorly prepared, academically, socially & emotionally, to succeed at college.
As I was suggesting earlier, if Thiel succeeded in getting society to foster a significantly larger cohort of young entrepreneurs to develop their ideas, we would quickly reach a population of people who didn't have the ideas, the maturity, or the personal traits to succeed as entrepreneurs. Traditional higher ed is already at that point: the expansion we're seeing now is reaching a lot of folks who can't succeed in reaping the true benefits of higher education--largely because our primary and secondary educational systems have served them so badly, but also because their parents have not prepared them to succeed on their own. At anything.
The financial risks here are not really harrowing, and, who knows, there may be enough good bets in this wave of new college attendees to outweigh those who are wasting their (and our) time & money. But, from what I've seen, I'm dubious. And I say this as someone who, like this new crop, had parents who did not graduate from high school and who had to borrow heavily to attend college.
Wednesday, October 26, 2011
Vague-iography: Steve Jobs
"I was halfway through Walter Isaacson's new biography of Steve Jobs when I suddenly went searching through my bookshelf for the book he wrote about Benjamin Franklin. I had read the latter biography when it came out in 2003, and I remembered it fondly. I was trying to figure out why 'Steve Jobs,' despite being full of new information about the most compelling businessman of the modern era, was leaving me cold."

Joe Nocera, perhaps, is too kindly a man to suggest the reason for his disappointment. It's not the author (still the same dispenser of "dutiful, lumbering American news-mag journalese"--Sam Leith's phrase--he ever was). It's not the difficulties of contemporary biography itself. It's the subject.
Steve Jobs is quite simply not a man who merits an immediate biography. Not a man whose accomplishments are "enormous" and whose significance must be digested immediately. Not a genius.
What he is is a very successful creator of consumer products, every one of which would sooner or later have happened without him. And he is the object of a personality cult. A personality cult that, quite frankly, does not speak very well of us. Not because Jobs was particularly undeserving of a personality cult--I think we can confidently say that, even taking his many faults into account, he looks quite good next to people like Hitler, Mussolini, Mao, Stalin and Castro. No, the reason the Jobs cult makes us look bad is that the cults for people like Mussolini and Stalin were motivated by Utopian dreams--dreams about peace, social unity, the betterment of society at large--which were perverted by and through these cults. (Or, if you will, the Utopian dreams were carried to their natural dystopian conclusions by these opportunists.) The cult for Jobs is an extension of our obsession with toys.
Now it has so long and so often been repeated that Jobs was a "genius"--that he fomented (at least) three different technological revolutions, that he was an "unparalleled innovator"--that by now these sorts of statements are taken for granted, and it seems perversely contrarian to deny them. But I would point out that proponents of Jobs's genius seldom go into any detail as to what his actual contribution to those technological revolutions was. They are treated more or less as miracles that took place in his presence and, presumably, due to his presence.
I am going to try to be more specific: I am going to take a closer look at the miracles that have led to Jobs's canonization--the personal computing revolution, the GUI-based operating system, iTunes and the iPhone. We'll leave the tablet computer to the side for the moment since, as far as I can see, it is a technology still in its infancy and its ultimate impact is still pretty unclear.
Before venturing in, some background: I've been using computers since the third grade--1977. I used and programmed Apple IIs and compatibles from shortly after their introduction. I've used and owned both Apple and Microsoft-based computers since 1990. My early Apple II experience no doubt makes me more of a Woz guy than a Jobs guy.
The Apple II
The Apple II has become emblematic of the personal computing revolution. Deservedly so. This was the first computer that was truly practical for the non-hobbyist to own and use, and for schools, where the Apple II became the computer many children first encountered face-to-face, so to speak.
But the Apple II was not the work of Steve Jobs. It was the work of his partner Steve Wozniak. As has been documented many times, Steve Jobs was not a particularly good electronics or programming man. Wozniak was. And his hacks, innovations and shortcuts are what made the Apple II such a unique machine in the late 1970s. And it was his respect for other people's ability to further hack and adapt the machine that endeared him to many in the early days of home computing. The Apple II had eight expansion slots for cards to be added to expand its capabilities in various ways; the architecture of the machine was open, and software and hardware were freely developed for it. This made the Apple II beloved of the hacker and open-source communities, which can trace their lineage straight back to Steve Wozniak.
Jobs's contributions to the Apple II itself were small and in some part unwelcome (e.g., Jobs was hostile to cooling fans, so the Apple II had persistent problems with overheating). Jobs's contributions to Apple the business, on the other hand, were considerable--Wozniak and the engineers had very little interest in the business side of personal computers. To a large extent the marketing success of the machine is to Jobs's credit. Jobs, for instance, found the first big financial backer for Apple, Mike Markkula. Successfully marketing a revolutionary new product that someone else made is hard work, but not the stuff of miracles.
And Wozniak's innovations would have had considerable impact even if the Apple II had been a failure as a product. In other words, *someone* would have used them to push computers into the home and school market.
The Macintosh
In 1979, Steve Jobs and Apple engineer Jef Raskin visited Xerox PARC, a new technology development center. There they encountered the graphical user interface (GUI)--a way of interacting with the computer that did not involve typing code on a command line. Potentially such an interface could be made into an entirely intuitive experience that would open up computing to a whole new audience.
Jobs was wowed. This, he thought, was the future of computing. And thus was born the Macintosh.
Or so the story goes. But the fact is that Jobs was not the first person to think that the GUI was the way to go. Many people thought that, including Raskin, who wrote his dissertation on the topic and had been clamoring at Apple meetings for a GUI-based "everyman's computer" months before the PARC visit. Steve Jobs did not invent the GUI, nor was he the first to advocate for it even within Apple.
Everyone who thought computers had a future as a consumer product--and after the success of the Apple II, that meant most intelligent observers--knew that the GUI was the key to that future. Where Jobs differed from many others (but not Raskin) was in his determination to make that future happen soon.
The resulting products, though--the $10,000 upmarket Lisa and the $2,500 mid-market Mac--were failures. The Lisa primarily because businesses wouldn't make the jump to a wholly new kind of computer with very few programs written for it. The Mac because the only thing it did well was demonstrate the concept of the GUI. Its measly 128K of memory made it little more than an intriguing toy. And it had not provided for expansion--a motherboard replacement was the only way to expand. The next version of the Mac had 512K, quadrupling the memory and making it all the more obvious that the initial release was an unconscionable compromise. And again, Jobs's hostility to fans had its effect: once again there were persistent problems with overheating.
Far from being a revolution, the Macintosh was a small player in the PC market, and Apple continued to derive most of its sales and revenue from the Apple II series. One wonders how the computer would have turned out if Raskin's initial concept had carried the day, rather than Jobs's Lisa/Macintosh composite.
The Mac was a big step forward for the GUI. But it was not a revolution. If it were, I'd be typing this on a Mac, and Macs would be more than a few percent of the PCs in use worldwide.
The iPod & iTunes
Just for reference: mp3s were already being traded online before the iPod. Mp3 players existed before the iPod--some fairly good ones (the Diamond Rio, for instance). The basic idea goes back to 1996, with the introduction of the Audio Highway Listen Up, which was never produced en masse but had all the basic design and functionality elements we associate with mp3 players. When introduced, the iPod raised the bar in terms of design, mainly through a partnership with Toshiba, who were pioneering ultra-small hard drives. Competitors quickly caught up. This hardly seems the stuff of secular sainthood.
iTunes has been touted as "the turnkey solution" in the field. But the solution for whom, we might ask? It certainly has not been a turnkey solution for the music industry, which makes too little money off iTunes to staunch the bleeding from illegal downloading. It certainly isn't the solution for knowledgeable users, for whom iTunes and Apple's obsessive attempts to control every aspect of the consumer experience (and to get a cut of every expenditure) are an encumbrance, not a liberation. Though it IS a turnkey solution for Apple, which has made a great deal of money from it. Here again, from a business perspective, this is perhaps to be admired--as Bill Gates is to be admired for the empire he built with crappy software like Windows. But you don't get sainthood in my book for making a big pile of money.
Design
Aside from making loads of money, the other thing anyone must acknowledge is Jobs's prioritization of design in computers and electronic products.
The first Mac was thoughtfully designed to be "welcoming." Even the Apple II looked distinct from, say, an IBM Selectric typewriter. But at the risk of sounding like a philistine: so what? Even if we acknowledge they were somehow more welcoming with their softer corners, of which I'm doubtful, these machines are still ugly. And that welcoming face is completely subjective. A welcoming Apple II did nothing to help someone who didn't know BASIC or basic DOS commands. A welcoming Mac did nothing for a person who couldn't figure out a damned thing to do with it. And "welcoming" wasn't much comfort when the damned thing overheated.
The iMac was Jobs's first foray into truly extravagant design, and it was a pretty big success for Apple--note the "for Apple," meaning for a company that had not had a successful product in years. But the Mac still represents less than 10% of PC market share even today, after years of success; HP's share is more than 18%.
I had a couple of these machines around the office until recently, and let me tell you, the jellybean-inspired designs do not age well. They are hideous, bulky and awkward. As is the lamp iMac of a few years later, though the later flatscreen iMacs at least do not offend.
The basic concept Jobs was pushing is the computer as "appliance." But the problem is that the computer is not essentially an appliance--the level of its functionality is microscopic (electrons getting shuttled here and there), so its human-scale form can never match its function, and every gesture toward form-meets-function becomes recognizable as annoyingly dated whimsy after a few weeks.
The all-in-one flatscreen iMacs finally overcame this problem, though, by eliminating the box rather than pointlessly trying to aestheticize it. This minimalism, in the hands of Jobs and the eyes of his beholders, became an aesthetic in itself. But the resulting computers are pretty consistently mediocre in terms of performance. Good enough for casual users. Not so good for anything requiring heavy lifting.
The flair for design may be better applied to electronic devices such as phones and mp3 players. But if these designs are so sublime, one wonders, why do they have to change every 18 months or so? Are these vaunted designs any more important than the curvaceous fenders on a 1977 Matador? And is a "passion" for one any less silly than a passion for the other?
And it is here that we find the real heart of Jobs's insight: how to play into our fetishism for shiny new objects--sleek, minimalistic, but with significant changes in motif . . . a change from, say, a look suggesting obsidian to a look more suggestive of brushed metal, a futuristic, industrial-aesthetic contrivance, then perhaps back to the computer-meets-comestible look. The constant changes fuel sales from novelty-addicted clients, for whom these toys provide some semblance of greater meaning--some promise of a future where technology will somehow intervene to solve all our personal problems. And these toys have to look magical, even if only temporarily, in order to support the heavy, if unacknowledged, symbolism.
Reading too much in? How else to explain the irrational vehemence of the Apple acolyte? The I-shrines dedicated to a man, an admitted asshole, who truly did very little to become a saint, and very little if anything of a truly revolutionary nature? Why else does anyone care so much? Why else do writers grasp after the grandiose but insistently non-specific when they lionize him?
Our Utopia has changed from being a place, to being a kind of social order, to, finally, being a magical item. Perhaps this is a utopianism without the dangers of Hitler or Stalin, but it is also without the promise of the Enlightenment or, closer to home, the civil rights movement. Sad to see, really.