Monday, February 28, 2011

Freakonomics and geeks

This article in the Boston Review is well worth reading, although not really packed with surprises. One part that caught my eye is this:
Posner brings the economics profession to task. The embrace of rational expectations and the efficient-markets hypothesis, and the tendency of the discipline to reward impressively sophisticated and utterly implausible models, contributed to the ideational environment in both business and government that led us off this cliff.
The reference to the "impressively sophisticated and utterly implausible" reminds me of some things that have been said about David Benatar's book Better Never to Have Been (see the discussion at The HEP Spot, for instance, as well as Christopher Cowley's paper). Here's part of the description of the book from Amazon:
David Benatar argues that coming into existence is always a serious harm. Although the good things in one's life make one's life go better than it otherwise would have gone, one could not have been deprived by their absence if one had not existed. Those who never exist cannot be deprived. However, by coming into existence one does suffer quite serious harms that could not have befallen one had one not come into existence. Drawing on the relevant psychological literature, the author shows that there are a number of well-documented features of human psychology that explain why people systematically overestimate the quality of their lives and why they are thus resistant to the suggestion that they were seriously harmed by being brought into existence. The author then argues for the 'anti-natal' view---that it is always wrong to have children---and he shows that combining the anti-natal view with common pro-choice views about foetal moral status yield a "pro-death" view about abortion (at the earlier stages of gestation). Anti-natalism also implies that it would be better if humanity became extinct. Although counter-intuitive for many, that implication is defended, not least by showing that it solves many conundrums of moral theory about population.
"Although counter-intuitive for many" is as fine an understatement as you are likely to read today.

My undergraduate degree was in Philosophy, Politics, and Economics, and involved taking courses in all three of these subjects. After a year of this we had to choose just eight more courses to take, and these did not need to be drawn from more than two of the three subjects. I dropped Economics, because it seemed to offer little more than incoherence. Most of the textbooks in the subject taught that Keynesianism was demonstrably false. The alternative, as it was presented to me, involved monetarism, which was also (supposedly) demonstrably false. I was given a newspaper article with the headline "Monetarism is Dead" explaining why this policy was impracticable (and had been effectively abandoned by the officially monetarist government), as well as a long mathematical demonstration that it was based on faulty a priori reasoning too. So no one -- neither the Keynesians nor the anti-Keynesians -- seemed really credible, and the mathematics involved was all based on highly implausible assumptions of perfect information, pure self-interest, etc. No economist was so stupid as to insist that these assumptions were really true of the world, but they generally held that this did not matter, because things tended to go that way, so the results of economic calculations should be roughly accurate. I think there is some reason to doubt that this is correct (and some economists share this skeptical view), which is why I think undergraduates should be taught economic history before they study economic theory. But even if we do put our faith in a priori economics, there remains the question: whose economics? The Keynesians' or the anti-Keynesians' (or the neo-Keynesians')? I remain skeptical about the whole business.

Anyway, the point I want to make is less about economics and more about what might be called geekiness, or the love of the cool. A certain kind of person finds the sophisticated but implausible exciting. It is pretty much the opposite of what Cora Diamond has in mind when she talks about the realistic spirit. Philosophy done in the spirit of Diamond's work can be sophisticated and exciting too, but it is only going to be implausible in a limited sense, i.e. to someone in the grip of a theoretical picture. That might sound like begging the question, but Diamond's views (and those like them) do not have the kind of profound (or common-sense) implausibility that one (seemingly--I still haven't read it) gets in Benatar's book, for instance. The idea that Wall Street does not need more regulation is commonsensically implausible too, it seems to me.

So really I have two points. The first is that, if Posner is right, then a geeky preference for the unrealistic has done real and immense harm to the world economy. The second is that maybe we could change this kind of preference, or ensure that disciplines are less dominated by people with such preferences, if we educated people differently.

I'm not sure about this, but I wonder whether courses in the humanities might serve as a useful kind of gatekeeper. Maybe that would be unfair. And maybe it wouldn't work because the humanities themselves are already influenced by geeky thinking--"there is no such thing as truth," for instance, seems both implausible and defended by sophisticated arguments, and it is an idea familiar to most professors of literature. But now I sound like a grumpy old man railing against ideas from France and the 1960s.

Another alternative might be to discourage (not ban or abandon, but draw back from) a priori reasoning and theoretical models and to encourage empirically based work instead. But that is too easily misunderstood as a rejection of the humanities. Empirical work without empiricism might be what I mean, so that ordinary experience and the kind of experience captured in literature would count as important too.
 

5 comments:

  1. Two thoughts: (1) a colleague of mine, resisting a certain physics-driven account of colors, was told: "You must think our knowledge of colors is a priori." To which my colleague responded: "No, I think we *see* them." (2) It doesn't get much play these days, but what you say reminds me in some ways of Virgil Aldrich's distinction between the funsters and the workers in philosophy, the former being drawn especially to the "impressively sophisticated and utterly implausible."

  2. Thanks. I like both thoughts and will have to look up Virgil Aldrich.

  3. I don't know how many people will be able to use these links (they aren't really links, of course, but you can cut and paste easily enough), but my search for Aldrich quickly led me to these: http://www.jstor.org/stable/2023489?seq=1 and http://www.jstor.org/stable/20009546. They look like gems.

  4. Since my access to JSTOR is by means of an academic portal, it took some hacking to find the corresponding URLs; but having found them and copied the pertinent passages into a file, I thought it worthwhile to reproduce them here:

    (1) Virgil C. Aldrich, "Reflections on Ayer's The Concept of a Person," The Journal of Philosophy, Vol. 62, No. 5 (Mar. 4, 1965), pp. 111–12:

    In this light, I like to think of the Ayers as the "funsters" in philosophy, and the Wittgensteins as the "workers." The funsters make unto themselves idols of the theater and the market place, not to worship, but to play with in all seriousness and absorption, like a child. For them, the point of such idols is that they idle. They are not instruments to live or work with. The human spirit comes into its own in the serious speculation concerning them, a consideration that makes them appear to be "real possibilities." The workers are the iconoclasts in this regard. No concept, no bit of language, must be left idling. The workers work to lay the ghosts spawned by the idling of language and thought, in favor of the realities with which, and by which, men live. The history of philosophy is the story of the conflict between the funsters and the workers in philosophy.

    (2) Virgil C. Aldrich, "Sight and Light," American Philosophical Quarterly, Vol. 11, No. 4 (Oct., 1974), p. 317:

    The funsters in philosophy are those who like to stagger the imagination with notions of what is logically or theoretically possible, stating this in a way that suggests not only that these suppositions are contradiction-free, but that they might be actually true of some existing state of affairs not only elsewhere—say on Mars—but even right here and now. What gives the green light for such procedure is the assumption that, given a cluster of terms naturally affiliated in plain talk, what each of them "means" is nevertheless logically independent of the others, such that combinations of them in declarative sentences express what is only contingently true. Thus, for example, what "see" means is said never to be conceptually tied with what "eye" or "light" means. So, it is theoretically (logically) possible for someone to see something without light or eyes. The philosophical funsters, then, are those who tend to think of each concept as having its own essence, which logical autonomy allows it to be meaningful even when thus construed in abstraction from natural affiliations with other concepts, as it is when applied to something to which the affiliates can not be significantly applied. Plato and Descartes are magnificent exemplars of this style of thinking, and they have followers who currently carry on as if Aristotle, Wittgenstein, and Austin had never lived. (Kant has a leg in both camps; so has Strawson.)

  5. Thanks, MKR! That's what I should have done.
