Posner brings the economics profession to task. The embrace of rational expectations and the efficient-markets hypothesis, and the tendency of the discipline to reward impressively sophisticated and utterly implausible models, contributed to the ideational environment in both business and government that led us off this cliff.

The reference to the "impressively sophisticated and utterly implausible" reminds me of some things that have been said about David Benatar's book Better Never to Have Been (see the discussion at The HEP Spot, for instance, as well as Christopher Cowley's paper). Here's part of the description of the book from Amazon:
David Benatar argues that coming into existence is always a serious harm. Although the good things in one's life make one's life go better than it otherwise would have gone, one could not have been deprived by their absence if one had not existed. Those who never exist cannot be deprived. However, by coming into existence one does suffer quite serious harms that could not have befallen one had one not come into existence. Drawing on the relevant psychological literature, the author shows that there are a number of well-documented features of human psychology that explain why people systematically overestimate the quality of their lives and why they are thus resistant to the suggestion that they were seriously harmed by being brought into existence. The author then argues for the 'anti-natal' view---that it is always wrong to have children---and he shows that combining the anti-natal view with common pro-choice views about foetal moral status yields a "pro-death" view about abortion (at the earlier stages of gestation). Anti-natalism also implies that it would be better if humanity became extinct. Although counter-intuitive for many, that implication is defended, not least by showing that it solves many conundrums of moral theory about population.

"Although counter-intuitive for many" is as fine an understatement as you are likely to read today.
My undergraduate degree was in Philosophy, Politics, and Economics, and involved taking courses in all three of these subjects. After a year of this we had to choose just eight more courses to take, and these did not need to be in more than two of the three subjects. I dropped Economics, because it seemed to offer little more than incoherence. Most of the textbooks in the subject taught that Keynesianism was demonstrably false. The alternative, as it was presented to me, was monetarism, which was also (supposedly) demonstrably false. I was given a newspaper article with the headline "Monetarism is Dead" explaining why this policy was impracticable (and had been effectively abandoned by the officially monetarist government), as well as a long mathematical demonstration that it was based on faulty a priori reasoning too. So no one -- neither the Keynesians nor the anti-Keynesians -- seemed really credible, and the mathematics involved was all based on highly implausible assumptions of perfect information, pure self-interest, etc. No economist was so stupid as to insist that these assumptions were really true of the world, but they generally held that this did not matter, because things tended to go roughly that way, so the results of economic calculations should be roughly accurate. I think there is some reason to doubt that this is correct (and some economists share this skeptical view), which is why I think undergraduates should be taught economic history before they study economic theory. But even if we do put our faith in a priori economics, there remains the question: whose economics? The Keynesians' or the anti-Keynesians' (or the neo-Keynesians')? I remain skeptical about the whole business.
Anyway, the point I want to make is less about economics and more about what might be called geekiness or the love of the cool. The sophisticated but implausible is often found to be exciting by a certain kind of person. It is pretty much the opposite of what Cora Diamond has in mind when she talks about the realistic spirit. Philosophy done in the spirit of Diamond's work can be sophisticated and exciting too, but it is only going to be implausible in a limited sense, i.e. to someone in the grip of a theoretical picture. That might sound like begging the question, but Diamond's views (and those like them) do not have the kind of profound (or common-sense) implausibility that one (seemingly--I still haven't read it) gets in Benatar's book, for instance. The idea that Wall Street does not need more regulation is commonsensically implausible too, it seems to me.
So really I have two points. The first is that, if Posner is right, then a geeky preference for the unrealistic has done real and immense harm to the world economy. The second is that maybe we could change this kind of preference, or ensure that disciplines are less dominated by people with such preferences, if we educated people differently.
I'm not sure about this, but I wonder whether courses in the humanities might serve as a useful kind of gatekeeper. Maybe that would be unfair. And maybe it wouldn't work because the humanities themselves are already influenced by geeky thinking--"there is no such thing as truth," for instance, seems both implausible and defended by sophisticated arguments, and it is an idea familiar to most professors of literature. But now I sound like a grumpy old man railing against ideas from France and the 1960s.
Another alternative might be to discourage (not ban or abandon, but draw back from) a priori reasoning and theoretical models and encourage empirically-based work instead. But that is too easily misunderstood as a rejection of the humanities. Empirical work without empiricism might be what I mean, so that ordinary experience and the kind of experience captured in literature would count as important too.