Borrowers rarely default on their loans. Nevertheless, *differences* in default rates have a huge effect on rates of return. Suppose, for example, that two lenders charge 3% interest, but one has a default rate of 1% and the other has a default rate of 2%. The first lender has *twice* the rate of return of the second. After all, when the first guy lends out $100, he gets back .99*$103=$101.97, while the second only gets back .98*$103=$100.94.
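The lender arithmetic above can be sketched in a few lines of Python (the function name is my own illustrative choice; a defaulted loan is assumed to recover nothing):

```python
def expected_payback(principal, rate, default_rate):
    """Expected amount collected, assuming a defaulted loan recovers $0."""
    return (1 - default_rate) * principal * (1 + rate)

first = expected_payback(100, 0.03, 0.01)   # 0.99 * $103 = $101.97
second = expected_payback(100, 0.03, 0.02)  # 0.98 * $103 = $100.94

# Net returns: $1.97 vs. $0.94 -- the first lender earns roughly twice as much.
print(first - 100, second - 100)
```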

When labor economists calculate the return to education, however, an analog of default is strangely missing. How is that possible? Simple: Labor economists normally measure the effect on earnings of *successfully completed* years of schooling. In other words, they assume the best-case scenario where every educational investment concludes with a year's worth of passing grades.

In reality, though, people often enroll for a year of school, attend for a while, and then give up. And sometimes they attend for a full year, but get failing grades. Either way, they spend most or all of the cost of a year of education (including foregone earnings), with little or no benefit. It's analogous to defaulting on a loan - you spend the resources, but don't get the return.

The omission matters. If there is a 7% return to successfully completed education, but a 3.5% default rate, the *expected* rate of return is roughly 3.5% - half as big as the naive estimate.
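A minimal sketch of that adjustment, assuming (as in the loan example) that a "defaulted" year costs the full investment and pays nothing:

```python
def expected_rate_of_return(r, d):
    # A default wipes out both the investment and its return, so per dollar:
    # E[return] = (1 - d) * (1 + r) - 1, which is close to r - d for small rates.
    return (1 - d) * (1 + r) - 1

print(expected_rate_of_return(0.07, 0.0))    # naive estimate: 7%
print(expected_rate_of_return(0.07, 0.035))  # about 3.3%, roughly 7% - 3.5%
```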

An important corollary is that, properly measured, the higher a student's probability of dropping out or failing, the lower his return to education. Many labor economists make a big deal about the finding that marginal students seem to have a normal rate of return. Brad DeLong writes, for example:

Ceci Rouse and Orley Ashenfelter of Princeton University report that they find no signs that those who receive little education do so because education does not pay off for them: If anything, the returns to an extra year of schooling appear greater for those who get little education than for those who get a lot.

The problem is that these calculations usually assume that marginal students are certain to finish what they start. If they are unusually likely to give up or fail out, then their properly measured return to education plummets. And there's every reason to think that marginal students are more likely to give up or fail out.

Thus, imagine we admit an average college grad to MIT's school of engineering for a year. If he actually finishes, it's easy to believe that he'll earn a normal payoff for giving up a year of his life. The catch is that it would be a *miracle* if an average college grad could actually finish a year at MIT.

On some level, I suspect that most labor economists would grant my point. If an average high school student got into both Princeton and GMU, it's hardly clear that he's better off going to Princeton. Yes, he'll earn more money if he finishes at Princeton, but he probably won't finish because the competition is too intense.

Once you buy this point, however, it also follows that if an average high school student could either work or go to GMU, he might be better off working. Yes, he'll earn more money if he finishes at GMU, but there is a good chance that he'll invest a lot of time and money, then give up or fail out.

It doesn't help that the copious amounts of financial aid available encourage marginal (and less than marginal) students to attend college at an abnormally high rate.

Three cheers for the incisive Bryan Caplan!

Caplan is such a genius. Of course, people's abilities are fixed. And someone who receives a failing grade is obviously a failure, and obviously did not learn a thing.

For such failures, obviously serving burgers to Caplan when he goes to McDonald's is more efficient, and hence superior, to trying to advance themselves. If you are an inferior person, you are a serf, meant to serve the likes of the superior Caplan. The sooner you accept that, the more efficient life will be.

If only everyone could be as insightful and as brilliant as Caplan.

So, instead... first wasting 1-4 years of their lives and going neck-deep into loans and THEN serving burgers to Caplan is better for them?

You're a very brilliant man. I await your future posts with unfettered anticipation and awe.

I'm currently taking a financial accounting class full of people who think that taking two tries to get a passing grade in financial accounting is nothing out of the ordinary. Maybe the professor should take a sidetrip over to microeconomics and give them a lesson on deadweight loss.

Isn't the Princeton/GMU argument essentially the same one Thomas Sowell was making about the effects of affirmative action almost two decades ago? Still a brilliant observation.

Bryan's point is particularly valid for law school. A friend of mine spent three years at law school, then about seven years trying to pass the bar exam, working as a hospital orderly to pay his bar exam prep bills, before finally giving up. That's eight years of his life down the drain. He had an engaging personality but just wasn't smart enough to be a lawyer. If he'd become a salesman at age 22, he might have been making six figures by the time he was 32, instead of $10 an hour.

Affirmative action in law school admissions means this syndrome is particularly hard on blacks: 53% of the black students who enter law school fail to qualify to become lawyers, versus 24% of white students. For all the numbers from Richard Sander's research, see:

http://isteve.blogspot.com/2006/09/sacrificing-smart-black-kids-on-altar.html

An excellent point - I look forward to seeing a more careful returns to education analysis that includes this effect.

Bryan,

Top notch, truly.

Well, one obvious lacuna here is tuition. So, another reason one might go to GMU and not Princeton, besides the higher probability of getting through to graduation at GMU, is that if one is an in-state student, the tuition is much lower - on the order of $100,000 in aggregate, if not more.

Very good point. The only counter-argument is that some of these data do include uncompleted education, such as surveys that ask how many years of study you have. To the extent that years spent without graduating do not increase earnings and are included in the study, they will reduce the measured payoff of education. Of course, it may be that there is a bias toward not reporting the “dropout” year.

The real problem with liberals such as DeLong is that they do not seem to understand that these are equilibrium results: the return to education for those that *chose* to get educated. It hardly means we can expand education for everyone and sit around like dimwits expecting the return to stay the same. Serious research on GEDs has already demonstrated this fallacy.

Since there are already massive subsidies and loan opportunities that allow almost everyone to go to school, people are probably choosing the right amount (or probably too much).

This is no different than DeLong reading a study finding that the marginal return of a dollar invested by Starbucks and by the hot-dog stand down the street is the same, and concluding that the hot-dog stand should get billions of government subsidies to expand!

One of the better posts I've read here. Reminds me of why I keep coming back. Thanks!!!

For a counterexample to the statement that "Borrowers rarely default on their loans," go to Prosper, which is more or less an eBay for unsecured loans. If anyone here thinks their judgement is better than average, this is a good place to test it out with real money.

Hopefully you can help me as I'm having some difficulty understanding some points. I'm not an economist, so bear with me.

In your first paragraph, the example implies that halving the default rate will result in twice the return rate, which obviously isn't correct in general. I assume this correlation was just an accident of the example, but it does bring to mind an interesting point about when return rate multiples make any sense. For if the second guy instead suffered a default rate of 2.9%, he'd get a return of $100.013; i.e. the first lender would have had ~150 times the return rate of the second. But surely it can't be said that the first guy is 150 times better off in this case? Things get even more absurd as the total return of the second lender approaches $100 (which occurs when the default rate reaches about 2.91%, just under 3%), since the first lender becomes *infinitely* better off than the second. So I guess my question is, what is a sensible interpretation of return rate multiples?
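The blow-up described here is easy to verify numerically; a quick sketch using the post's lending numbers (function name illustrative, defaults assumed to recover nothing):

```python
def net_return(default_rate, rate=0.03, principal=100):
    # Amount collected minus principal lent, with defaults recovering nothing.
    return (1 - default_rate) * principal * (1 + rate) - principal

# As the second lender's default rate approaches the break-even point
# (3/103, about 2.91%), the ratio of the two net returns grows without bound.
for d in (0.02, 0.029, 0.0291):
    print(net_return(0.01) / net_return(d))
```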

A second issue I have difficulty understanding is the analogy of 'return on education' with 'return on loan'. I assume return on education means a percentage increase in the sum over all one's future earnings in real terms (since I'm unsure about this, here's a source). So, if a person would earn 1 million over their life, then, all things being equal, they will earn 1.07 million if they successfully complete an extra year of education. Let us suppose also that the cost of education, again in real terms, for one extra year is $40,000 (to pick a round number). The problem here is that the 7% is calculated on future earnings, not on the cost of education to the student (assuming the student pays :) ). This is contrary to the lending situation, in which the 7% would be calculated on the cost of the loan to the lender. Doesn't this make a major difference in terms of how we interpret everything? In particular, (1.07m - 1m) - 40,000, which gives a modest $30,000 net return (3% of future earnings), but which is actually a 75% return on the $40,000 cost? (Since everything is in real terms.)

A further issue I have is with interpreting the default rate in education. Suppose this default rate is 3.5%. By this, I assume you mean that 3.5% of people who attempt one extra year do both of the following: 1) fail to complete the extra year successfully, and 2) fail to raise their future income. The first point is actually irrelevant, so I'll only consider the second. In this case, the 'defaulting' occurs not on the *loan* but on the *interest*. So, people who fail still get their 1m in income, and people who succeed get 1.07m in income. But this radically changes the analogy with the example in the first paragraph. That is, the expected value to someone attempting one more year is E(one_more_year) = 1.07m*0.965 + 1m*0.035 = 1.068m. In that case, (1.068m - 1m) - 40,000, still gives a $28,000 net return (2.8%), or a 70% return on the $40,000 in education costs.

Based on this, the default rate in education would have to be such that the cost of 40,000 is greater than E(one_more_year) - E(no_more_years). (Assuming E(no_more_years) does not already include the 40,000 cost.) This gives a required default rate of ~43% or greater. Given we're dealing with counterfactuals, that's still vaguely plausible, but very different to a default rate of 3.5%.
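The expected-value arithmetic above can be checked directly (all figures are this comment's hypotheticals: $1m baseline lifetime earnings, a 7% premium, a $40,000 cost, and a 'default' that forfeits only the premium):

```python
def e_one_more_year(default_rate, base=1_000_000, premium=0.07):
    # Failures keep the baseline; only successes collect the 7% premium.
    return (1 - default_rate) * base * (1 + premium) + default_rate * base

print(e_one_more_year(0.035) - 1_000_000)  # expected gain: about $67,550

# Break-even: the expected premium (1 - d) * $70,000 equals the $40,000 cost,
# which happens at d = 3/7, the ~43% default rate derived above.
print(1 - 40_000 / 70_000)
```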

I have a feeling I've missed something, though, so I'd appreciate any corrections to my reasoning.

So given Caplan's position as an educator in realizing this, I imagine two possible reactions:

1) He feels some sense of sympathy for those average students enrolled in his class, for he realizes that he isn't nearly as helpful to them as they would imagine. In fact, he may be the lecturer Adam Smith describes in The Wealth of Nations where he says:

Or for that matter, that he is so unhelpful that he may as well be speaking nonsense.

2) He feels that students who would consider taking 300-level (or graduate level) economics classes are rational. These students are aware that the rate of return to education is often overstated. That they are enrolled in the course would suggest that they are indeed above-average students, and therefore they will receive an above-average return to education for his course. He is more helpful than the average professor, and he feels quite good about himself.

Bryan's argument is interesting, but it is not a valid critique of Rouse and Ashenfelter's conclusions, or the literature as a whole. Their study compares identical twins raised in the same family. They find that the twin with less education generally has a higher return on the last year of education than the twin with more education. This is exactly what one would expect if there are diminishing returns to education.

Within an identical twins comparison, it is very unlikely that there is any difference in the probability of dropping out or failing.

You can always make the argument that there must be some unobserved difference which causes the variation in education choices between twins, and that these differences result in different probabilities of dropping out of school or failing, but this is a stretch, and Ashenfelter and Rouse do the necessary work to eliminate this as a likely scenario.

Is graduating from Princeton really that much harder than graduating from GMU?

It's generally easier to graduate from a private college in 4 years than from an equivalent public college. Over 5 or 6 years, the publics start catching up.

Two words: recovery rates.