Borrowers rarely default on their loans. Nevertheless, differences in default rates have a huge effect on rates of return. Suppose, for example, that two lenders charge 3% interest, but one has a default rate of 1% and the other has a default rate of 2%. The first lender earns roughly twice the rate of return of the second. After all, when the first guy lends out $100, he gets back 0.99 × $103 = $101.97, while the second only gets back 0.98 × $103 = $100.94.
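To spell out the arithmetic, here is a minimal sketch in Python (purely illustrative, assuming a defaulting borrower repays nothing):

```python
def expected_payoff(principal, interest_rate, default_rate):
    """Expected repayment when a defaulting borrower pays back nothing."""
    return (1 - default_rate) * principal * (1 + interest_rate)

# Two lenders, each charging 3% on a $100 loan:
print(f"{expected_payoff(100, 0.03, 0.01):.2f}")  # 101.97 -> a 1.97% return
print(f"{expected_payoff(100, 0.03, 0.02):.2f}")  # 100.94 -> a 0.94% return, about half
```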

When labor economists calculate the return to education, however, an analog of default is strangely missing. How is that possible? Simple: Labor economists normally measure the effect on earnings of successfully completed years of schooling. In other words, they assume the best-case scenario, in which every educational investment ends with a year's worth of passing grades.

In reality, though, people often enroll for a year of school, attend for a while, and then give up. And sometimes they attend for a full year, but get failing grades. Either way, they incur most or all of the cost of a year of education (including foregone earnings), with little or no benefit. It’s analogous to defaulting on a loan – you spend the resources, but don’t get the return.

The omission matters. Just as with the lenders, the default rate comes roughly straight off the top: if there is a 7% return to successfully completed education, but a 3.5% default rate, the expected rate of return is only about 3.5% – half as big as the naive estimate.
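Concretely, a hypothetical sketch of that calculation, assuming a student who gives up or fails gets nothing back for the year (the exact figure lands a shade under the simple 7% minus 3.5% subtraction):

```python
def expected_education_return(return_if_completed, default_rate):
    """Approximate expected return when failed or abandoned years pay off nothing."""
    return (1 - default_rate) * (1 + return_if_completed) - 1

print(f"{expected_education_return(0.07, 0.035):.1%}")  # 3.3% -- roughly 7% minus 3.5%
```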

An important corollary is that, properly measured, the higher a student’s probability of dropping out or failing, the lower his return to education. Many labor economists make a big deal about the finding that marginal students seem to have a normal rate of return. Brad DeLong writes, for example:

Ceci Rouse and Orley Ashenfelter of Princeton University report that they find no signs that those who receive little education do so because education does not pay off for them: If anything, the returns to an extra year of schooling appear greater for those who get little education than for those who get a lot.

The problem is that these calculations usually assume that marginal students are certain to finish what they start. If they are unusually likely to give up or fail out, then their properly measured return to education plummets. And there’s every reason to think that marginal students are more likely to give up or fail out.

Thus, imagine we admit an average college grad to MIT’s School of Engineering for a year. If he actually finishes, it’s easy to believe that he’ll earn a normal payoff for giving up a year of his life. The catch is that it would be a miracle if an average college grad could actually finish a year at MIT.

On some level, I suspect that most labor economists would grant my point. If an average high school student got into both Princeton and GMU, it’s hardly clear that he’s better off going to Princeton. Yes, he’ll earn more money if he finishes at Princeton, but he probably won’t finish because the competition is too intense.

Once you buy this point, however, it also follows that if an average high school student could either work or go to GMU, he might be better off working. Yes, he’ll earn more money if he finishes at GMU, but there is a good chance that he’ll invest a lot of time and money, then give up or fail out.