One of the worst problems with conventional return-to-education estimates is that they ignore drop-outs. That’s like a bank ignoring defaults when it calculates its return on loans. According to a recent experiment, a lot of parents ignore drop-out risk, too. From Andrew Kelly and Mark Schneider:
We asked a representative sample of one thousand parents of high-school-age children in five states to choose between two public colleges in their state based on their own judgments and information we provided to them. Respondents were randomly assigned to a treatment or control group. Treated respondents received the same set of basic facts as the control group as well as information about each school’s six-year graduation rate…Overall, we found that providing graduation-rate information increased the probability that parents would choose the institution with the higher graduation rate by about 15 percentage points. [emphasis mine]
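The bank analogy above can be made concrete with a toy expected-value calculation. All numbers below are hypothetical, purely for illustration; they are not estimates from the Kelly-Schneider study:

```python
# Illustrative only: hypothetical numbers, not figures from the post.
# By analogy with a bank discounting its loan returns by the default
# rate, we discount the graduate earnings premium by the drop-out rate.

def expected_premium(grad_rate, premium_if_graduate, premium_if_dropout):
    """Expected earnings premium for an enrollee, weighted by completion odds."""
    return grad_rate * premium_if_graduate + (1 - grad_rate) * premium_if_dropout

# Suppose (hypothetically) graduates earn a 60% premium, drop-outs only 10%.
high_grad_school = expected_premium(0.80, 0.60, 0.10)  # 80% six-year grad rate
low_grad_school  = expected_premium(0.40, 0.60, 0.10)  # 40% six-year grad rate
print(round(high_grad_school, 2), round(low_grad_school, 2))
```

Under these made-up numbers, ignoring drop-out risk overstates the expected premium at the low-completion school by half: 30% rather than the naive 60%.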
Particularly striking: Kelly and Schneider provide indirect support for a point Beaulier and I made about preferential admissions (a.k.a. affirmative action) in “Behavioral Economics and Perverse Effects of the Welfare State.” Us:
Could giving minority students more choices make them worse off? It could if they are unrealistically optimistic about their probability of success, leading them to choose an opportunity beyond their capabilities. Self-serving bias might also incline each student to assume that he was admitted on his own merits: “If I were being admitted because of affirmative action, I should be worried. But unlike many other students, I was accepted on my merits.”
Consistent with this worry, Kelly and Schneider find that less-educated parents are much more influenced by the absence of information about the graduation rate. Parents with no college were 23 percentage points more likely to choose the more selective school when they weren’t told the graduation rate. The information effect for parents with college degrees was only 7 percentage points.
If you wonder, “How much does this have to do with real-world college selection?,” Kelly and Schneider have a pretty good answer:
[B]ecause respondents were asked about real public colleges and universities in their region and received true information about school characteristics, the experiments discussed below have considerable external validity.
Makes sense to me.
READER COMMENTS
Daniel Kuehn
Feb 8 2011 at 11:39am
Could you clarify something from the earlier post that you linked to on the returns to education studies?
So the problem you raise – whether an additional year is really a full year of schooling – seems to me to just introduce some fuzziness and error in your measure of schooling, right?
That’s obviously not good, but I don’t see how it irreparably harms the estimate. So some people who are counted as getting 11 years of school actually get 11 and some actually get only 10.1 years of schooling. OK. That’s not great but the estimates should still be useful. If you include discontinuities for receipt of an actual degree (regardless of years of schooling), that should clean up a lot of the problems, I would think.
Is this really as bad as you make it out to be? It seems to me that the potential endogeneity biases in the estimates are considerably more substantial than the measurement errors that you point out here. If I had to prioritize problems with the estimates, that’s where I’d look.
gabriel rossman
Feb 8 2011 at 12:59pm
>Could giving minority students more choices make them worse
>off? It could if they are unrealistically optimistic about their
>probability of success, leading them to choose an opportunity
>beyond their capabilities.
My reading of the evidence on the “mismatch” hypothesis is that its relevance/validity is largely a function of the drop-out rate.
Doug Massey has shown that on the margin between moderately selective and highly selective undergrad schools, the mismatch hypothesis is completely swamped by the much higher completion rates at highly selective schools. In other words, Professor Caplan’s speculation in the 2/2/07 post that a student who gets into GMU+Princeton should choose GMU on the “big fish in a small pond” principle is understandable but wrong. In fact, anyone who is admitted to Princeton (even under partially non-academic criteria) is better off going there than GMU (or UCLA) because the baseline odds-ratio of completing Princeton vs GMU is about 6:1, and it doesn’t go down to anywhere close to parity even when you control for the micro.
On the other hand, Richard Sander has shown that with law school there is some support for mismatch, mostly because of the role that class rank (or its close correlates) plays in the legal job market.
The less sexy but in some ways more important question that AFAIK nobody has addressed empirically is the margin between non-attendance and low selectivity schools (which tend to have extraordinarily high attrition).
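Gabriel Rossman’s 6:1 odds-ratio claim is easy to misread as a probability ratio. A quick sketch of what it implies, using a made-up baseline completion rate (the comment gives only the ratio, not the underlying rates):

```python
# A sketch of what a 6:1 completion odds-ratio implies. The 60% baseline
# completion rate for GMU below is hypothetical, chosen only to show the
# arithmetic; the comment states the odds ratio, not the actual rates.

def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

def prob_from_odds(o):
    """Convert odds back to a probability."""
    return o / (1 + o)

gmu_completion = 0.60                          # hypothetical baseline rate
princeton_odds = 6 * odds(gmu_completion)      # apply the 6:1 odds ratio
princeton_completion = prob_from_odds(princeton_odds)
print(round(princeton_completion, 2))          # → 0.9
```

So a 6:1 odds ratio on a 60% baseline means roughly 90% vs. 60% completion, not six times the graduation probability.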
Steve Sailer
Feb 8 2011 at 4:30pm
There are a few well-known colleges that are notoriously hard to graduate from — Cal Tech, Reed, U. of Illinois — but most famous colleges are, if anything, notoriously easy to graduate from: e.g., Stanford.
Certain majors can be hard to graduate from at Stanford — I know a smart man who was semi-involuntarily moved from physics to electrical engineering, and I’m sure lots of people who started out as S/Es wound up in sociology. But it’s really easy to successfully major in something at Stanford.
Mr. Econotarian
Feb 8 2011 at 7:05pm
The problem is that the relevant statistic is not “percentage drop-out” but “my chance of dropping out”.
If you have rich parents footing the bill, are a straight-A student, and take a slightly less demanding major, your chance of graduating might be high.
If you have to work your way through school, are closer to a B student, and take a harder major, perhaps your chance of graduating is not high (research would need to be done on this).
There also is a price issue. At the University of Maryland (many years ago when it was cheap), failing some classes and taking an extra year or two to graduate didn’t seem so bad, since it was not expensive. The problem is that many students who missed graduating in four years because of academic problems would eventually give up after year 5, 6, or 7.
At expensive schools, no one takes an extra year to graduate because it is insanely expensive. The option is never open. It is “do or die”.
Noah Yetter
Feb 8 2011 at 7:45pm
gabriel rossman said:
In fact, anyone who is admitted to Princeton (even under partially non-academic criteria) is better off going there than GMU (or UCLA) because the baseline odds-ratio of completing Princeton vs GMU is about 6:1…
This is a profoundly wrong-headed way for a student to think about the question. Were we behind a Rawlsian Veil of Ignorance, sure, this probability would be a highly valuable data point. But we are not. As Mr. Econotarian elucidates above, students have information about themselves, information whose predictive power completely swamps the naive “odds of completion” figure. In simpler language, the overall graduation rate isn’t terribly relevant or useful to me, the individual student.
lemmy caution
Feb 9 2011 at 4:47pm
“students have information about themselves, information whose predictive power completely swamps the naive “odds of completion” figure. In simpler language, the overall graduation rate isn’t terribly relevant or useful to me, the individual student.”
Let’s say that 5% of NCAA Division I basketball players go to the NBA. If a high school kid tells you that he knows he will go to the NBA because of “information about themselves, information whose predictive power completely swamps the naive ‘odds of entering the NBA’ figure,” wouldn’t he likely be wrong? The kid knows that he works hard and has talent, but everyone playing NCAA Division I basketball is a good basketball player; they are still unlikely to go to the NBA.
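The base-rate point above can be sketched with a rough Bayesian calculation. All the signal reliabilities below are hypothetical, chosen only to show that even a fairly informative private signal leaves the posterior low when the base rate is 5%:

```python
# A rough Bayesian sketch of the base-rate argument. The base rate comes
# from the comment; the two conditional probabilities are hypothetical.

base_rate = 0.05              # share of D-I players who reach the NBA
p_signal_given_nba = 0.90     # future pros who are sure they'll make it
p_signal_given_not = 0.30     # non-pros who are also sure they'll make it

# Bayes' rule: P(NBA | "I know I'm good")
posterior = (p_signal_given_nba * base_rate) / (
    p_signal_given_nba * base_rate + p_signal_given_not * (1 - base_rate)
)
print(round(posterior, 3))    # → 0.136
```

Even with this generous private signal, the kid’s chance rises only from 5% to about 14% — still very likely wrong.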
Glen
Feb 9 2011 at 7:40pm
Bryan has a great point, but it could be misunderstood as supporting the fetishizing of graduation rates. As someone who works at a university with a high drop-out rate, I can tell you the administration is intensely interested in raising graduation rates. But that goal tends to result in pressure on tough-grading departments to lower their standards.
The pressure to lower standards is never explicit, of course. The idea is that we should adopt teaching strategies that could help at-risk students stand a better chance of passing. But those teaching strategies tend to be more time-intensive, thereby creating an incentive for instructors to simply lower standards instead. Also, these strategies often involve spending more time on each topic, thereby reducing the breadth of curriculum and hence lowering standards in a different way.