The number of new computer science majors has fallen by half since 2000, according to the Higher Education Research Institute at UCLA. Merrilea Mayo, director of the Government-University-Industry Research Roundtable at the National Academies, says the drop-off has been particularly pronounced among women.
Meanwhile, elite schools are reporting that the number of economics majors is exploding. For the 2003–2004 academic year, the number of economics degrees granted by U.S. colleges and universities increased 40 percent from five years previously.
This is not necessarily a bad thing. It may be that in the past, developing a useful computer-based product required a computer science degree. But as computers have become easier to use and more generic, you no longer need to be a computer science major to develop important applications. Think of Google hiring Hal Varian and a team of economist/quants to optimize its marketing by making better use of its data.
Yes, we still need cutting-edge innovation in computer science, but that is the work of really top-flight people, well into the 99th percentile of the distribution of intelligence. College students between the 80th and the 99th percentile might do more good putting the technology to use than attempting to help out on the bleeding edge.
In general, the role of diffusing and utilizing innovation tends to be under-stressed relative to innovation itself. In an email, Nick sent me a link to a paper by one of my favorite economists, Amar Bhide. The paper argues that venturesome consumers are an important component of the ecosystem of innovation. He suggests that this means the United States will tend to benefit the most from innovation, regardless of where the innovation is first developed.
It could be that majoring in economics will turn students into the venturesome consumers that the computer industry needs. What the industry might not need is an army of less-than-stellar computer scientists.