Daniel Lemire's blog


The big scientific questions in computing have been answered

The Economist tells us about the downfall of Vannevar Bush’s research model: ideas are conceived in universities and then passed on to industry for commercialization. I honestly do not know whether this model was ever true. I do not think that academic research has ever been about producing ideas that industry can commercialize.

Many people are quite happy to assume that academic research addresses longer-term issues whereas industry must tackle immediate needs, but as the article points out, there are major flaws in this theory. Most industry researchers are part of solid teams where people all have (relatively) long-lasting jobs. Meanwhile, university researchers are often surrounded by students who come and go. In universities, researchers must constantly seek funding, whereas, in industry laboratories, funding is the manager’s problem. Longer-term thinking? Look at most university researchers’ publication lists: often we see lots of short papers and lots of short-term goals. Maybe these all fit together into a long-term plan, maybe they don’t. Meanwhile, industry researchers increasingly must go from the fundamental ideas all the way to the product, a much longer cycle in comparison.

It would be more honest to say that industry researchers have to try to deliver commercial products whereas academic researchers do not, rather than to pretend that one is complementary to the other. I do not think that industry researchers spend much time reading academic papers. They are too busy learning how their ideas can be taken to market. Even if you have the best laboratory in the world, with Claude Shannon, Hamming, and so on working for you, in 2007… it might not help you that much. The Nortels of the world have their software written in India and their hardware components designed in China.

What about Computer Science? The article is quite pessimistic about academic research:

In Bush’s time the science that went into computing was itself closer to basic research. By contrast, many of the big scientific questions in computing have been answered—at least well enough for companies to find that innovation emerges from new ways of arranging today’s technologies rather than inventing new ones.

Fine. But we have to be careful about what this means. It only means that, given our industrial needs and abilities, building a new generation of tools is not worth it. We prefer to build iPods out of existing parts. It is simply too expensive at this point, given the potential gains, to pursue drastically new concepts in computing. But that could all change suddenly.