Software evolves by natural selection
Software evolves by natural selection, not by intelligent design. It is a massive trial-and-error process. Many thousands of programmers work every day to build new things in the hope of replacing the old…

From time to time, you will hear about a fantastic new piece of computer science. Right now, deep learning is the hot new thing. Some years ago, people were very excited about MapReduce. As an ecosystem changes, some tools become less likely to be useful while others gain dominance in common use cases. Because of a myriad of other changes, deep learning is suddenly a lot more useful than it would have been 10 years ago. Deep learning is great at classifying data such as images, given the kind of computing power a modern GPU offers. Decades ago, we had much less data and nothing like a modern GPU, so perceptrons (the ancestors of deep learning) were useless.

What does this mean for the programmer? If software advances by trial and error, then the more quickly you can move and adapt, the better your chances. It also becomes important to anticipate changes and to try many things. Trial and error is an underrated approach in software. You will no doubt object that you are using the latest stuff… maybe it is deep learning running on the sexiest big-data framework (e.g., Apache Spark) with the fanciest programming language. So you are set, right? Maybe, but be concerned: even a brand new bus turns slowly. Don’t be a bus.
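As an aside, if you have never seen one, a perceptron is simple enough to sketch in a few lines. The snippet below is a toy illustration, not anything from the argument above: the function name and the tiny made-up dataset are mine, and deep learning stacks many such units and needs far more data and compute to be worthwhile.

```python
# A minimal perceptron sketch (illustrative only): a single linear unit
# trained with the classic perceptron update rule on a toy dataset.

def train_perceptron(samples, labels, epochs=10, lr=0.1):
    """Learn weights and a bias for linearly separable data (labels are +1/-1)."""
    n_features = len(samples[0])
    weights = [0.0] * n_features
    bias = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            activation = sum(w * xi for w, xi in zip(weights, x)) + bias
            if y * activation <= 0:  # misclassified: nudge the separating hyperplane
                weights = [w + lr * y * xi for w, xi in zip(weights, x)]
                bias += lr * y
    return weights, bias

# Toy data: points roughly above the line x1 + x2 = 1 are +1, below are -1.
samples = [(0.0, 0.0), (0.2, 0.3), (1.0, 1.0), (0.9, 0.8)]
labels = [-1, -1, +1, +1]
w, b = train_perceptron(samples, labels)
print(w, b)
```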
So what is the software of the future? Where should you place your bet? I have been making a boring prediction for more than a decade, and it has served me well: we will have more data. We should think about what could become possible if we were given 10x as much data and 10x as much processing power. What kind of software would we need then?
It is a safe bet that current PC architecture cannot cope. So do not get too attached to it.