The fundamental properties of computing
Physics works with fundamental properties such as mass, speed, acceleration, energy, and so on. Quantum mechanics has a well-known trade-off between position and momentum: you can know where I am, or how fast I am going, but not both at the same time.
Algorithms (and their implementations) also have fundamental properties. Running time and memory usage are the obvious ones. In practice, there is often a trade-off between memory usage and running time: you can have low memory usage or a short running time, but not both. Michael Mitzenmacher reminded me this morning of another: correctness. On some difficult problems, you can get both low memory usage and a short running time if you accept an approximate solution.
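As a concrete illustration (my sketch, not something from the post), a Bloom filter trades correctness for memory: it answers "have I seen this item?" in constant time using a small, fixed bit array, at the price of occasional false positives. The class name and the parameters `num_bits` and `num_hashes` are illustrative choices, not tuned values.

```python
import hashlib

class BloomFilter:
    """Approximate set membership: may report false positives,
    never false negatives, using a fixed-size bit array."""

    def __init__(self, num_bits=1024, num_hashes=3):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8)

    def _positions(self, item):
        # Derive num_hashes bit positions from one digest of the item.
        digest = hashlib.sha256(item.encode()).digest()
        for i in range(self.num_hashes):
            chunk = int.from_bytes(digest[4 * i: 4 * i + 4], "big")
            yield chunk % self.num_bits

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

seen = BloomFilter()
seen.add("apple")
print("apple" in seen)   # True
print("banana" in seen)  # False (with high probability)
```

A thousand bits suffice here no matter how many items are inserted; the only thing that degrades is the false-positive rate.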
I believe there are other fundamental properties, such as latency. Consider problems where both the input and the solution are large: statistics, image processing, finding some subgraph or sublist, text compression, and so on. In such instances, the solution comes out as a stream, and you can measure the delay between the input and the output. For example, a program that compresses text by first scanning the whole text might have high latency, even if the overall running time is not large. Similarly, we can give the illusion that a Web browser is faster by beginning to render the Web page sooner, even if the overall running time of the rendering is the same. As another example, I once wrote a paper on computing the running maximum/minimum of an array where latency was an issue.
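To make the latency point concrete, here is a sketch of the standard deque-based sliding-window maximum (not necessarily the exact algorithm from the paper): it emits each window's maximum as soon as that window is complete, instead of waiting until the whole array has been read. The function name and the example window size are illustrative.

```python
from collections import deque

def sliding_maximum(values, window):
    """Yield the maximum of the last `window` values, emitting each
    result as soon as it is available (low latency on a stream)."""
    # Deque of (index, value) pairs; values are kept in decreasing order,
    # so the front is always the maximum of the current window.
    candidates = deque()
    for i, value in enumerate(values):
        # Drop candidates that the new value dominates.
        while candidates and candidates[-1][1] <= value:
            candidates.pop()
        candidates.append((i, value))
        # Drop the front candidate once it falls out of the window.
        if candidates[0][0] <= i - window:
            candidates.popleft()
        if i >= window - 1:
            yield candidates[0][1]

print(list(sliding_maximum([1, 3, 2, 5, 4, 1], 3)))  # [3, 5, 5, 5]
```

A version that first reads the entire array and then computes all the maxima would have the same asymptotic running time, but much worse latency on a long stream.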
It would be interesting to come up with a listing of all the fundamental properties of computing.