Daniel Lemire's blog


If you claim high scalability…

I just reviewed a paper in which the authors come up with a nice, highly scalable algorithm. And it is really scalable too! But to prove just how fast it is, they process 2,000 data points.

This is correct, strictly speaking. Their algorithm runs in O(n) time, so to know how long it would take to process 1,000 times more data, just multiply the running time by 1,000.

But where is the fun in that?
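If one actually wanted to demonstrate the scaling rather than extrapolate it, a minimal benchmark sketch along these lines would do. This is Python, with a hypothetical linear-time process function standing in for whatever algorithm the paper proposes:

```python
import random
import time

def process(data):
    # Hypothetical stand-in for a linear-time algorithm:
    # a single pass over the data.
    total = 0.0
    for x in data:
        total += x
    return total

# Time the algorithm on inputs growing from 2,000 to 2,000,000 points,
# so the linear scaling is demonstrated rather than merely asserted.
for n in (2_000, 20_000, 200_000, 2_000_000):
    data = [random.random() for _ in range(n)]
    start = time.perf_counter()
    process(data)
    elapsed = time.perf_counter() - start
    print(f"{n:>9} points: {elapsed:.4f} s")
```

If each tenfold increase in input size costs roughly ten times the running time, the scalability claim is backed by measurements instead of arithmetic.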