Daniel Lemire's blog


The shopper’s dilemma: wait for new technology or buy now?

6 thoughts on “The shopper’s dilemma: wait for new technology or buy now?”

  1. IraqiGeek says:

    Whenever there is interest, there is also a “hurdle rate.” That is, how much benefit (material or otherwise) does an individual derive from using a device during the period until the next one arrives?

    Some might simply derive less benefit (real or perceived) from owning the device over that period. Others would get more benefit (the pleasure of owning the latest and greatest is a form of benefit) from owning a certain technology for the same period.

    In most cases I think there is a point at which (for a given technology) further development will yield marginally better improvement/experience from the user’s perspective. Just as an example, the difference in image quality of LCDs from 10 years ago compared to today isn’t as big as when compared to those from the late 90s. Ditto when comparing computers from the 90s, to those from the late 00s, to those on sale today, from the perspective of the average user (browsing, email, media consumption, and the like).

    Still, I think it would be very fascinating if anyone could objectively quantify this “hurdle rate” for any device category or technology.

  2. Peter Turney says:

    The newest thing is also the most expensive. If you balance cost with value, it might be best to always buy last year’s model. Furthermore, the newest thing might have flaws that haven’t yet been exposed. That’s why we have the phrase “the bleeding edge”.


    1. Markus Schaber says:

      That’s exactly my tactic. Buying last year’s technology at a much lower price still brings me enough benefit for all my uses, for less money, and still the joy of a new phone every year or two.

  3. Peter Capek says:

    This effect was observed a few years ago with respect to Moore’s law and the improving speed and cost of computation. Should I buy a computer to do my computation today, or wait six months and buy a faster computer? I recall a paper which included the word “slacking” in the title: what is the value of delaying before starting the computation?

  4. degski says:

    Option 2 is rubbish. With that perspective, no one [tech people] would ever buy anything. It’s the same reason why a [central bank’s] target of 2% inflation is rubbish: it’s only there to take your money, legalized theft.

    Luckily this dilemma [the posted conundrum] will disappear. We will disappear as we consume [and procreate] ourselves into oblivion. We consume here and pollute there, and wealth extraction never ceases. The real interest is there: the interest our [your] children will pay for today’s consumerism and the neoliberal religion.

    Some evening reading: https://www.theguardian.com/commentisfree/2019/apr/25/capitalism-economic-system-survival-earth

  5. Oren Tirosh says:

    Such predictions are based on assuming rational behavior and may not be worth much for something as emotionally charged as consumer behavior.

    If you had a big computational task to complete by a certain deadline, cloud computing did not yet exist, and Moore’s law was still on track, you could actually finish sooner and save money by waiting a while before you start, so that you could buy faster hardware.

    This was the case for the task of rendering the CGI-heavy Lord of the Rings movies.
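    The wait-vs-start tradeoff in the last two comments can be made concrete with a back-of-the-envelope calculation. A minimal sketch, assuming hardware speed doubles every fixed period (the function names and the 18-month doubling period are illustrative assumptions, not anything from the thread or the "slacking" paper):

    ```python
    import math

    def total_time(work_months, wait_months, doubling_months=18.0):
        """Total months to finish if we wait `wait_months` before buying
        hardware that is 2**(wait_months / doubling_months) times faster."""
        return wait_months + work_months / 2 ** (wait_months / doubling_months)

    def optimal_wait(work_months, doubling_months=18.0):
        """Wait that minimizes total_time, from setting its derivative to zero:
        t* = D * log2(W * ln 2 / D). If the job is short, don't wait at all."""
        t = doubling_months * math.log2(work_months * math.log(2) / doubling_months)
        return max(t, 0.0)
    ```

    For a job that would take 60 months on today’s hardware, the optimum is to wait roughly 22 months and still finish in under 48 months total; for a 1-month job, the optimal wait is zero. This only holds while the exponential improvement actually continues, which is exactly the caveat Oren raises.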