By extrapolation, within 20 years, there will be cheap computers with more than 700 TB of RAM. So, not only will you be able to store the data, you’ll also be able to process it very quickly.
Mugizi says:
This is a great point that I have often thought about: soon it will be possible (and arguably it already is) to store DETAILED INFORMATION about every human being alive on an average computer.
Let’s say 7 billion people and 100,000 bytes per person for biographical data, employment history, and a small picture; that comes to 700 terabytes. If you just look at Americans (300 million), it’s just 30 terabytes. Very feasible.
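A quick sanity check on that arithmetic (a minimal sketch; the 100,000 bytes per person is the assumption above):

    # Back-of-envelope check of the storage estimate.
    BYTES_PER_PERSON = 100_000  # the figure assumed above

    def profile_storage_tb(population: int) -> float:
        """Terabytes needed to store one profile per person (1 TB = 10^12 bytes)."""
        return population * BYTES_PER_PERSON / 1e12

    print(f"World (7 billion): {profile_storage_tb(7_000_000_000):.0f} TB")  # 700 TB
    print(f"USA (300 million): {profile_storage_tb(300_000_000):.0f} TB")    # 30 TB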
What are the implications going to be when storing and accessing such vast personal information is much easier than it is now? What are governments and corporations going to do with it?
Daniel says:
@Mugizi I guess we’re going to find out.
Mike says:
@Daniel, you know that the “processing it quickly” part is the rub. The fact is, processors and memory bandwidth are not getting faster at the same pace that memory is growing. So memory will become free, and all our money will go into backplanes, CPUs and support chips, and power supplies.
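To put rough numbers on the bandwidth point (a sketch, assuming an illustrative 100 GB/s of aggregate memory bandwidth, which is my number, not a measurement):

    # Time for a single pass over a large RAM pool at a fixed bandwidth.
    MEMORY_BANDWIDTH_GBPS = 100  # illustrative assumption

    def scan_time_s(memory_tb: float) -> float:
        """Seconds to read every byte of memory_tb terabytes once (1 TB = 1000 GB)."""
        return memory_tb * 1000 / MEMORY_BANDWIDTH_GBPS

    for tb in (1, 30, 700):
        print(f"{tb:>4} TB: {scan_time_s(tb):>7,.0f} s")
    # 700 TB at 100 GB/s is 7,000 s: nearly two hours just to touch each byte once.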
And nobody knows how to effectively use large numbers of cores for problems that aren’t trivially parallelizable, at least not without months or years of work on each individual problem.
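Amdahl’s law puts a ceiling on what the extra cores buy you; a minimal sketch (the serial fractions below are illustrative):

    # Amdahl's law: speedup on n cores when a fraction s of the work is serial.
    def amdahl_speedup(serial_fraction: float, cores: int) -> float:
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

    for s in (0.01, 0.05, 0.25):
        print(f"serial {s:.0%}: {amdahl_speedup(s, 1024):6.1f}x on 1024 cores")
    # Even 5% serial work caps 1024 cores at roughly a 20x speedup.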
Question: is the web a sapien-bound system?
Daniel says:
@Mike It does look like Google can keep indexing a good fraction of the Web and keep it in RAM. So some important part of the Web is sapien-bound(ed).
It would be very interesting to do a more scholarly analysis of what can be considered sapien-bound(ed)… with hard numbers and so on… Are you interested?
I share your concerns regarding our limitations, but we have many good years to go. Having 1024 cores per main CPU will certainly help… though you have to worry about heat and power.
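In the spirit of that more scholarly analysis, here is a back-of-envelope sketch of why human-typed text is sapien-bound (all inputs are rough assumptions of mine, not measurements):

    # Crude upper bound on the text humans can type in a year.
    PEOPLE = 7_000_000_000   # world population
    BYTES_PER_DAY = 10_000   # generous: roughly 2,000 typed words per person per day

    yearly_tb = PEOPLE * BYTES_PER_DAY * 365 / 1e12
    print(f"Upper bound on human-typed text: {yearly_tb:,.0f} TB/year")
    # About 25,550 TB/year: it grows with population, not with Moore's law,
    # which is why a sapien-bound corpus can eventually fit in RAM.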