Can a computer retain information for as long as its user desires?

AI Thread Summary
A computer can retain information as long as its user desires, but storage media have finite lifespans, often estimated to be around 100 years. Predictions about data retention are based on statistical models that consider the physical properties of the storage medium, rather than long-term testing. The failure rates of storage devices are calculated using complex statistics, which account for variables like age and usage. High-quality manufacturing processes ensure low failure rates, but devices may still have inherent defects that affect longevity. Ultimately, while data can be stored for extended periods, it is not guaranteed to last indefinitely.
jackson6612
Hi

I have read that a computer can retain any information for as long as its user desires, unless it's broken. On many hard disks and USB flash drives I have seen a stated duration for how long the drive can safely retain data, let's say 100 years. What does that mean? To me, that would simply mean a computer cannot store information forever. Please guide me. Thanks.
 
jackson6612,

The first thing to understand is the concept of "forever". "Forever" implies "all of time", from now until the end of time. While we can roughly date the beginning of cosmological time to 13.75 billion years ago, predicting the end of time is a far more debated topic. It is safe to assume that no man-made device is older than humanity, and that all computer storage media are man-made. So what we are left with is man-made storage media that are all less than 100 years old; most are less than 40 years old.

The scientists and engineers who develop computer storage media obviously do not have time to test the longevity of the media by direct observation (that is, it is not practical to develop an idea, build it, and then wait XXX years to test its longevity), so they use science to predict the life span of some percentage of a batch based on their understanding of its physical properties.

This notion is similar to predicting the average age of a population at some point in the future based on the current population and age distribution. The mathematical model uses statistics guided by known factors to generate a prediction. If the model does not take into account a pandemic, and a pandemic occurs, the prediction could be way off. The further into the future the prediction attempts to calculate, the less accurate the prediction is likely to be.

With computer storage, the longevity predictions are a bit more straightforward, but there is still a lot of room for error. Assume an 8GB flash drive contains 9 * 2^33 (roughly 77 billion) cells for storage alone, plus a fair number of additional cells for addressing, error checking and interfacing, and that most cells share common characteristics with the other cells on the same die. If one cell fails in 1 year of use, the failure rate for the cells is about 1 in 100 billion per year. If you produce 50 million dies using this architecture and 10 dies exhibit a cell failure in one year, the per-die failure rate is 10/50M per year, so the chance of any particular die failing in 1 year is 1 in 5 million. An acceptable cumulative failure rate might be 0.1%, which for our 50 million units is 50M * 0.001 = 50,000 units. At a continued 10 failures per year, reaching 50,000 failed units would take 5,000 years, so you could quote a lifetime of 5,000 years at a 0.1% failure rate.
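
If it helps, here is that back-of-the-envelope arithmetic as a minimal Python sketch. The cell count, production volume, observed failure count and 0.1% threshold are all the assumed figures from above, not real vendor data.

Python:
# Back-of-the-envelope flash lifetime estimate (illustrative numbers only).

CELLS_PER_DIE = 9 * 2**33        # ~77 billion cells on an assumed 8GB die
DIES_PRODUCED = 50_000_000       # assumed production run
FAILURES_PER_YEAR = 10           # assumed observed die failures per year
ACCEPTABLE_FAILURE_RATE = 0.001  # 0.1% cumulative failures deemed acceptable

# Per-die failure rate: 10 failed dies out of 50 million per year.
die_failure_rate = FAILURES_PER_YEAR / DIES_PRODUCED   # 1 in 5 million

# Units allowed to fail before exceeding the acceptable 0.1% threshold,
# and the years needed to get there at a constant 10 failures per year.
allowed_failures = DIES_PRODUCED * ACCEPTABLE_FAILURE_RATE   # 50,000 units
projected_lifetime_years = allowed_failures / FAILURES_PER_YEAR

print(f"Die failure rate: 1 in {1 / die_failure_rate:,.0f} per year")
print(f"Projected lifetime: {projected_lifetime_years:,.0f} years at 0.1% failures")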

The actual statistics for projecting the useful life of a component are more complex and take into account increased failures with age, use and other variables, but that should give you some idea of how the process works. The silicon chip manufacturing process has incredibly high quality control; few other industries achieve such a low failure level per unit. The fact that a single chip may contain billions of cells that all function exactly as they are supposed to is a testament to engineering at its best.
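
For a flavor of what "more complex" looks like, here is a hedged sketch using a Weibull reliability model, one common way (not the only one) to capture failure rates that rise with age. The shape and scale parameters are invented purely for illustration.

Python:
import math

# Weibull reliability model: a common way to capture failure rates that
# change with age. Parameters are illustrative, not from any vendor.
SHAPE = 1.5    # shape > 1 means the failure rate increases with age (wear-out)
SCALE = 120.0  # "characteristic life" in years; 63.2% have failed by this age

def reliability(t_years: float) -> float:
    """Fraction of devices still working after t_years."""
    return math.exp(-((t_years / SCALE) ** SHAPE))

def hazard(t_years: float) -> float:
    """Instantaneous failure rate at age t_years (per year)."""
    return (SHAPE / SCALE) * (t_years / SCALE) ** (SHAPE - 1)

for t in (1, 10, 50, 100):
    print(f"age {t:>3} y: {reliability(t):.4%} surviving, "
          f"hazard {hazard(t):.6f} per year")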

To further improve the life span of things like flash drives and magnetic drives, additional internal circuitry is employed to detect failures and simply avoid those memory blocks. To lower costs, "new" memory devices frequently have "bad blocks" or "bad sectors" straight from the factory but are sold as "first quality" if the number of bad sectors/blocks is less than a specified percentage of the total capacity. The price for "perfect" storage media is considerably higher than that of "first quality" media. Military, aerospace, medical and other government-spec devices typically cost orders of magnitude more than consumer-quality devices because they require strict testing and documentation of each device.
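
Here is a toy sketch of that remapping idea: the controller keeps a table of retired blocks and transparently substitutes spares. The class name, block counts and layout are invented for illustration; real drive firmware is far more involved (and copies data out of a failing block before retiring it).

Python:
# Toy sketch of bad-block remapping: a table of known-bad blocks is
# consulted on every access and spares are silently substituted.

class RemappingStore:
    def __init__(self, total_blocks: int, spare_blocks: int):
        self.data = {}    # block number -> payload
        self.remap = {}   # bad block -> spare block
        self.spares = list(range(total_blocks, total_blocks + spare_blocks))

    def mark_bad(self, block: int) -> None:
        """Retire a failing block by pointing it at a fresh spare."""
        if block not in self.remap and self.spares:
            self.remap[block] = self.spares.pop(0)

    def _resolve(self, block: int) -> int:
        return self.remap.get(block, block)

    def write(self, block: int, payload: bytes) -> None:
        self.data[self._resolve(block)] = payload

    def read(self, block: int) -> bytes:
        return self.data.get(self._resolve(block), b"")

store = RemappingStore(total_blocks=1024, spare_blocks=16)
store.write(7, b"hello")
store.mark_bad(7)           # controller detects block 7 failing...
store.write(7, b"hello")    # ...subsequent writes land on a spare block
print(store.read(7))        # b'hello'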

Anyway, hope that helps a bit.

Fish
 