While reading through some operating-system technical recommendations, I came across this:
If the probability of an error is one in a hundred quadrillion, and if
the memory system is running at 10 MHz (100-nanosecond cycle time), and if you have
125 Megabytes of RAM (1 billion bits), then you would expect on average
to see one single-bit error every ten seconds and one double-bit error
every thousand quadrillion seconds (somewhat more than the age of the
universe). That is why ECC memory is worth using, and why it is
designed to detect but not correct double-bit errors.
I'm not sure this is correct, but it is rather interesting; a quick sanity check of the arithmetic is sketched below. The rest of the page is here:
http://www.ohio.edu/people/piccard/mis300/eccram.htm
(Also note the date at the bottom!)
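For what it's worth, the numbers do seem to hang together if you read the error probability as being per bit per memory cycle. Here is a minimal Python sketch of the arithmetic; the per-bit-per-cycle reading, and the model of a double-bit error as a single-bit error coinciding with an independent failure of one particular second bit, are my guesses at what the page intended, since it doesn't show its working.

    # Sanity check of the quoted ECC numbers (my reconstruction of the
    # page's model; the page itself doesn't show its working).

    p_bit = 1e-17     # assumed: error probability per bit per cycle
                      # ("one in a hundred quadrillion")
    cycle_hz = 10e6   # 10 MHz memory clock (100 ns cycle)
    bits = 1e9        # 125 megabytes of RAM = 1 billion bits

    # Expected single-bit errors per second: every bit has a chance
    # p_bit of flipping on each of the 1e7 cycles per second.
    single_rate = p_bit * bits * cycle_hz
    print(f"one single-bit error every {1 / single_rate:.0f} s")    # 10 s

    # Double-bit errors: the quoted 1e18-second figure falls out if a
    # double-bit error is modelled as a single-bit error (rate above)
    # coinciding with an independent failure of a second given bit.
    double_rate = single_rate * p_bit
    print(f"one double-bit error every {1 / double_rate:.1e} s")    # 1e18 s

    age_of_universe_s = 4.35e17   # ~13.8 billion years, in seconds
    print(f"age of the universe: ~{age_of_universe_s:.2e} s")

Under that reading the single-bit figure works out to exactly one error every ten seconds, and the double-bit figure to 10^18 seconds, which matches the quoted "thousand quadrillion seconds" and is roughly twice the age of the universe.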