Reminds me of a program in Homestuck. It’s code that loops until the author/universe dies, then executes some unknown code. The language is ~ATH, or TilDeath.
Not necessarily. I don’t have the numbers in front of me, but there is actually a probability threshold past which something is so unlikely that you can consider it to be impossible (i.e. it will never happen within the lifetime of the universe)
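Rough numbers, just to illustrate (the flip rate and the model here are order-of-magnitude assumptions for scale, not measurements):

```python
import math

FLIPS_PER_BIT_PER_SEC = 1e-13      # assumed order of magnitude for DRAM bit flips
N = 100                            # array of 100 distinct 64-bit values
ARRAY_BITS = 64 * N

# Crude model: treat each flip as redealing the whole array at random.
# A random arrangement of N distinct values is sorted 1 time in N!.
p_sorted = 1 / math.factorial(N)

flips_per_sec = FLIPS_PER_BIT_PER_SEC * ARRAY_BITS
expected_wait = 1 / (flips_per_sec * p_sorted)   # seconds until a lucky deal

AGE_OF_UNIVERSE = 4.35e17          # seconds, roughly 13.8 billion years
print(f"{expected_wait:.1e} s ≈ {expected_wait / AGE_OF_UNIVERSE:.1e} universe lifetimes")
```

That prints on the order of 10^149 universe lifetimes, which is the regime where “the chance is never zero” stops mattering.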
I’m not sure there’s any guarantee that it will ever be sorted, since bit flips are random and just as likely to put the array further out of order as closer to in order (see the toy simulation below). Plus, if there’s any error correction going on, it can cancel out bit flips entirely up to a certain threshold.
Though I’m not sure if ECC (and other methods) write the corrected value back to memory or just correct the signals going to the core, so it’s possible they could still add up over time and overcome the second objection.
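A quick toy model of that first point (just random flips on a Python list, nothing like real DRAM): start from a fully sorted array, flip random bits, and track what fraction of adjacent pairs is still in order.

```python
import random

def sortedness(xs):
    """Fraction of adjacent pairs in order: 1.0 = sorted, ~0.5 = random."""
    return sum(a <= b for a, b in zip(xs, xs[1:])) / (len(xs) - 1)

random.seed(0)
words = list(range(16))                    # 16 "64-bit words", fully sorted
print(f"start: sortedness = {sortedness(words):.2f}")

for step in range(1, 100_001):
    i = random.randrange(len(words))       # pick a word...
    words[i] ^= 1 << random.randrange(64)  # ...and flip one of its 64 bits
    if step % 25_000 == 0:
        print(f"after {step:>6} flips: sortedness = {sortedness(words):.2f}")
```

The score falls to around 0.5 almost immediately and then just random-walks there; nothing pulls it back toward 1.0.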
> ECC (and other methods) write the corrected value back to memory
That was my understanding (it corrects the error and writes the good value back to RAM), but now I’m not so sure! I imagine it must do that, otherwise a second bit flip would actually corrupt the RAM, and the RAM manufacturer would want to reduce that risk.
Regular ECC adds one extra bit for each byte of width (72 bits stored for every 64 bits of data). Together those check bits form a SECDED code over each 64-bit word: it can correct an error in one bit, and detect but not correct an error in two bits, so they wouldn’t want a one-bit error to linger for longer than it needs to.
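For the curious, here’s a toy SECDED (single-error-correct, double-error-detect) code over a single byte, illustrating that correct-one/detect-two behaviour. Real ECC hardware runs the same idea over 64-bit words; this is a sketch of the principle, not any particular memory controller’s implementation.

```python
DATA_POS = (3, 5, 6, 7, 9, 10, 11, 12)  # non-power-of-two positions hold data
CHECK_POS = (1, 2, 4, 8)                # power-of-two positions hold check bits

def parity(bits):
    acc = 0
    for b in bits:
        acc ^= b
    return acc

def encode(byte):
    """8 data bits -> 13-bit SECDED codeword; code[0] is the overall parity bit."""
    code = [0] * 13
    for i, pos in enumerate(DATA_POS):
        code[pos] = (byte >> i) & 1
    for p in CHECK_POS:   # Hamming check bit p covers every position with bit p set
        code[p] = parity(code[pos] for pos in range(1, 13) if pos & p)
    code[0] = parity(code[1:])  # extra parity bit makes double errors *detectable*
    return code

def decode(code):
    """Correct any single-bit error; flag (but don't correct) double-bit errors."""
    code = code[:]
    syndrome = 0
    for p in CHECK_POS:
        if parity(code[pos] for pos in range(1, 13) if pos & p):
            syndrome |= p            # the syndrome spells out the flipped position
    overall_ok = parity(code) == 0
    if syndrome and overall_ok:      # two flips: syndrome fires but total parity is even
        return "double error detected", None
    if not overall_ok:               # one flip: fix position `syndrome` (0 = parity bit itself)
        code[syndrome] ^= 1
    byte = sum(code[pos] << i for i, pos in enumerate(DATA_POS))
    return ("corrected" if not overall_ok else "clean"), byte

word = encode(0b10110010)
word[6] ^= 1                # one cosmic-ray flip
print(decode(word))         # ('corrected', 178)
word[9] ^= 1                # a second flip in the same word
print(decode(word))         # ('double error detected', None)
```

An ECC DIMM applies the same scheme over 64 data bits with 8 check bits, which is where the one-extra-bit-per-byte of module width comes from.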
The most beautiful thing about this program is that it would work.
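For context, the program under discussion is usually rendered something like this (my sketch of the meme, not anyone’s actual code): check whether the array is sorted, and if not, do nothing and let the cosmic rays work.

```python
import time

def is_sorted(xs):
    return all(a <= b for a, b in zip(xs, xs[1:]))

def miracle_sort(xs):
    """'Sort' by waiting for stray bit flips to rearrange memory in place."""
    while not is_sorted(xs):
        time.sleep(1)   # give the cosmic rays a moment
    return xs
```

(In a real process a stray flip is far more likely to hit a pointer or an instruction than the array itself, which is roughly where this thread ends up below.)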
Given enough bit flips, the numbers will eventually all end up in the correct order. No guarantee they’ll still be the same numbers, though…
Those bit flips are probably more likely to make the program erroneously skip that section than to actually sort the array.
Fair enough! But won’t they flip again and restart the program?
The OS would crash entirely before that happens
Yet… The chance is never zero 😁
screw the universe we be flippin’ 😎🏄♀️