orangeIV
Hi, I'm currently trying to work through a problem about calculating the most likely time for a hard disk to fail:
Hard disks fail with a probability per unit time \alpha(t) = \alpha_0 t, where \alpha_0 = 0.5\ \text{years}^{-2}.
I know that the answer is t_{modal} = \frac{1}{\sqrt{\alpha_0}}, but I am having trouble deriving it. Here is what I've done so far:
The probability distribution can be calculated as follows:
f(t) = \alpha(t)\, e^{-\int_0^t \alpha(t')\, dt'} = \alpha(t)\, e^{-\frac{1}{2} \alpha_0 t^2}
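(For context, I arrived at this form by treating \alpha(t) as a hazard rate: the disk has to survive with no failure up to time t and then fail in the next instant, so

f(t) = \alpha(t)\, P(\text{no failure before } t) = \alpha(t)\, e^{-\int_0^t \alpha(t')\, dt'}

I'm fairly sure this part is right.)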
The most likely time for the disk to fail will be when \frac{df}{dt} = 0. So when
0 = -{\alpha_0}^2 t^2 e^{-\frac{1}{2} \alpha_0 t^2}
This is where I get stuck. Is this the correct approach? Any ideas about how I might proceed? :)
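In case it's useful, I also did a quick numerical sanity check in Python (taking \alpha_0 = 0.5\ \text{years}^{-2} as above), and the peak of f(t) does land at 1/\sqrt{\alpha_0} \approx 1.41 years:

import numpy as np

alpha0 = 0.5                                    # assumed units: years^-2
t = np.linspace(0.0, 10.0, 100001)              # time grid in years
f = alpha0 * t * np.exp(-0.5 * alpha0 * t**2)   # failure-time density f(t)

t_peak = t[np.argmax(f)]                        # location of the maximum
print(t_peak, 1.0 / np.sqrt(alpha0))            # both come out ~1.414 years

So the quoted answer seems right; it's just the derivative step I can't get to work.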
Thanks