It's been a long time since I had to open up R and do some calculations, so my statistical modeling is a little rusty. But I was just watching a review of the new Silent Hill HD Collection, which is ostensibly an HD remake of two older Silent Hill games. Of course this is but one of many HD remakes made to capitalize on the lack of backwards compatibility. The reviews of these HD releases seem to focus on, or at least never forget to mention, frame rate comparisons. This reviewer http://www.eurogamer.net/articles/digitalfoundry-what-went-wrong-with-silent-hill-hd even has a cool little graph of FPS over time.

I don't play video games often enough to really know how the frame rate translates into the experience of the game, assuming it doesn't drop too low, obviously. But the FPS comparison seems to be pretty important to some people. So I just started wondering how you would statistically test the frame rates. All I can really think of is a basic parametric comparison of means, assuming FPS is a random variable with, say, a normal distribution (X being the rv for the HD version and Y the original version) at the usual level of significance, so that your test statistic is the usual t-distributed

t* = ((X̄ − Ȳ) − (μX − μY)) / (Sp · √(1/nX + 1/nY))

where X̄ and Ȳ are the sample means, nX and nY the sample sizes, and Sp is the pooled sample standard deviation. But I don't know, I'm not very good at math. Maybe I'll just have to take a look at my old stats book now to refresh my horrid memory.
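Just to convince myself the arithmetic works, here's a quick sketch of that pooled two-sample t statistic. The post mentions R, but this is a stdlib-only Python version so there's nothing to install; the FPS samples are made-up numbers for illustration, not real measurements from the review.

```python
import math

def pooled_t_stat(x, y):
    """Two-sample pooled t statistic for H0: mu_X == mu_Y.

    Assumes both samples are independent draws from normal
    distributions with equal variance.
    """
    nx, ny = len(x), len(y)
    mx = sum(x) / nx
    my = sum(y) / ny
    # pooled sample variance, with nx + ny - 2 degrees of freedom
    sp2 = (sum((v - mx) ** 2 for v in x)
           + sum((v - my) ** 2 for v in y)) / (nx + ny - 2)
    sp = math.sqrt(sp2)
    return (mx - my) / (sp * math.sqrt(1 / nx + 1 / ny))

# hypothetical per-second FPS samples (invented for this example)
hd = [28, 25, 30, 22, 27, 24, 26, 23]
original = [30, 29, 31, 30, 28, 30, 29, 31]
t = pooled_t_stat(hd, original)
# compare |t| against the t critical value with len(hd) + len(original) - 2 df
```

Of course this assumes the per-second FPS readings are independent, which a frame rate trace over time probably isn't, so take it as a back-of-the-envelope check rather than a proper test.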