Hi, I've got what should be a very easy simple linear regression problem, but I can't seem to get my head around it. Here it is:

So far I've been trying to substitute these values into a regression equation like this one:

Y = 5B + (-0.003)B^2

where "B" is my Beta1 value. I differentiate this and set it equal to zero to find a maximum; this comes out to B = 833.333, which is a nonsensical value (intuitively, I think the value should be approximately 50, but definitely nowhere near 833).

I can't see what I'm doing wrong, and my notes on this aren't very good, so any help you can give me is much appreciated. Thanks in advance.
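To rule out an algebra slip on my part, here's a quick numeric check of the calculus (just a sketch using the coefficients 5 and -0.003 from my equation above):

```python
# Y = 5*B + (-0.003)*B**2 is a downward-opening parabola.
# dY/dB = 5 - 0.006*B; setting the derivative to zero gives B = 5 / 0.006.
a = -0.003          # coefficient on B**2
b = 5.0             # coefficient on B
B_star = -b / (2 * a)   # vertex of the parabola (the maximizer)
print(B_star)           # 833.333...
```

So the 833.333 really does follow from that equation, which makes me think the problem is in how I set up the equation in the first place, not in the differentiation.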