Hi All,
I am thinking about the issue of diminishing returns in linear regression. Can it be determined from the data itself, or only from the context? I was thinking of examples like grade vs. daily study hours, or winning (high-)jump height vs. year (winning heights have been increasing). In the first case, say the slope is 0.5 and the constant is 23, so that every hour studied adds (along the regression line) half a point to the grade. It seems clear that studying 18 hours a day would not add 9 points, i.e., we hit diminishing returns at some point. Still, can this diminishing return be deduced from the data itself, or only from common sense/context? A minimal sketch of the kind of data-driven check I have in mind is below.
Thanks.
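To make the question concrete, here is a toy sketch (Python/NumPy, with made-up numbers; the saturating "true" grade function is purely my assumption for illustration). It fits both a line and a quadratic to simulated hours-vs-grade data; a clearly negative quadratic coefficient, or a systematic bow in the linear fit's residuals, would be one way the data itself could flag diminishing returns.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: grades that saturate as daily study hours grow.
hours = rng.uniform(0, 12, size=200)
grade = 23 + 30 * (1 - np.exp(-0.25 * hours))   # assumed saturating relationship
grade += rng.normal(0, 2, size=200)             # noise

lin = np.polyfit(hours, grade, deg=1)    # [slope, intercept]
quad = np.polyfit(hours, grade, deg=2)   # [a, b, c] for a*x^2 + b*x + c

print("linear slope:          ", lin[0])
print("quadratic coefficient: ", quad[0])   # negative => returns flatten out

# Residuals of the linear fit: positive in the middle and negative at the
# ends (or vice versa) signals curvature the straight line misses.
resid = grade - np.polyval(lin, hours)
```

Of course, this only detects curvature inside the observed range of hours; if everyone in the sample studied 1-4 hours, no fit can tell us what happens at 18, which is where the context/common-sense part of my question comes in.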