# Linear regression problem

1. Feb 6, 2009

### oleandora

1. The problem statement, all variables and given/known data

I've been given a set of data
x: 0    0.5   0.7   1.5   1.75
y: 0.5  0.72  0.51  1.5   1.63

Given the linear model y = ax + b for these data points, I have to

1. minimize the sum of the absolute values of the deviations between the experimental values of y and the values predicted by the linear relation;
2. minimize the maximum deviation between the experimental values of y and the values predicted by the linear relation.

The question is: what are the optimal values of a and b in each case, and what are the corresponding values of the objective function?
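Both objectives can be cast as small linear programs: for the sum-of-absolute-deviations fit, introduce one slack variable per residual; for the minimax fit, a single variable bounding every residual. A sketch using `scipy.optimize.linprog` (SciPy and NumPy are my own assumption, not part of the original post):

```python
import numpy as np
from scipy.optimize import linprog

x = np.array([0.0, 0.5, 0.7, 1.5, 1.75])
y = np.array([0.5, 0.72, 0.51, 1.5, 1.63])

def fit_l1(x, y):
    """Minimize sum_i |y_i - (a*x_i + b)| as an LP.

    Variables: [a, b, e_1..e_n]; minimize sum e_i subject to
    -e_i <= y_i - (a*x_i + b) <= e_i for every data point.
    """
    n = len(x)
    c = np.concatenate([[0.0, 0.0], np.ones(n)])
    # a*x_i + b - e_i <= y_i
    A1 = np.column_stack([x, np.ones(n), -np.eye(n)])
    # -(a*x_i + b) - e_i <= -y_i
    A2 = np.column_stack([-x, -np.ones(n), -np.eye(n)])
    res = linprog(c,
                  A_ub=np.vstack([A1, A2]),
                  b_ub=np.concatenate([y, -y]),
                  bounds=[(None, None), (None, None)] + [(0, None)] * n,
                  method="highs")
    return res.x[0], res.x[1], res.fun

def fit_minimax(x, y):
    """Minimize max_i |y_i - (a*x_i + b)| as an LP.

    Variables: [a, b, t]; minimize t subject to
    -t <= y_i - (a*x_i + b) <= t for every data point.
    """
    n = len(x)
    c = np.array([0.0, 0.0, 1.0])
    A1 = np.column_stack([x, np.ones(n), -np.ones(n)])
    A2 = np.column_stack([-x, -np.ones(n), -np.ones(n)])
    res = linprog(c,
                  A_ub=np.vstack([A1, A2]),
                  b_ub=np.concatenate([y, -y]),
                  bounds=[(None, None), (None, None), (0, None)],
                  method="highs")
    return res.x[0], res.x[1], res.fun

a1, b1, sum_abs = fit_l1(x, y)
am, bm, max_abs = fit_minimax(x, y)
```

Each fit is optimal only for its own objective, so the L1 fit's total absolute deviation is no larger than the minimax fit's, and vice versa for the maximum deviation.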

2. Relevant equations

3. The attempt at a solution
What I know is that y = ax + b + e, where e is the deviation, but that's it. I don't have the slightest idea what the next steps are.
I know it seems simple, but I have very little background in linear regression.
Help! T_T

2. Feb 7, 2009

### MaxManus

Hey, this is the first time I've tried to help someone on this forum, so don't shoot me if I am wrong. If we are at the same level, I guess you are supposed to use OLS (Ordinary Least Squares).

Since the model is y = ax + b, a is the slope and b is the intercept:

$$a = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_{i=1}^{n} (X_i - \bar{X})^2}$$

$$b = \bar{Y} - a\bar{X}$$

where the sums run from i = 1 to n, and $$\bar{X}$$, $$\bar{Y}$$ denote the averages of the $$X_i$$ and $$Y_i$$.
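These formulas are easy to check numerically on the data from the original post; a minimal pure-Python sketch (the names `slope` and `intercept` are my own, standing for a and b in y = ax + b):

```python
x = [0.0, 0.5, 0.7, 1.5, 1.75]
y = [0.5, 0.72, 0.51, 1.5, 1.63]
n = len(x)

x_bar = sum(x) / n  # average of the X_i
y_bar = sum(y) / n  # average of the Y_i

# numerator: sum (X_i - X_bar)(Y_i - Y_bar); denominator: sum (X_i - X_bar)^2
s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
s_xx = sum((xi - x_bar) ** 2 for xi in x)

slope = s_xy / s_xx                 # a
intercept = y_bar - slope * x_bar   # b
```

Note that OLS minimizes the sum of *squared* deviations, which is a third objective, distinct from the two (absolute-value and minimax) asked about in the problem.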