enfield
y_i=A{x_i}^b
When I solve for A two different ways I get different answers, so somewhere I'm doing something wrong. If someone could point out where, I would be grateful :).
Using logs:
y_i=A{x_i}^b
ln(y_i)=ln(A)+b*ln(x_i)
ln(y_i)-(ln(A)+b*ln(x_i))=r_i
For least squares we want to minimize S=\sum_i^{n}{r_i}^2, which means the gradient has to be zero. I only care about finding A right now, so I only have to deal with the partial with respect to A (note \frac{\partial r_i}{\partial A}=-\frac{1}{A}):
\frac{\partial S}{\partial A}=\frac{\partial}{\partial A}\sum_i^{n}{r_i}^2 = 2\sum_i^{n}{r_i}\frac{\partial r_i}{\partial A}= -\frac{2}{A}\sum_i^{n}(ln(y_i)-ln(A)-b*ln(x_i))=0
Since -\frac{2}{A} is never zero, the sum itself has to be zero, so:
\sum_i^{n}(ln(y_i)-ln(A)-b*ln(x_i))=0
\sum_i^{n}ln(y_i)-n*ln(A)-b\sum_i^{n}ln(x_i)=0
ln(A)=\frac{\sum_i^{n}ln(y_i)-b\sum_i^{n}ln(x_i)}{n}
(this is the derivation that I think is correct).
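A quick numerical sanity check of this closed-form estimate (just a sketch with made-up data; the true A=3 and b=2 are arbitrary choices, not from the problem):

```python
import numpy as np

# Made-up data from y = A * x^b with multiplicative noise
rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 50)
A_true, b = 3.0, 2.0
y = A_true * x**b * np.exp(rng.normal(0.0, 0.05, x.size))

# ln(A) = (sum(ln y_i) - b * sum(ln x_i)) / n
ln_A = (np.log(y).sum() - b * np.log(x).sum()) / x.size
A_log = np.exp(ln_A)
print(A_log)  # recovers A_true up to the noise
```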
But when I solve for A without taking the logs of each side first, I get something else:
y_i - A{x_i}^b = r_i
\frac{\partial S}{\partial A}=\frac{\partial}{\partial A}\sum_i^{n}{r_i}^2 = 2\sum_i^{n}{r_i}\frac{\partial r_i}{\partial A}= 2\sum_i^{n}(y_i - A{x_i}^b)(-{x_i}^b)=0
\sum_i^{n}(-{x_i}^b{y_i} + A{x_i}^{2b})=0
-\sum_i^{n}{x_i}^b{y_i}+A\sum_i^{n}{x_i}^{2b}=0
A=\frac{\sum_i^{n}{x_i}^b{y_i}}{\sum_i^{n}{x_i}^{2b}}
And if you take the ln of it to compare it with what we got before, you get:
ln(A)= ln(\frac{\sum_i^{n}{x_i}^b{y_i}}{\sum_i^{n}{x_i}^{2b}}) =ln(\sum_i^{n}{x_i}^b{y_i})-ln(\sum_i^{n}{x_i}^{2b})
Which is not the same as:
ln(A)=\frac{\sum_i^{n}ln(y_i)-b\sum_i^{n}ln(x_i)}{n}
as far as I can tell...
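The two formulas can also be compared numerically (again a sketch with made-up data, not from the problem). Each one minimizes its own sum of squares: the first the squared residuals in ln(y), the second the squared residuals in y itself, so in general they give different values of A:

```python
import numpy as np

# Made-up data from y = 3 * x^2 with multiplicative noise
rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 50)
b = 2.0
y = 3.0 * x**b * np.exp(rng.normal(0.0, 0.3, x.size))

# Log-space estimate: ln(A) = (sum(ln y) - b * sum(ln x)) / n
A_log = np.exp((np.log(y).sum() - b * np.log(x).sum()) / x.size)
# Direct estimate: A = sum(x^b * y) / sum(x^(2b))
A_lin = (x**b * y).sum() / (x ** (2 * b)).sum()

def S_log(A):  # objective minimized by the log derivation
    return ((np.log(y) - np.log(A) - b * np.log(x)) ** 2).sum()

def S_lin(A):  # objective minimized by the no-log derivation
    return ((y - A * x**b) ** 2).sum()

print(A_log, A_lin)                   # generally not equal
assert S_log(A_log) <= S_log(A_lin)   # A_log wins on its own objective
assert S_lin(A_lin) <= S_lin(A_log)   # A_lin wins on its own objective
```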