Weighted Least Squares for coefficients

In summary: the poster has an ordinary least-squares setup y = Ac and is looking for a way to put weights on the coefficient vector itself, emphasizing some entries of c more than others with a limited set of N measurements; responders ask for an example of a problem that would actually necessitate weighting coefficients.
  • #1
divB
Hi,

I have an ordinary least squares setup y = Ac where A is an NxM (N>>M) matrix, c the unknown coefficients and y the measurements.

Now WEIGHTED least squares allows you to weight the MEASUREMENTS if, for example, some measurements are more important or have a lower variance.

However, I am looking for a way of putting weights on the coefficient vector. Pictorially speaking, some values in my coefficient vector c are more important than others and I would like to emphasize some more than others using my limited set of N measurements.

Just adding a diagonal weighting matrix w as follows does not work:

[tex]
\min_c \| y - Awc \|_2
[/tex]
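One way to see why: if the diagonal matrix w is invertible, substituting c' = wc turns the problem back into ordinary least squares in c', so the fitted values are unchanged and the estimated coefficients are only rescaled by w⁻¹. A minimal NumPy sketch with hypothetical data and weights:

[code]
import numpy as np

rng = np.random.default_rng(0)
N, M = 200, 5
A = rng.standard_normal((N, M))
y = A @ rng.standard_normal(M) + 0.1 * rng.standard_normal(N)

# Ordinary least squares
c_ols, *_ = np.linalg.lstsq(A, y, rcond=None)

# "Coefficient weighting" attempt: minimize ||y - A w c|| with a diagonal w
w = np.diag([10.0, 1.0, 1.0, 1.0, 0.1])                # hypothetical emphasis weights
c_w, *_ = np.linalg.lstsq(A @ w, y, rcond=None)

# The fit is identical and c_w is just w^{-1} c_ols: the diagonal matrix
# only rescales the solution, it does not emphasize anything.
print(np.allclose(A @ w @ c_w, A @ c_ols))             # True
print(np.allclose(c_w, np.linalg.solve(w, c_ols)))     # True
[/code]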
 
  • #2
How can some unknown coefficients be "more important than others"? I don't think that is a meaningful concept.
 
  • #3
But there are cases where it makes sense; not everything is a linear system. In my case I clearly see that perturbing coefficients with the same amount of noise gives different results depending on which one I perturb, so some are more important than others.

It's difficult to explain but I tried to explain the setup already some time ago in a different context (https://www.physicsforums.com/showpost.php?p=4533702&postcount=7 ff.).
 
  • #4
divB said:
Not everything is a linear system.
You need to explain why you are asking about the system of linear equations given by y = Ac. I suggest you explain the problem you are trying to solve.
 
  • #5
Ok, it is a Volterra series, so it is linear in its coefficients but a non-linear system. I don't think that matters here, though. Anyway, since for a practical system the coefficients decay very fast, the lower-order terms dominate the total error. But if you are interested in certain non-linear behavior, you want to put more emphasis on those terms.
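For context, here is a hypothetical second-order example of how such a system stays linear in its coefficients (the actual model and kernel orders from the thread are not specified):

[code]
import numpy as np

def volterra_design_matrix(x, memory=2):
    """Regression matrix for a truncated second-order Volterra series.

    Columns are the delayed inputs x[n-i] and the products x[n-i]*x[n-j];
    the output is non-linear in x but linear in the coefficients, so the
    model is still y = A c and can be fitted by least squares.
    """
    cols = []
    for i in range(memory):                      # first-order kernel taps
        cols.append(np.roll(x, i))
    for i in range(memory):                      # second-order kernel taps
        for j in range(i, memory):
            cols.append(np.roll(x, i) * np.roll(x, j))
    A = np.column_stack(cols)
    return A[memory:]                            # drop wrap-around samples

x = np.random.default_rng(1).standard_normal(500)
A = volterra_design_matrix(x, memory=2)
print(A.shape)                                   # (498, 5): 2 linear + 3 quadratic columns
[/code]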
 
  • #6
divB said:
In my case I clearly see that perturbing coefficients with the same noise gives different results, depending on which I perturb. So some are more important than others.
So you are talking about correlations?
Sure, calculate the correlation matrix for your parameters.
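A minimal sketch (hypothetical data) of what that looks like for a linear least-squares fit y = Ac: the covariance of the estimated coefficients is σ²(AᵀA)⁻¹, and normalizing it gives the parameter correlation matrix.

[code]
import numpy as np

rng = np.random.default_rng(2)
N, M = 200, 4
A = rng.standard_normal((N, M))
y = A @ rng.standard_normal(M) + 0.5 * rng.standard_normal(N)

c_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - A @ c_hat
sigma2 = residuals @ residuals / (N - M)      # estimated noise variance

cov = sigma2 * np.linalg.inv(A.T @ A)         # covariance of the coefficient estimates
d = np.sqrt(np.diag(cov))
corr = cov / np.outer(d, d)                   # parameter correlation matrix
print(np.round(corr, 2))
[/code]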
 
  • #7
Am I? I am not sure ... at least I do not see how.

Can you provide me with an idea how this relates to the described setup and how to obtain the correlation matrix then?

The one thing I know is that for my application, some coefficients are more important to me - I do not know how they are correlated with each other! Therefore I only know the structure, but generally I do not know their statistics (although it could be useful to leverage this too).

Independently of that, how would I add this information when solving for the coefficients? Does it still work with least squares? Are you referring to MAP estimation?
 
  • #8
It's hard to understand the problem with pieces of information scattered across multiple posts like this.

some coefficients are more important for me
The fit does not care (and does not have to care) which parameters are more interesting to you. It gives you the parameter values that describe the data best, together with their uncertainties and correlations.
 
  • #9
The fit does not care (and does not have to care) which parameters are more interesting to you.

And how could I make it care?
 
  • #10
The fit must not and cannot care about that. This is then your interpretation of the fit result.
 

1. What is Weighted Least Squares (WLS) for coefficients?

Weighted Least Squares is a statistical method used to estimate the coefficients of a linear regression model. It is a modification of the Ordinary Least Squares (OLS) method, where each data point is given a weight based on its reliability or importance in the model.
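A minimal sketch of the estimator (assuming NumPy and a diagonal weight matrix): the weighted normal equations are (AᵀWA)c = AᵀWy.

[code]
import numpy as np

def wls(A, y, weights):
    """Weighted least squares: minimize sum_i w_i * (y_i - (A @ c)_i)**2.

    Solves the normal equations (A^T W A) c = A^T W y with W = diag(weights).
    """
    Aw = A * weights[:, None]                 # scale each row of A by its weight
    return np.linalg.solve(A.T @ Aw, Aw.T @ y)

# Hypothetical example: the first half of the measurements is much less noisy,
# so it gets correspondingly larger (inverse-variance) weights.
rng = np.random.default_rng(3)
A = rng.standard_normal((100, 3))
sigma = np.where(np.arange(100) < 50, 0.1, 1.0)
y = A @ np.array([1.0, -2.0, 0.5]) + sigma * rng.standard_normal(100)
print(wls(A, y, weights=1.0 / sigma**2))
[/code]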

2. When should WLS be used instead of OLS?

WLS should be used when the data has unequal variances or when there is heteroscedasticity present. This means that the variability of the data points is not constant, and WLS can provide more accurate estimates of the coefficients by giving more weight to the more reliable data points.

3. How are the weights determined in WLS?

The weights in WLS are typically determined by the inverse of the variance of each data point. This means that data points with higher variance (less reliable) will have lower weights, while data points with lower variance (more reliable) will have higher weights. Other methods for determining weights may also be used, such as expert judgement or empirical approaches.
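A small illustration (hypothetical standard deviations) of the inverse-variance rule:

[code]
import numpy as np

# Each measurement's standard deviation is assumed known; the WLS weight
# is the inverse variance, so the precise points dominate the fit.
sigma = np.array([0.1, 0.1, 0.5, 1.0, 2.0])
weights = 1.0 / sigma**2
print(weights)    # [100. 100.   4.   1.   0.25]
[/code]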

4. What is the difference between WLS and other regression methods like Ridge or Lasso?

WLS is a method used for estimating coefficients in a linear regression model, while Ridge and Lasso are methods used for dealing with overfitting in regression models. WLS takes into account the variability of the data points, while Ridge and Lasso aim to reduce the complexity of the model by penalizing large coefficients. These methods can also be used in conjunction with each other.
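As a rough side-by-side (hypothetical setup), ridge regression adds a penalty on the coefficients themselves rather than reweighting the measurements:

[code]
import numpy as np

def ridge(A, y, lam):
    """Ridge regression: minimize ||y - A @ c||**2 + lam * ||c||**2.

    Closed form (A^T A + lam * I)^{-1} A^T y: the penalty shrinks all
    coefficients toward zero, unlike WLS, which reweights the residuals.
    """
    M = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(M), A.T @ y)

rng = np.random.default_rng(4)
A = rng.standard_normal((50, 5))
y = A @ np.array([1.0, 0.0, -0.5, 2.0, 0.0]) + 0.2 * rng.standard_normal(50)
print(ridge(A, y, lam=1.0))
[/code]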

5. Are there any assumptions associated with WLS?

Yes, WLS assumes that the errors in the data are normally distributed and that the variance of each data point is inversely proportional to its weight. It also assumes that the weights are known and accurately reflect the reliability of the data points. Violations of these assumptions can affect the accuracy of the coefficient estimates.
