
Determine optimal vectors for least squares

  May 16, 2012 #1
    Hello all,

    I have a set of measurements that I want to fit to a linear model with a number of parameters. I do this as [itex]\theta=(H^TH)^{-1}H^Tx[/itex], where θ is the parameter vector and x is the vector of measurements. The problem is that I'd like to reduce the number of parameters in the fit: I want to choose the subset of N parameters that gives the best fit, such that no other combination of N parameters does better.
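
    For concreteness, here is a minimal numpy sketch of that fit (the dimensions are made up for illustration):

    [code]
    import numpy as np

    # Illustrative sizes only: 100 measurements, 25 candidate parameters.
    rng = np.random.default_rng(0)
    H = rng.normal(size=(100, 25))   # model matrix, one column per parameter
    x = rng.normal(size=100)         # measurement vector

    # theta = (H^T H)^{-1} H^T x, solved without forming the explicit inverse.
    theta = np.linalg.solve(H.T @ H, H.T @ x)

    # np.linalg.lstsq does the same job and is numerically safer in practice.
    theta_alt, *_ = np.linalg.lstsq(H, x, rcond=None)
    [/code]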

    Is there any way to determine the best subset of N parameters without trying every combination? I have seen that order-recursive least squares lets me add parameters sequentially to improve the fit, but that approach does not guarantee that the N parameters I end up with are the best combination.

    Thank you very much for any help,
     
  May 16, 2012 #2

    chiro

    Science Advisor

    Hey Pete99.

    If you want to choose the best fit for, say, N parameters, where N is at least one but smaller than the total number of parameters, then it sounds like you need to either project your inputs down to an appropriate sub-space and do least squares there, or do the full least squares and then project the calculated parameters down to a reduced form.

    In other words, this boils down to taking your vector x and projecting it down to some sub-space, in the same way we project an arbitrary point in R^3 onto a two-dimensional plane.
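
    As a quick numpy illustration of that kind of projection (the vectors here are arbitrary examples, not from your problem):

    [code]
    import numpy as np

    # Orthogonal projector onto the plane spanned by the columns of A:
    # P = A (A^T A)^{-1} A^T.
    a = np.array([1.0, 0.0, 0.0])
    b = np.array([0.0, 1.0, 0.0])
    A = np.column_stack([a, b])

    P = A @ np.linalg.inv(A.T @ A) @ A.T

    p = np.array([2.0, 3.0, 4.0])
    print(P @ p)   # [2. 3. 0.] -- the point's shadow on the plane
    [/code]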

    The nitty-gritty you will have to figure out is the actual projection itself, and this will depend on exactly what you call an 'optimal' configuration of parameters.

    I would start by doing the least squares and then projecting your parameters down to some sub-space, rather than projecting your vector down before you do the least squares.

    If you are trying to fit a linear model to data as you would in a statistical analysis, though (like a regression), I would not use this method but instead use what is called Principal Components Analysis (PCA).

    PCA is a very old, well-understood technique and comes as a feature or a library in many statistical applications. It works by creating a basis of uncorrelated variables, ordered from the basis vector that contributes the most variance down to the one that contributes the least.

    Thus if you want a model with N parameters, you pick the first N basis components of the PCA output and use those basis vectors as your regression model.
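
    A rough numpy sketch of that recipe (center, take the SVD, regress on the first N components; the data and N are arbitrary here):

    [code]
    import numpy as np

    rng = np.random.default_rng(1)
    H = rng.normal(size=(100, 25))   # illustrative data matrix
    x = rng.normal(size=100)
    N = 10

    Hc = H - H.mean(axis=0)          # center the columns before PCA
    U, s, Vt = np.linalg.svd(Hc, full_matrices=False)
    basis = Vt[:N].T                 # first N principal directions as columns

    Z = Hc @ basis                   # data expressed in the reduced basis
    coef, *_ = np.linalg.lstsq(Z, x, rcond=None)
    [/code]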

    I'd strongly recommend you consider PCA if you are trying to fit a multi-dimensional linear model: the calculation is very quick, fitting the linear model afterwards is just as quick, and you can see for yourself how good the fit is.

    In R it should take about thirty minutes to an hour if you are familiar with the language, and longer if you are not; if you already know the major packages you can probably just read the documentation.
     
  May 16, 2012 #3
    Thank you chiro for your detailed response.

    Sorry if the notation is not very rigorous, and correct me if anything I say is plainly wrong. I forgot to mention that the vectors (columns of H) that I want to use are already defined and have some physical meaning. As far as I know PCA does not let me keep those particular vectors, but the idea is exactly what I want to do.

    I would like to choose a subset of N columns from the matrix H (say [itex]H_N[/itex]) such that no other subset of N columns of H gives a better fit in the least-squares sense.

    In PCA I would get a set of orthonormal vectors ([itex]v_1,v_2,\dots[/itex]) that I can use for my fit. And since they are orthogonal, if [itex]H_1=v_1[/itex] is the best "single-vector" fit to my data, the best "two-vector" fit is [itex]H_2=[v_1,v_2][/itex], and so on.

    In my case, since the vectors in H are not orthogonal, even if the best "single vector" is [itex]H_1=h_1[/itex], there is no guarantee that the best pair contains [itex]h_1[/itex]; it could be any other two vectors (for instance [itex]H_2=[h_3,h_{24}][/itex]). My problem is that I don't know how to choose these vectors unless I try all possible combinations.
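
    To make it concrete, the brute force I'd like to avoid is roughly this (a hypothetical numpy sketch, not code I actually have):

    [code]
    import itertools
    import numpy as np

    def best_subset(H, x, N):
        """Fit every N-column subset of H and keep the one with the
        smallest residual sum of squares."""
        best_cols, best_rss = None, np.inf
        for cols in itertools.combinations(range(H.shape[1]), N):
            HN = H[:, list(cols)]
            theta, *_ = np.linalg.lstsq(HN, x, rcond=None)
            r = x - HN @ theta
            if r @ r < best_rss:
                best_cols, best_rss = cols, float(r @ r)
        return best_cols, best_rss
    [/code]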
     
  May 16, 2012 #4

    chiro

    Science Advisor

    So what is the criterion, exactly? Do you want to rank some variables over others in the selection process? For example, do you always want a model that captures a particular kind of variable even if it doesn't contribute much to the actual regression model?

    Also, you can take a variable out when you do the PCA, see what it produces, and then look at what has been calculated as part of the output components.

    Also, there are routines that search for the best set of N variables for a regression directly, in contrast to the PCA approach. You might want to look at things like the step() routine in R, which does a greedy stepwise search, and other similar routines.
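
    In numpy terms, a greedy forward search of that flavour would look something like this (a sketch of the idea only, not the actual step() algorithm, which selects by AIC rather than raw residuals):

    [code]
    import numpy as np

    def rss(H, x, cols):
        """Residual sum of squares of the least-squares fit on the given columns."""
        HN = H[:, cols]
        theta, *_ = np.linalg.lstsq(HN, x, rcond=None)
        r = x - HN @ theta
        return r @ r

    def forward_select(H, x, N):
        """Greedily add, one at a time, the column that lowers the RSS
        the most. Greedy, so it can miss the globally best subset."""
        chosen, remaining = [], list(range(H.shape[1]))
        for _ in range(N):
            best = min(remaining, key=lambda j: rss(H, x, chosen + [j]))
            chosen.append(best)
            remaining.remove(best)
        return chosen
    [/code]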
     
  May 16, 2012 #5
    Not exactly. I do want to use the variables that contribute the most to the actual fit, but I want to choose them from a set of parameters that have some physical meaning in my problem.

    Let's say that my model has three physical parameters that contribute to the output as [itex]x=[h_1\ h_2\ h_3][\theta_1\ \theta_2\ \theta_3]^T[/itex], where the h's are column vectors and the θ's are the parameters.

    Say that my measurement is the vector [itex]x=[1, 1, 0]^T[/itex]. And that the h vectors in my model are [itex]h_1=[1, 0, 0]^T[/itex], [itex]h_2=[0, 1, 0]^T[/itex], and [itex]h_3=[0.9, 0.9, 0.1]^T[/itex].

    If I can use only one of the three parameters, I would choose [itex]\theta_3[/itex], because the vector [itex]h_3[/itex] is very close to [itex]x[/itex]. This is easy to find, because I just have to try the three possibilities. However, if I can use two parameters, the best choice is [itex]\theta_1[/itex] and [itex]\theta_2[/itex], since the vector x lies in the plane spanned by [itex]h_1[/itex] and [itex]h_2[/itex].
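
    A quick numerical check of this example (illustrative numpy, not from my actual problem):

    [code]
    import numpy as np

    x  = np.array([1.0, 1.0, 0.0])
    h1 = np.array([1.0, 0.0, 0.0])
    h2 = np.array([0.0, 1.0, 0.0])
    h3 = np.array([0.9, 0.9, 0.1])

    def rss(*cols):
        """Residual sum of squares of the least-squares fit on the given vectors."""
        H = np.column_stack(cols)
        theta, *_ = np.linalg.lstsq(H, x, rcond=None)
        r = x - H @ theta
        return r @ r

    print(rss(h1), rss(h2), rss(h3))   # ~1.0, ~1.0, ~0.012: h3 is the best single vector
    print(rss(h1, h2))                 # ~0: but the pair {h1, h2} fits x exactly
    [/code]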

    In my problem I have ~25 parameters, and I would like to use no more than ~10 of them to fit the data (because of restrictions on the processing I have to do later). My problem is how to choose the subset of roughly ten parameters out of the 25 that provides the best fit to my data in the least-squares sense.

    I am not familiar with R, so I am not sure what step() does, but I will take a look to see if it can help me.
     