
Linear regression and probability distribution

  1. Jul 2, 2017 #1
    I have some data that I want to do simple linear regression on.
    However I don't have a lot of datapoints, and would like to know the uncertainty in the parameters. I.e. the slope and the intercept of the linear regression model.

    I know it should be possible to get a probability distribution of the parameters using Bayesian linear regression, but I don't quite know how to use it and what assumptions have to be made. Can someone help me find reasonable error estimates for my model?
     
  3. Jul 2, 2017 #2

    MarneMath

    Education Advisor

    What's wrong with a confidence interval on the parameters?

    If you wish to use a Bayesian approach, then it's no different from how one normally approaches these problems: assume a prior, compute the posterior, and form a credible interval.
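
    For example, here's a minimal sketch of a conjugate Bayesian linear regression with a Gaussian prior and (for simplicity) a known noise variance; the data and prior scale are made up:

    Code (Text):

    import numpy as np

    # toy data: y = 2 + 0.5*x + noise
    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 20)
    y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=x.size)

    X = np.column_stack([np.ones_like(x), x])   # design matrix: intercept + slope
    sigma2 = 1.0     # assumed known noise variance
    tau2 = 100.0     # broad Gaussian prior: beta ~ N(0, tau2 * I)

    # conjugate posterior: beta | y ~ N(mu_n, S_n)
    S_n = np.linalg.inv(np.eye(2) / tau2 + X.T @ X / sigma2)
    mu_n = S_n @ (X.T @ y / sigma2)

    # 95% credible interval for each parameter
    sd = np.sqrt(np.diag(S_n))
    for name, m, s in zip(["intercept", "slope"], mu_n, sd):
        print(name, m - 1.96 * s, m + 1.96 * s)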
     
  4. Jul 2, 2017 #3

    WWGD

    Science Advisor
    Gold Member

    Like Marne said, most programs will report the margin of error and the distribution of both the intercept and the slope coefficient when you do a regression.
     
  5. Jul 3, 2017 #4
    Sounds like what I'm after. Any Python library that has this ability?
     
  6. Jul 3, 2017 #5

    MarneMath

    Education Advisor

    statsmodels should have it, but it's been a while since I've used it. Although I'll also say it's relatively trivial to figure it out on your own and write your own function to do it. I'd also imagine sklearn has this feature (if it doesn't, then wtf)
     
  7. Jul 3, 2017 #6
    Sklearn does not have it :( statsmodels does, but the API seems kind of lame.

    Think numpy.polyfit with cov=True will do the trick :cool:
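
    Something like this (a sketch with made-up data; the covariance matrix of the estimates comes back as the second return value):

    Code (Text):

    import numpy as np

    # made-up data for illustration
    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 20)
    y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=x.size)

    # degree-1 fit; cov=True also returns the covariance matrix of the estimates
    coef, cov = np.polyfit(x, y, 1, cov=True)   # coef = [slope, intercept]
    stderr = np.sqrt(np.diag(cov))              # standard errors of slope and intercept
    print(coef, stderr)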
     
  8. Jul 3, 2017 #7

    MarneMath

    Education Advisor

    I'm not sure that will actually do the trick, since all it returns is the standard error.

    Code (Text):

    import numpy as np
    import statsmodels.api as sm

    # simulate data: y = 1 + 0.1*x + 10*x^2 + noise
    nsample = 100
    x = np.linspace(0, 10, nsample)
    X = np.column_stack((x, x**2))
    beta = np.array([1, 0.1, 10])
    e = np.random.normal(size=nsample)

    X = sm.add_constant(X)   # prepend a column of ones for the intercept
    y = np.dot(X, beta) + e

    # fit ordinary least squares
    mod = sm.OLS(y, X)
    res = mod.fit()

    print(res.summary())
    print(res.conf_int(0.01))   # 99% confidence intervals for the coefficients
     
    Returns the following

    Code (Text):



                               OLS Regression Results                          
    ==============================================================================
    Dep. Variable:                      y   R-squared:                       1.000
    Model:                            OLS   Adj. R-squared:                  1.000
    Method:                 Least Squares   F-statistic:                 5.402e+06
    Date:                Mon, 03 Jul 2017   Prob (F-statistic):          1.70e-245
    Time:                        09:34:54   Log-Likelihood:                -131.77
    No. Observations:                 100   AIC:                             269.5
    Df Residuals:                      97   BIC:                             277.4
    Df Model:                           2                                        
    Covariance Type:            nonrobust                                        
    ==============================================================================
                    coef    std err          t      P>|t|      [0.025      0.975]
    ------------------------------------------------------------------------------
    const          0.9008      0.270      3.338      0.001       0.365       1.436
    x1             0.0988      0.125      0.792      0.430      -0.149       0.346
    x2             9.9997      0.012    828.551      0.000       9.976      10.024
    ==============================================================================
    Omnibus:                        0.170   Durbin-Watson:                   1.765
    Prob(Omnibus):                  0.918   Jarque-Bera (JB):                0.274
    Skew:                          -0.092   Prob(JB):                        0.872
    Kurtosis:                       2.821   Cond. No.                         144.
    ==============================================================================

    Warnings:
    [1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
    [[  0.19176405   1.60988065]
     [ -0.22893401   0.42648449]
     [  9.96797785  10.03139905]]
     
    Which, in my opinion, is much more helpful than just the standard errors.
     
  9. Jul 3, 2017 #8
    Ah, okay. But am I right that the two coefficients in my model have a joint multivariate normal distribution? So to get a 95% confidence interval, I suppose I have to integrate out the other coefficient that I am not looking at (and not naively just add/subtract 1.96 standard deviations)?
     
  10. Jul 3, 2017 #9

    MarneMath

    Education Advisor

    The estimates of the coefficients will actually follow a t-distribution with n - 2 degrees of freedom. This of course assumes that the assumptions of linear regression have been met.
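
    For instance, a 95% interval from the t-distribution looks something like this (hypothetical numbers for the estimate, standard error, and sample size):

    Code (Text):

    import numpy as np
    from scipy import stats

    # hypothetical fit results: estimate, standard error, sample size
    beta_hat, se, n = 0.5, 0.08, 20

    # 95% confidence interval using a t-distribution with n - 2 degrees of freedom
    t_crit = stats.t.ppf(0.975, df=n - 2)
    print(beta_hat - t_crit * se, beta_hat + t_crit * se)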
     
  11. Jul 3, 2017 #10
    Thanks! Bit rusty on the ol' central limit theorem... So, since I have two coefficients and numpy.polyfit returns a 2x2 covariance matrix, I guess I am looking at a multivariate t-distribution with n - 2 degrees of freedom and the covariance matrix returned by the function?
     
  12. Jul 3, 2017 #11

    MarneMath

    Education Advisor

    You're literally just getting the covariance matrix. There's nothing special about that: the function is computing the covariance matrix in the standard way, and that says nothing about distributions. That's why I keep pointing you towards a confidence interval. The way you figure out a confidence interval for any estimate is basically what you do with the parameters being estimated here. I'm not entirely sure what you don't get about the process.
     
  13. Jul 3, 2017 #12
    I eventually need to be able to sample from the distribution of the regression coefficients, so I'm trying to understand what pdf to use.
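
    For concreteness, this is roughly what I have in mind (a sketch that treats the estimates as approximately jointly normal, using the covariance matrix from polyfit):

    Code (Text):

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 20)
    y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=x.size)

    coef, cov = np.polyfit(x, y, 1, cov=True)

    # draw joint samples of (slope, intercept) from the approximate distribution
    samples = rng.multivariate_normal(coef, cov, size=10000)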
     
  14. Jul 3, 2017 #13

    WWGD

    Science Advisor
    Gold Member

    Excel itself has it, standard Excel. And besides, Excel has free add-ins for a lot of other stuff, like PCA, factor analysis, etc. I know a lot of people look down on it, but I think that is mostly snobbery.
     
  15. Jul 3, 2017 #14
    Excel is nice, but I'm looking to use this as a means to an end in a different project, which is python based :)
     
  16. Jul 3, 2017 #15

    WWGD

    Science Advisor
    Gold Member

    Can't you call Excel from Python or maybe export the Excel output analysis?
     
  17. Jul 3, 2017 #16
    Excel is not free, and the data is user-specified, so the script should be able to do the number crunching itself.
    I found that the statsmodels package works well for getting confidence intervals. But I still have some trouble understanding how it works and how I can sample from the distribution of the regression parameters ^^
     
  18. Jul 3, 2017 #17

    WWGD

    Science Advisor
    Gold Member

    I don't think it samples from the parameters; the parameters have distributions that are known a priori (derived theoretically), and the specific values for the confidence intervals are arrived at through the data you enter. I think Excel Online is available for free, though I don't mean to push it.
     
  19. Jul 3, 2017 #18

    FactChecker

    Science Advisor
    Gold Member

    Python and other languages with scripting capabilities (I like Perl) allow you to write top-level scripts that can generate programs for other languages (Excel, R, MATLAB, etc.), run the program, parse the output for results, and continue to the next step. A large project might have a lot of top-level Python scripts that invoke several other programs or tools that are best for specific subtasks.
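
    As a toy example, a Python script can generate a small R program, run it, and parse the coefficients back out (this assumes Rscript is on the PATH; the regression on R's built-in "cars" data is just for illustration):

    Code (Text):

    import subprocess
    import tempfile

    # generate a tiny R program that fits a regression and prints its coefficients
    r_code = 'fit <- lm(dist ~ speed, data = cars); cat(coef(fit))'
    with tempfile.NamedTemporaryFile("w", suffix=".R", delete=False) as f:
        f.write(r_code)
        path = f.name

    # run it and parse the printed coefficients
    out = subprocess.run(["Rscript", path], capture_output=True, text=True)
    intercept, slope = map(float, out.stdout.split())
    print(intercept, slope)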
     
  20. Jul 3, 2017 #19

    MarneMath

    Education Advisor

    The sampling distribution of the regression coefficients converges asymptotically to a normal distribution. If I may ask, why do you want to sample the parameters?
     
  21. Jul 4, 2017 #20
    I got the confidence intervals to work with statsmodels. I still need to figure out how to get prediction intervals, though.
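
    If I've read the docs right, something like this should give prediction intervals (a sketch; I believe get_prediction and summary_frame are the relevant statsmodels calls, but I haven't verified the details):

    Code (Text):

    import numpy as np
    import statsmodels.api as sm

    # made-up data
    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 20)
    X = sm.add_constant(x)
    y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=x.size)

    res = sm.OLS(y, X).fit()

    # summary_frame gives both the confidence interval of the mean response
    # (mean_ci_*) and the prediction interval for new observations (obs_ci_*)
    frame = res.get_prediction(X).summary_frame(alpha=0.05)
    print(frame[["mean", "obs_ci_lower", "obs_ci_upper"]])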

    The reason I want to be able to sample is that the regression coefficients obey some (potentially complex) physics equation with several other parameters (that also have uncertainties). I want to see how these uncertainties propagate, i.e. if a and b have some uncertainty, what are the possible values of c, which is also in the equation? One easy way is to Monte Carlo sample a and b and create a histogram over the possible c values, as sketched below.
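
    Code (Text):

    import numpy as np

    rng = np.random.default_rng(0)

    # hypothetical estimates [a, b] and their covariance (e.g. from polyfit or statsmodels)
    coef = np.array([0.5, 2.0])
    cov = np.array([[0.004, -0.02],
                    [-0.02,  0.15]])

    # Monte Carlo: sample (a, b) jointly, push each draw through the physics equation
    ab = rng.multivariate_normal(coef, cov, size=100000)
    c = ab[:, 0]**2 + ab[:, 1]   # placeholder for the real equation c = f(a, b)

    # histogram over possible c values
    hist, edges = np.histogram(c, bins=100)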
     