pull out p-values and r-squared from a linear regression

Posted 2019-01-01 10:03

How do you pull out the p-value (for the significance of the coefficient of the single explanatory variable being non-zero) and R-squared value from a simple linear regression model? For example...

x = cumsum(c(0, runif(100, -1, +1)))
y = cumsum(c(0, runif(100, -1, +1)))
fit = lm(y ~ x)
summary(fit)

I know that summary(fit) displays the p-value and R-squared value, but I want to be able to stick these into other variables.

Tags: r
11 Answers
情到深处是孤独
#2 · 2019-01-01 10:34

While both of the answers above are good, the procedure for extracting parts of objects is more general.

In many cases, functions return lists; the individual components can be inspected with str(), which prints each component along with its name. You can then access them using the $ operator, i.e. myobject$componentname.

In the case of lm objects, there are a number of predefined extractor functions such as coef(), resid(), and summary(), but you won't always be so lucky.
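For example, a minimal sketch of that workflow, using the fit object from the question:

str(summary(fit))          # print the components of the summary and their names
summary(fit)$r.squared     # access a component with the $ operator
coef(fit)                  # or use a predefined extractor where one exists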

泪湿衣
#3 · 2019-01-01 10:42

r-squared: You can return the r-squared value directly from the summary object summary(fit)$r.squared. See names(summary(fit)) for a list of all the items you can extract directly.
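For example, to keep these in ordinary variables (a small sketch using the fit object from the question):

s  <- summary(fit)
r2 <- s$r.squared      # store the r-squared value in its own variable
names(s)               # lists every component you can extract this way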

Model p-value: If you want to obtain the p-value of the overall regression model, this blog post outlines a function to return the p-value:

lmp <- function (modelobject) {
    if (!inherits(modelobject, "lm")) stop("Not an object of class 'lm'")
    f <- summary(modelobject)$fstatistic            # F statistic and its two degrees of freedom
    p <- pf(f[1], f[2], f[3], lower.tail = FALSE)   # p-value of the overall F test
    attributes(p) <- NULL                           # drop the name inherited from f
    return(p)
}

> lmp(fit)
[1] 1.622665e-05

In the case of a simple regression with one predictor, the model p-value and the p-value for the coefficient will be the same.

Coefficient p-values: If you have more than one predictor, then the above will return the model p-value, and the p-value for coefficients can be extracted using:

summary(fit)$coefficients[,4]  

Alternatively, you can grab the p-value of coefficients from the anova(fit) object in a similar fashion to the summary object above.
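For example, a small sketch (assuming the fit object from the question; the Pr(>F) column of the ANOVA table holds the per-term p-values, with NA for the residuals row):

a <- anova(fit)
a[["Pr(>F)"]]          # p-value for each term, NA for the Residuals row
a["x", "Pr(>F)"]       # p-value for the single predictor x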

人气声优
#4 · 2019-01-01 10:43

Notice that summary(fit) generates an object with all the information you need. The coefficient estimates, standard errors, t statistics and p-values are stored in its coefficients matrix. Get the p-values by selecting the 4th column of that matrix:

summary(fit)$coefficients[,4] 
summary(fit)$r.squared

Try str(summary(fit)) to see all the info that this object contains.

Edit: I had misread Chase's answer, which basically tells you how to get at what I give here.

还给你的自由
#5 · 2019-01-01 10:44

Another option is to use the cor.test function, instead of lm:

> x <- c(44.4, 45.9, 41.9, 53.3, 44.7, 44.1, 50.7, 45.2, 60.1)
> y <- c( 2.6,  3.1,  2.5,  5.0,  3.6,  4.0,  5.2,  2.8,  3.8)

> mycor <- cor.test(x, y)
> mylm <- lm(x ~ y)

# r-squared (the squared correlation):
> mycor$estimate ^ 2
      cor 
0.3262484 
> summary(mylm)$r.squared
[1] 0.3262484

# p-value:
> lmp(mylm)  # using the lmp function defined in Chase's answer
[1] 0.1081731
> mycor$p.value
[1] 0.1081731
泛滥B
#6 · 2019-01-01 10:46

I came across this question while exploring solutions to a similar problem; for future reference, it may be worthwhile to add an answer using the broom package.

Sample code

x = cumsum(c(0, runif(100, -1, +1)))
y = cumsum(c(0, runif(100, -1, +1)))
fit = lm(y ~ x)
require(broom)
glance(fit)

Results

> glance(fit)
  r.squared adj.r.squared    sigma statistic    p.value df    logLik      AIC      BIC deviance df.residual
1 0.5442762     0.5396729 1.502943  118.2368 1.3719e-18  2 -183.4527 372.9055 380.7508 223.6251          99

Side notes

I find the glance function useful as it neatly summarises the key values. As an added benefit, the results are stored as a data.frame, which makes further manipulation easy:

> class(glance(fit))
[1] "data.frame"
回忆,回不去的记忆
#7 · 2019-01-01 10:48

Extension of @Vincent's answer:

For lm() generated models:

summary(fit)$coefficients[,4]   ##P-values 
summary(fit)$r.squared          ##R squared values

For gls() generated models:

summary(fit)$tTable[,4]         ##P-values
##R-squared values are not generated because gls() uses maximum likelihood rather than sums of squares
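As a rough sketch of the gls() case (assuming the nlme package and a hypothetical data frame dat with columns y and x):

library(nlme)
gfit <- gls(y ~ x, data = dat)         # dat is a hypothetical data frame
summary(gfit)$tTable[, "p-value"]      # p-values for all coefficients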

To isolate an individual p-value, add a row index to the code above.

For example, to access the p-value of the intercept in both model summaries:

summary(fit)$coefficients[1,4]
summary(fit)$tTable[1,4]  
Note that you can replace the column number with the column name in each of the above instances:

    summary(fit)$coefficients[1,"Pr(>|t|)"]  ##lm 
    summary(fit)$tTable[1,"p-value"]         ##gls 
    

If you're still unsure how to access a value from the summary table, use str() to inspect its structure:

str(summary(fit))