How to interpret the intercept term in a GLM?

I am using R and I have been analysing my data with a GLM with a binomial family (logit link).

I want to know what the intercept in the output table means. The intercept for one of my models is significantly different from zero, but the treatment variable is not. What does this mean?

What is the intercept? I don’t know if I am just confusing myself, but having searched the internet, I can’t find anything that plainly says: it is this, take notice of it…or don’t.

Please help, a very frustrated student


Call:
glm(formula = attacked_excluding_app ~ treatment, family = binomial, 
    data = data)

Deviance Residuals: 
    Min       1Q   Median       3Q      Max  
-2.3548   0.3593   0.3593   0.3593   0.3593  

Coefficients:
                         Estimate Std. Error z value Pr(>|z|)   
(Intercept)                 2.708      1.033   2.622  0.00874 **
treatmentshiny_non-shiny    0.000      1.461   0.000  1.00000   

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 14.963  on 31  degrees of freedom
Residual deviance: 14.963  on 30  degrees of freedom
  (15 observations deleted due to missingness)
AIC: 18.963

Number of Fisher Scoring iterations: 5

Answer

The intercept term is the intercept in the linear part of the GLM equation, so your model for the mean is $E[Y] = g^{-1}(X\beta)$, where $g$ is your link function and $X\beta$ is your linear predictor. This linear predictor contains an “intercept term”, i.e.:

$$X\beta = c + X_1\beta_1 + X_2\beta_2 + \dots$$
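
In R terms, $c$ is the coefficient attached to the constant column of 1s that R adds to the design matrix by default. A minimal sketch with hypothetical data (the factor levels here are made up to mirror your output):

d <- data.frame(treatment = factor(c("shiny", "shiny", "non-shiny", "non-shiny")))
model.matrix(~ treatment, data = d)
#   (Intercept) treatmentshiny
# 1           1              1
# 2           1              1
# 3           1              0
# 4           1              0

With a single factor predictor like yours, the intercept is the value of the linear predictor for the reference level of treatment, and the treatment coefficient is the shift for the other level.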

In your case the intercept is significantly non-zero, but the variable is not, so it is saying that

$$X\beta = c \neq 0$$

Because your family is binomial with the (default) logit link, then

$$g(\mu) = \ln\left(\frac{\mu}{1-\mu}\right)$$
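
In R, this logit link and its inverse are available directly as qlogis() and plogis(), which is handy for checking the arithmetic below:

qlogis(0.5)  # logit: log(0.5 / (1 - 0.5)) = 0
plogis(0)    # inverse logit: 1 / (1 + exp(0)) = 0.5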

And so with just the intercept term, your fitted model for the mean is:

$$E[Y] = \frac{1}{1 + e^{-c}}$$

You can see that if $c = 0$ then this corresponds to simply a 50:50 chance of getting $Y = 1$ or $0$, i.e. $E[Y] = \frac{1}{1 + 1} = 0.5$.
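
Plugging your actual intercept estimate (2.708, from the output above) into the same formula is a quick check you can run in R:

plogis(2.708)  # 1 / (1 + exp(-2.708)), approximately 0.937
# The treatment coefficient is 0, so both treatment groups share this
# same fitted probability of roughly 0.94 that the outcome equals 1.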

So your result is saying that the treatment doesn’t help you predict the outcome, but one class (1s or 0s) is more likely than the other.

Attribution
Source: Link, Question Author: Samuel Waldron, Answer Author: Scortchi – Reinstate Monica
